rsc "at" vingmed.no
Fri, 27 Feb 1998 09:45:54 +0000
> Compression is a good idea, but how long did it take to gzip that 5.5MB
> file? Server performance is going to suffer. Just a thought...
Does it matter that much if you're on a slow link (say 28.8 kbps)
anyway? I see it as a trade-off worth making.
Summary of some more tests (compressing a 1600x1200 24-bit typical
desktop dump, 5.5 MB, on a P2/300):
PKZIP         1.9 seconds,     86 KB
PKZIP -ex     5.2 seconds,     82 KB
zip           1.5 seconds(!),  99 KB
zip -9       10.1 seconds,     82 KB
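The same time-versus-size trade-off can be reproduced with zlib, which
implements the DEFLATE algorithm that zip and gzip use. A minimal sketch
(the byte pattern is a made-up stand-in for a desktop dump of roughly the
same size; real screen data will compress differently, so the absolute
numbers are illustrative only):

```python
import time
import zlib

# Synthetic stand-in for a 1600x1200 24-bit desktop dump (~5.5 MB).
# A repetitive pattern is used because screen contents are highly
# redundant; the exact bytes are arbitrary.
data = (b"\x20\x30\x40" * 400 + b"\xff\xff\xff" * 400) * 2400

for level in (1, 6, 9):  # fast, default, best -- like zip vs. zip -9
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(packed) / 1024:.0f} KB in {elapsed:.3f} s")
```

As with zip vs. zip -9 above, the higher levels spend more CPU time for a
somewhat smaller output.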
Transferring 82 KB over a 28.8 kbps link would *ideally* take at least
23 seconds. The transfer and compression can run as separate threads
(like a pipe), so the transfer time would still be dominant.
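That pipelining can be sketched with two threads and a bounded queue
standing in for the pipe. This is a hypothetical illustration, not VNC
code: `compressor`, `transfer`, and the chunk size are names and choices
I made up, and the "transfer" side just collects output instead of
writing to a socket.

```python
import queue
import threading
import zlib

CHUNK = 64 * 1024  # compress the framebuffer in chunks of this size

def compressor(data: bytes, q: queue.Queue) -> None:
    # Produce compressed blocks as they become ready.
    z = zlib.compressobj()
    for i in range(0, len(data), CHUNK):
        q.put(z.compress(data[i:i + CHUNK]))
    q.put(z.flush())
    q.put(None)  # sentinel: end of stream

def transfer(q: queue.Queue, out: list) -> None:
    # Consume blocks concurrently; a real client would write to a socket.
    while (block := q.get()) is not None:
        out.append(block)

framebuffer = b"\x00\x10\x20" * 1_000_000  # stand-in for a screen dump
q: queue.Queue = queue.Queue(maxsize=8)    # bounded, like a pipe buffer
sent: list = []
t1 = threading.Thread(target=compressor, args=(framebuffer, q))
t2 = threading.Thread(target=transfer, args=(q, sent))
t1.start(); t2.start()
t1.join(); t2.join()
print(f"compressed {len(framebuffer)} bytes to {len(b''.join(sent))}")
```

Because the transfer thread drains blocks while the compressor is still
working, the total wall-clock time is roughly max(compress, transfer)
rather than their sum.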
At the very least, various degrees of compression ought to be available
as adjustable options. Also, I used ZIP only as an example. Lossy JPEG is
also interesting, since it lets you trade accuracy (quality) for better
compression ratios (and transfer speeds). If you don't mind the client
screen looking a bit "sloppy", JPEG should give a great boost. I don't
have any JPEG software to test with, though.
Of course, the *best* solution for Windows would be to hook GDI calls,
as most other remote control products for Windows do (Timbuktu, NetOp,
and probably also pcAnywhere, which I haven't tried), but that is a very
OS-specific solution. What I like about VNC is its OS independence,
*especially* the Java client, which makes workstation maintenance
possible from anywhere in the universe with an Internet connection, and
it should be kept that way.
Robert Schmidt <rsc "at" vingmed.no>
Software Developer / Vingmed Sound tel +47 67124237 fax +47 67124355
Private tel +47 22606076 WWW http://www.nvg.unit.no/~rsc
I close one eye and see half
I close both and see all -- seigmen