Tue Nov 5 07:32:01 2002
I have a question about timeouts in the vnc viewer. I am running a simple
application which fetches several web pages from a web server. The
network (between vnc viewer and vnc server) is forced to be lossy with
random packet loss of 15-18% each way.
At about 15% random packet loss, the vnc viewer (the application) dies and
the connection to the vnc server is lost. (Side note: VNC is much more
resilient to lossy networks than any other thin client or fat client.)
Netscape runs as an application on the vnc server, and we assume the
connection from the vnc server to the web server is perfect. The vnc
server only sends display updates to the vnc client. At a certain
packet-loss percentage, the vnc client application crashes.
I am really interested in knowing:
1. When the vnc client keeps requesting updates and sees nothing coming
back from the vnc server, what does the vnc protocol/client do?
2. What sort of timeout does it have before the application dies?
3. Is the behaviour in this case the same for the linux vnc client and the
windows vnc client?
The answers to these questions are not clear from the documentation I
have read. Any response asap would be greatly appreciated.