Client-server connection
We have a requirement to schedule reports. We have achieved this with two
services: one contains the server, and the other contains our polling engine,
which polls the database for pending reports and can create threads that each
contain a client.
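For illustration, the pattern looks roughly like this. This is only a
sketch: TReportThread, GetNextPendingReportID, and RunClientReport are
simplified placeholder names, not our actual code or ReportBuilder API.

{ sketch of the polling engine; uses Classes for TThread }
type
  TReportThread = class(TThread)
  private
    FReportID: Integer;
  protected
    procedure Execute; override;
  public
    constructor Create(AReportID: Integer);
  end;

constructor TReportThread.Create(AReportID: Integer);
begin
  inherited Create(True);      { create suspended }
  FReportID := AReportID;
  FreeOnTerminate := True;
  Resume;                      { start once the fields are set }
end;

procedure TReportThread.Execute;
begin
  { placeholder for the code that creates a client and runs the
    report identified by FReportID; each thread owns its own client }
  RunClientReport(FReportID);
end;

{ called periodically by the polling engine }
procedure PollPendingReports;
var
  ReportID: Integer;
begin
  while GetNextPendingReportID(ReportID) do
    TReportThread.Create(ReportID);
end;

The point of the one-client-per-thread design is that each report should
get its own connection and session to the server.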
This works OK for the first report. While a 2nd report is running, the system
throws an error saying the socket has timed out. It appears the session for
the 1st report (which has completed) is timing out, causing the session for
the 2nd report to show a timeout error. (The destroy code for the 1st
report's datamodule executes, then the timeout error appears for the 2nd
report.) Although the session timeouts are set to 20 to 40 minutes, the
timeout occurs after about 10 minutes at most.
My understanding is that each client has its own session/connection to the
server, even if the clients come from the same application. If I close one
of the clients, the session held by the server for that client would be
closed after the timeout period. My assumption is that this should not
affect the 2nd client.
Question:
Have I got my logic correct? If I have, what could be a possible cause of
this problem?
Question:
Can the client end of the socket tell the server end that the session is no
longer needed and should be closed?
Regards
Charles
Comments
Sorry for not responding to this sooner.
Strange behavior; I do not know why you are seeing that.
The concepts of 'session' and 'socket connection' are separate.
In terms of sockets, the architecture is connectionless. The client makes a
request over a socket and then waits for a response. If it does not receive
a response within the time specified by
ClientReport.ServerConnection.Timeout, the request times out. Once a
request/response cycle is complete, the socket connection is closed by the
client. (So the client behaves much like a web browser.)
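To make that concrete, the timeout is set on the client side. A minimal
sketch, assuming ClientReport is an instance of TppClientReport; the value
shown is illustrative only, and the units of Timeout should be checked
against the documentation for your version:

procedure ConfigureRequestTimeout(AClient: TppClientReport);
begin
  { how long each request waits for a response before timing out;
    illustrative value only - confirm the units for your version }
  AClient.ServerConnection.Timeout := 1200;
end;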
In terms of sessions, the server maintains a list of active client sessions.
It sends the SessionId to the client (see ClientReport.SessionId). A session
'times out' on the server if the client ceases to make additional requests.
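One way to check the assumption in the original post is to log the SessionId
from each worker thread; if two concurrently running reports log the same
id, the clients are sharing a session rather than each holding their own. A
minimal sketch, assuming SessionId is a string (uses Windows for
OutputDebugString):

procedure LogSessionId(AClient: TppClientReport);
begin
  { two threads logging the same id would mean a shared session,
    which could explain one report's timeout affecting the other }
  OutputDebugString(PChar('report session id: ' + AClient.SessionId));
end;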
Best regards,
Nard Moseley
Digital Metaphors
www.digital-metaphors.com