I am developing a project with Java sockets, and I have a problem implementing correct time synchronization between the server and the client. I will describe the problem with a simple example:
The server runs in GMT and stores a database of various items. Some of these items are on special offer, and each offer has a deadline.
So let's say the server time is now 9:00 (GMT) and there is an offer item ending at 10:00 (GMT).
The client may be in a different time zone than the server. So, let's say the client's time is 8:00 AM (GMT-1). I can convert the offer's end time to the client's time zone and find that it ends at 9:00 AM (GMT-1), i.e. in 1 hour.
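For reference, here is a minimal sketch of the conversion I am doing, assuming the server sends the end time as a UTC instant (the literal timestamp below is only for illustration):

```java
import java.time.Duration;
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class OfferTimeDemo {
    public static void main(String[] args) {
        // End time received from the server: 10:00 GMT
        // (the literal value is only for this example).
        Instant endUtc = Instant.parse("2024-01-01T10:00:00Z");

        // Convert to the client's local time zone (GMT-1 in my example).
        ZonedDateTime endLocal = endUtc.atZone(ZoneId.systemDefault());
        System.out.println("Offer ends at: " + endLocal);

        // Remaining time computed against the client's wall clock --
        // this is exactly the part that breaks when the clock is set manually.
        Duration remaining = Duration.between(Instant.now(), endUtc);
        System.out.println("Remaining: " + remaining.toMinutes() + " min");
    }
}
```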
Problem: how to calculate the time remaining when the user's clock is set incorrectly.
For example, suppose the aforementioned client manually sets the clock half an hour ahead, i.e. to 8:30 AM (GMT-1). If I just do the time zone conversion, the item will still appear to end at 9:00 AM (GMT-1), so the computed time remaining (30 minutes) is wrong.
A possible solution would be for the client to request the time remaining in seconds from the server, instead of the exact end date. But I want to implement a per-second countdown on the client side (if the offer ends in 60 seconds, the interface should show 60, 59, 58, ..., 1, 0), and sending a request to the server every second just to get the remaining time is inefficient on the network.
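To make the idea concrete, here is a rough sketch of the countdown I have in mind, assuming the client fetches the remaining seconds once and then counts down against System.nanoTime() (a monotonic clock, so manual wall-clock changes should not skew it); fetchRemainingSecondsFromServer() is a hypothetical stub for the real socket call:

```java
import java.util.concurrent.TimeUnit;

public class CountdownDemo {
    public static void main(String[] args) throws InterruptedException {
        // One request to the server; stub for the real socket call.
        long remainingSeconds = fetchRemainingSecondsFromServer();

        // Anchor the countdown to the monotonic clock rather than the
        // wall clock, so user clock changes cannot skew it.
        long startNanos = System.nanoTime();

        long left = remainingSeconds;
        while (left >= 0) {
            System.out.println(left); // 60, 59, 58, ..., 1, 0
            Thread.sleep(1000);
            long elapsed = TimeUnit.NANOSECONDS.toSeconds(System.nanoTime() - startNanos);
            left = remainingSeconds - elapsed;
        }
    }

    // Hypothetical placeholder: the real implementation would read
    // this value over the socket connection.
    private static long fetchRemainingSecondsFromServer() {
        return 60;
    }
}
```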
Another thing that worries me is that with the "time remaining in seconds" approach, the response from the server will not arrive instantly on a slow network, so by the time the client receives the result, it is already out of date.
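One compensation I am considering (just an approximation, along the lines of what NTP does) is to measure the round-trip time of the request and subtract roughly half of it from the server's answer; a sketch under that assumption:

```java
public class LatencyCompensation {
    // Approximate one-way latency as half the measured round trip and
    // subtract it from the server's answer (hypothetical helper).
    static long adjustedRemainingMillis(long serverRemainingSeconds, long rttNanos) {
        long oneWayMillis = rttNanos / 2 / 1_000_000;
        return serverRemainingSeconds * 1000 - oneWayMillis;
    }

    public static void main(String[] args) {
        long sentAt = System.nanoTime();
        long remainingSeconds = 60;                 // stub for the server's reply
        long rttNanos = System.nanoTime() - sentAt; // round-trip time of the request
        System.out.println("Adjusted remaining: "
                + adjustedRemainingMillis(remainingSeconds, rttNanos) + " ms");
    }
}
```

Is this a reasonable approach, or is there a more standard way to handle both the clock skew and the network delay?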