Describe the bug
When a WebSocketClient is created but not connected immediately, the lost-connection detection does not work properly and the client fails with a missing pong. This only happens when the client connects after a delay; at most one "round of pings" is sent before the client closes the connection.
To Reproduce
Steps to reproduce the behavior:
Server
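A minimal test server along these lines is enough to reproduce the issue; it is a sketch using the Java-WebSocket WebSocketServer API, where the class name TestServer and the log messages are illustrative assumptions, not the reporter's exact code:

```java
import java.net.InetSocketAddress;

import org.java_websocket.WebSocket;
import org.java_websocket.handshake.ClientHandshake;
import org.java_websocket.server.WebSocketServer;

// Plain server that only logs connection events; it exists so the
// client has something to connect to.
public class TestServer extends WebSocketServer {

    public TestServer(int port) {
        super(new InetSocketAddress(port));
    }

    @Override
    public void onOpen(WebSocket conn, ClientHandshake handshake) {
        System.out.println("server: connection opened");
    }

    @Override
    public void onClose(WebSocket conn, int code, String reason, boolean remote) {
        System.out.println("server: connection closed, code=" + code + ", reason=" + reason);
    }

    @Override
    public void onMessage(WebSocket conn, String message) {
        // Not relevant for this bug; no messages are exchanged.
    }

    @Override
    public void onError(WebSocket conn, Exception ex) {
        ex.printStackTrace();
    }

    @Override
    public void onStart() {
        System.out.println("server: started");
    }
}
```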
Client
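Likewise a minimal client sketch; it only overrides the abstract WebSocketClient callbacks and logs the close event, which is where the missing-pong failure is reported. The class name TestClient is again an assumption:

```java
import java.net.URI;

import org.java_websocket.client.WebSocketClient;
import org.java_websocket.handshake.ServerHandshake;

// Minimal client that logs its lifecycle; the missing-pong failure
// shows up via onClose.
public class TestClient extends WebSocketClient {

    public TestClient(URI serverUri) {
        super(serverUri);
    }

    @Override
    public void onOpen(ServerHandshake handshakedata) {
        System.out.println("client: connection opened");
    }

    @Override
    public void onMessage(String message) {
        // Not relevant for this bug.
    }

    @Override
    public void onClose(int code, String reason, boolean remote) {
        System.out.println("client: connection closed, code=" + code + ", reason=" + reason);
    }

    @Override
    public void onError(Exception ex) {
        ex.printStackTrace();
    }
}
```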
Main-Method
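The key ingredient is the gap between constructing the client (with a lowered ConnectionLostTimeout, see "Additional context" below) and actually calling connect(). A sketch assuming a 5-second timeout, port 8887, and a 10-second java.util.Timer delay; the concrete values are illustrative, the point is that the delay exceeds the timeout:

```java
import java.net.URI;
import java.util.Timer;
import java.util.TimerTask;

public class Main {

    public static void main(String[] args) throws Exception {
        TestServer server = new TestServer(8887);
        server.start();

        final TestClient client = new TestClient(new URI("ws://localhost:8887"));
        // Lower the lost-connection timeout (default: 60 seconds) so the
        // failure shows up quickly.
        client.setConnectionLostTimeout(5);

        // Connect only after a delay. Connecting immediately works fine;
        // the delayed connect is what triggers the missing-pong close.
        new Timer().schedule(new TimerTask() {
            @Override
            public void run() {
                client.connect();
            }
        }, 10_000);
    }
}
```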
Expected behavior
I expect the client not to fail in this case :)
Debug log
LOG
Environment:
Java versions tried: OpenJDK 8.0.242 and OpenJDK 11.0.6.10 (neither worked)
Operating System and version: Windows 10 Enterprise 20H2
Additional context
In the reproduction example I set the ConnectionLostTimeout to a value lower than the default of 60 seconds so that I don't have to wait a full minute on every test run. I think the bug also occurs with the default value, but then the Timer delay before connect() would have to be larger.