WebSocket sendBinary() blocks infinitely #11362
Comments
Make sure you have a sane value for your connector idle timeout. Your other option is to use the async endpoint.
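For reference, a send through the async remote endpoint of the javax.websocket API looks roughly like this (a minimal sketch; the session handling and the error handling are placeholders):

```java
import java.nio.ByteBuffer;
import javax.websocket.SendResult;
import javax.websocket.Session;

public final class AsyncSendExample
{
    private AsyncSendExample()
    {
    }

    // Sketch: hand the payload to the container and get notified via callback,
    // instead of blocking the calling thread in RemoteEndpoint.Basic.sendBinary().
    public static void sendBinaryAsync(Session session, ByteBuffer payload)
    {
        session.getAsyncRemote().sendBinary(payload, (SendResult result) ->
        {
            if (!result.isOK())
                result.getException().printStackTrace(); // replace with real error handling
        });
    }
}
```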
Unfortunately, we cannot use the async endpoint due to a bug in Tomcat: https://bz.apache.org/bugzilla/show_bug.cgi?id=56026 Yes, it is right that you must block until completed, but this already happens inside the call to sendFrame(). If I understand the code correctly, sendFrame() is already blocking: once you reach the line b.block(), there is nothing left to wait for. If the callback was not called by sendFrame(), then you wait forever --> deadlock. What do you mean by connector idle timeout? How can the connector idle timeout have any effect on a deadlock?
The connector idle timeout will close the connection.
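For context, in embedded Jetty the connector idle timeout is set on the ServerConnector; in a jetty-home installation the equivalent is typically the jetty.http.idleTimeout property. A minimal sketch (the 30-second value is only an example):

```java
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class IdleTimeoutExample
{
    public static void main(String[] args) throws Exception
    {
        Server server = new Server();
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8080);
        // Connections idle for longer than 30 seconds are closed by the connector.
        connector.setIdleTimeout(30_000);
        server.addConnector(connector);
        server.start();
    }
}
```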
Also note that Jetty 10 is now at End of Community Support. You should be using Jetty 12 at this point in time.
No, it is not. It takes a callback and notifies the callback when finished. That is why we need to block after the call to sendFrame().
@sbordet If I debug the code, the callback is always called synchronously. This means that when sendFrame() returns, the callback has already been called.
@joakime Thanks for the info. I thought that Jetty 10.x was the last version with the old package. I have tried to migrate to 12.0.6-ee8, but without luck. But that is another problem.
No, it means that in that particular case it was called synchronously, but it may not be, for example when the flusher is already sending other content, or when the TCP connection is congested, etc. Are you able to reproduce the issue, and take a server dump as explained here:
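For reference, a dump can also be produced programmatically if you hold a reference to the Server instance (sketch; the dump is also reachable via JMX):

```java
import org.eclipse.jetty.server.Server;

public final class DumpHelper
{
    private DumpHelper()
    {
    }

    // Sketch: print the full component tree (threads, connections, endpoints) to stderr.
    public static void dumpServer(Server server)
    {
        server.dumpStdErr();
    }
}
```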
You also have the setAsyncSendTimeout() method.
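For reference, that timeout is configured on the WebSocketContainer; on the server side the same setter is available on the ServerContainer stored under the javax.websocket.server.ServerContainer ServletContext attribute. A minimal sketch (the 3000 ms value mirrors what is used later in this thread):

```java
import javax.websocket.ContainerProvider;
import javax.websocket.WebSocketContainer;

public class SendTimeoutExample
{
    public static void main(String[] args)
    {
        WebSocketContainer container = ContainerProvider.getWebSocketContainer();
        // Asynchronous sends that do not complete within 3 seconds are failed;
        // a value <= 0 means no timeout, i.e. a potentially unbounded wait.
        container.setAsyncSendTimeout(3000);
    }
}
```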
We have migrated to version 12.0.6 now. We have added the option to create a dump from the Jetty server, but I am unclear how the dump will help. We have also set a setAsyncSendTimeout value now. We need to deploy it in production next week; then we will see the effect. Thanks for all the suggestions.
We have migrated to version 12.0.6 and use setAsyncSendTimeout(3000). We have also set a timer to monitor the call, which interrupts it after 5000 milliseconds. When problems occur, the async send timeout reacts in only approx. 5% of the cases; in the other 95% our timer cancels the thread. We have also isolated that the problem occurs only with clients that use Safari on macOS. The workaround with the timer and a larger thread pool solves our problem with the hanging server, so I am closing this ticket. But I think a default of an unlimited send timeout, and not completing within the specified time period, is bad behavior. Thanks for all the suggestions.
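The exact workaround code is not shown in this thread; a minimal sketch of the described pattern (a timer that interrupts the sending thread after a deadline), with the class and method names invented for illustration:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
import javax.websocket.Session;

public final class WatchdogSender
{
    private static final ScheduledExecutorService WATCHDOG =
        Executors.newSingleThreadScheduledExecutor();

    private WatchdogSender()
    {
    }

    /**
     * Sends binary data and interrupts the sending thread if the call has not
     * returned within the given timeout. This only illustrates the pattern;
     * whether the interrupt actually unblocks a hung send depends on the implementation.
     */
    public static void sendWithWatchdog(Session session, ByteBuffer payload, long timeoutMillis) throws IOException
    {
        Thread sender = Thread.currentThread();
        ScheduledFuture<?> watchdog =
            WATCHDOG.schedule(sender::interrupt, timeoutMillis, TimeUnit.MILLISECONDS);
        try
        {
            session.getBasicRemote().sendBinary(payload);
        }
        finally
        {
            watchdog.cancel(false);
            Thread.interrupted(); // clear a pending interrupt so it does not leak into later work
        }
    }
}
```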
Jetty version(s)
10.0.19
Java version/vendor
17.0.8.1 (amd64) Eclipse Adoptium
OS type/version
Linux 5.15.0-52-generic
Description
Under certain rare conditions, a call to RemoteEndpoint.Basic.sendBinary(ByteBuffer) may block infinitely. Refer to the stack trace provided below. This issue typically occurs around lunchtime. I suspect that some clients or browsers with open WebSocket connections may go into standby or become unresponsive. Only a server restart resolves the problem, because no threads remain available to send WebSocket events.
The problem seems to me to be the following code snippet in JavaxWebSocketBasicRemote.sendBinary:
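The snippet did not survive in this report; roughly, the pattern in question looks like the following (a hedged reconstruction from the description in this issue, not a verbatim copy of the Jetty sources; the callback type and helper names are assumptions):

```java
// Reconstruction of the pattern in JavaxWebSocketBasicRemote.sendBinary (names approximate)
@Override
public void sendBinary(ByteBuffer data) throws IOException
{
    assertMessageNotNull(data);                // argument check (assumed helper)
    FutureCallback b = new FutureCallback();   // callback type assumed
    sendFrame(new Frame(OpCode.BINARY).setPayload(data), b, batch);
    b.block();                                 // waits until sendFrame() notifies the callback; no timeout
}
```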
The code seems synchronous to me. After the call to sendFrame() nothing further happens; block() is only called to surface a possible exception. If the callback was never called, it blocks infinitely. A possible fix could be to use block(1, TimeUnit.MILLISECONDS) or similar (see the sketch after the attached thread dumps). Of course, the better fix would be to find the point where the callback was not called.
threaddump22.txt
threaddump21.txt
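For illustration, the bounded-wait variant suggested above might look like this (sketch only; the timeout value and the surrounding names are placeholders taken from the reconstruction above):

```java
// Sketch of the suggested fix: bound the wait instead of blocking forever.
FutureCallback b = new FutureCallback();
sendFrame(new Frame(OpCode.BINARY).setPayload(data), b, batch);
// Fails the send (instead of hanging the thread) if the callback is never notified in time.
b.block(30, TimeUnit.SECONDS); // timeout value is illustrative only
```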