
Jetty 12 - Review Jetty WebSocket API batching #9723

Open
sbordet opened this issue May 2, 2023 · 7 comments

Comments

@sbordet
Contributor

sbordet commented May 2, 2023

Jetty version(s)
12+

Enhancement Description
After #9552, the batching in sends is hardcoded to disabled.

Is it worth reintroducing it in the API?

There are 3 proposals:

// Proposal 1 - Similar to WebSocket Core: boolean parameter.
session.send[Partial]Binary(ByteBuffer payload, boolean last, Callback callback, boolean batch); 

// Proposal 2 - boolean field on stateful Session similar to Jetty 11's Session.RemoteEndpoint.
session.setBatch(true);
session.send...(payload, last, callback);

// Proposal 3 - scoped batching
session.batch(() -> {
  session.send...(payload, last, callback);
  session.send...(payload, last, callback);
});

Proposals 2 & 3 are easier to deprecate if batching turns out to be a not-so-good idea, because we would only deprecate 1 method rather than 4.
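For discussion, a minimal sketch of how Proposal 3's scoped batching could behave: sends inside the `batch(...)` scope are buffered and flushed once when the scope exits, while sends outside a scope flush immediately. The class and method names (`BatchingSession`, `send`, `getFlushCount`) are illustrative only, not Jetty API, and payloads are simplified to strings.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of Proposal 3 (scoped batching), not Jetty code.
public class BatchingSession {
    private final List<String> pending = new ArrayList<>();
    private boolean batching;
    private int flushCount;

    public void send(String payload) {
        pending.add(payload);
        if (!batching) {
            flush(); // outside a batch scope, every send flushes immediately
        }
    }

    public void batch(Runnable scope) {
        batching = true;
        try {
            scope.run();
        } finally {
            batching = false;
            flush(); // one flush for all frames sent inside the scope
        }
    }

    private void flush() {
        // A real implementation would write the buffered frames to the network.
        pending.clear();
        flushCount++;
    }

    public int getFlushCount() {
        return flushCount;
    }

    public static void main(String[] args) {
        BatchingSession session = new BatchingSession();
        session.batch(() -> {
            session.send("frame1");
            session.send("frame2");
        });
        System.out.println(session.getFlushCount()); // 1: both sends share one flush
        session.send("frame3");
        System.out.println(session.getFlushCount()); // 2: unbatched send flushes at once
    }
}
```

One deprecation-friendly property of this shape: removing batching later only requires deprecating `batch(Runnable)`, leaving the send methods untouched.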

@sbordet
Contributor Author

sbordet commented May 2, 2023

@gregw @joakime @lachlan-roberts thoughts?

@gregw
Contributor

gregw commented May 3, 2023

I'm not a huge fan of features that don't have a known use-case, as it is hard to get them right unless something is actually using them.

I think that if we do add a mechanism, calling it something other than batching may describe it better. Perhaps autoFlush, or lazyFlush?

Or maybe we just provide a gather send?

But my preference is to wait until somebody needs it.
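The "gather send" idea mentioned above — passing several buffers to be written in one operation, so the batching effect comes from the call shape rather than a mode switch — already has a precedent in Java NIO's `GatheringByteChannel.write(ByteBuffer[])`. A small demonstration using a `FileChannel` as the gathering channel (a WebSocket variant is hypothetical; this only illustrates the pattern):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Illustrative only: a gathering write sends several buffers in one call,
// which gets a batching effect without any stateful batch mode.
public class GatherSendDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("gather", ".bin");
        try (FileChannel channel = FileChannel.open(tmp, StandardOpenOption.WRITE)) {
            ByteBuffer[] frames = {
                ByteBuffer.wrap("hello ".getBytes()),
                ByteBuffer.wrap("world".getBytes())
            };
            long written = channel.write(frames); // one gathering write for both buffers
            System.out.println(written); // 11: all bytes written in a single call
        } finally {
            Files.delete(tmp);
        }
    }
}
```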

@sbordet
Contributor Author

sbordet commented May 3, 2023

Agreed, closing for now.

@sbordet sbordet closed this as completed May 3, 2023
@brettwooldridge

In Jetty 9 I used the following annotation:

@WebSocket(batchMode = BatchMode.OFF, maxBinaryMessageSize = 1024 * 1024 * 4, maxIdleTime = Integer.MAX_VALUE)

With respect to batching, does this concept still exist? Or is it not something I need to be concerned with?

@joakime
Contributor

joakime commented Jan 19, 2024

@brettwooldridge can you explain why you use that feature? What do you think it does?

@joakime joakime reopened this Jan 19, 2024
@brettwooldridge

brettwooldridge commented Jan 19, 2024

My understanding, which could be mistaken, is that if batch mode was enabled in Jetty 9, write operations performed on the websocket could result in bytes not being flushed immediately, but only once a threshold number of bytes had been reached. This appears to be embodied in this code:

Because the code in question was implemented some 7 years ago, I cannot recall whether we encountered a demonstrable issue or whether it was merely a concern of a developer. Because the websocket is used for realtime event notification, the issue or concern is that event delivery might be delayed by batching behavior.
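The behavior described above — writes accumulating in a buffer and only hitting the wire once a size threshold is crossed — can be sketched as follows. This is an illustration of the concern, not Jetty's implementation; the class name `ThresholdWriter` and the counters are invented for the example:

```java
import java.io.ByteArrayOutputStream;

// Illustrative sketch (not Jetty code): a writer that buffers bytes until a
// threshold is reached. A small realtime event can sit in the buffer until
// later writes push it past the threshold -- the delivery-delay concern above.
public class ThresholdWriter {
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    private final int threshold;
    private int flushes;

    public ThresholdWriter(int threshold) {
        this.threshold = threshold;
    }

    public void write(byte[] payload) {
        buffer.writeBytes(payload);
        if (buffer.size() >= threshold) {
            flush();
        }
        // Otherwise the payload stays buffered and is not yet on the network.
    }

    public void flush() {
        buffer.reset(); // a real writer would push the bytes to the socket here
        flushes++;
    }

    public int getFlushes() {
        return flushes;
    }

    public static void main(String[] args) {
        ThresholdWriter writer = new ThresholdWriter(8);
        writer.write(new byte[4]); // below threshold: buffered, not flushed
        System.out.println(writer.getFlushes()); // 0
        writer.write(new byte[4]); // reaches threshold: flushed
        System.out.println(writer.getFlushes()); // 1
    }
}
```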

@lachlan-roberts
Contributor

@brettwooldridge the batching of frames is still a concept in Jetty's lower-level websocket-core API, but there is no configuration for it in the new jetty-websocket-api: it is always off, and there is no option to enable it.

So you will not need this batchMode configuration anymore, as batching is always off:

@WebSocket(batchMode = BatchMode.OFF)

5 participants