Batching action execution #113
Comments
Do I get you right: whenever Companion sees a bunch of actions from the same connection at the same time, it just adds the batch to the callback parameters, and the module then at least knows it will get some more actions immediately? I want to throw in that there should be some measures to recover from faulty communication. E.g. what happens when the last action of a batch isn't transmitted correctly to the module? Will the batch stay unprocessed forever? It seems to me that this batch support would make our communication with the module stateful.
Pretty much yes. Some very rough code for how this would be implemented:
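A minimal sketch of that grouping on the Companion side, assuming hypothetical `ActionInstance` and `ConnectionHandle` shapes rather than the real internals:

```ts
// Hypothetical shapes for illustration; the real Companion internals differ.
interface ActionInstance {
	connectionId: string
	actionId: string
	options: Record<string, unknown>
}

interface ConnectionHandle {
	// Sends a batch of actions to the module in one message.
	executeActionBatch(actions: ActionInstance[]): Promise<void>
}

/**
 * Group all actions that start at the same moment by their connection,
 * so each module receives at most one batch per press.
 */
async function executeConcurrentActions(
	actions: ActionInstance[],
	getConnection: (connectionId: string) => ConnectionHandle | undefined
): Promise<void> {
	const byConnection = new Map<string, ActionInstance[]>()
	for (const action of actions) {
		const group = byConnection.get(action.connectionId)
		if (group) group.push(action)
		else byConnection.set(action.connectionId, [action])
	}

	await Promise.all(
		Array.from(byConnection.entries()).map(async ([connectionId, batch]) => {
			const connection = getConnection(connectionId)
			if (!connection) return
			// The module gets the whole batch in a single message.
			await connection.executeActionBatch(batch)
		})
	)
}
```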
An action callback can then vary its behaviour if it was given a batch; how it does that is entirely up to the module.
Yes. If the module doesn't implement the new method, its actions will continue to be executed individually as they are today.
I would say that all of this is out of scope of what I am proposing to add. It is up to the module to handle the actual sending inside the action callback.
Maybe there should be a way to 'batch' the execution of actions. This would allow modules to break down complex actions into multiple actions, while still executing them in the same message to the device.
This would be performed by Companion sending a new `executeActionBatch` message, which would be similar to `executeAction`, but would take an array of actions instead.
For simplicity, we should define that a batch should be assumed to be a series of actions on one button with the same start time of execution (i.e. in a concurrent group, with no waits between them).
If different buttons/triggers are executing actions at the same time, each source will be a different batch.
In the future we could explore handling sequential groups, but that is more complex and wouldn't benefit from the same execution justification.
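As a rough illustration of the message itself (not the actual host-api definitions), it could carry the same per-action payload as `executeAction`, just as an array with a shared batch id:

```ts
// Hypothetical message shapes for illustration only.
interface ExecuteActionPayload {
	actionId: string
	controlId: string
	options: Record<string, unknown>
}

interface ExecuteActionBatchMessage {
	// All actions that started together (same button, same start time).
	batchId: string
	actions: ExecuteActionPayload[]
}
```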
For the API, I would propose some new method for modules to implement on the module class:
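A minimal sketch of such a method, where the name and signature are assumptions rather than the actual proposal:

```ts
// Hypothetical shapes; names are assumptions, not the actual proposal.
interface BatchedAction {
	actionId: string
	options: Record<string, unknown>
}

abstract class SomeModuleBase {
	/**
	 * Optional hook called once per group of actions that started together.
	 * A module that doesn't implement it keeps the current per-action behaviour.
	 */
	handleActionBatch?(batchId: string, actions: BatchedAction[]): Promise<void> | void
}
```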
The action definition would then be updated to:
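Roughly along these lines, with the `ActionBatchInfo` shape and the extra callback argument being assumptions:

```ts
// Hypothetical action definition with batch info exposed to the callback.
interface CompanionActionEventLike {
	actionId: string
	options: Record<string, unknown>
}

interface ActionBatchInfo {
	batchId: string
	// Total number of actions delivered together in this batch.
	totalActions: number
}

interface CompanionActionDefinitionLike {
	name: string
	callback: (event: CompanionActionEventLike, batch: ActionBatchInfo | undefined) => Promise<void> | void
}
```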
Or should this be a new property on `CompanionActionContext`?

Use case
In bmd-atem, there would be a few benefits to this:
The `meTransitionSelection` value is a bitmask, so toggling one value in it means sending the whole value to the ATEM. This means that if the user toggles multiple values using separate actions at the same time, each value sent replaces the previous one.
Today we ensure predictable behaviour with our own debounce on the sending, resulting in queuing and delaying the sending of values. This can cause timing/order issues that the user doesn't expect.
Some actions are written with 20+ properties because that translates to a single command in the protocol. This ensures we don't send many large commands immediately following each other, and also ensures that all the changes to one feature apply as a single operation, rather than staggered.
With batching, this could be done over multiple actions, with the batcher combining the operations back together.
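A sketch of what that combining could look like inside a module, using made-up helpers rather than the real bmd-atem code; the `batchId` and the flush step are assumptions about how the batch would be surfaced:

```ts
// Illustration only: accumulate bitmask toggles from a batch and send once.
type PendingBatch = { selection: number }

const pendingBatches = new Map<string, PendingBatch>()

function toggleTransitionSelectionBit(
	batchId: string | undefined,
	currentSelection: number,
	bit: number,
	send: (selection: number) => void
): void {
	if (!batchId) {
		// No batch: behave as today and send the full value immediately.
		send(currentSelection ^ bit)
		return
	}

	// Within a batch: merge all toggles into one pending value.
	const pending = pendingBatches.get(batchId) ?? { selection: currentSelection }
	pending.selection ^= bit
	pendingBatches.set(batchId, pending)
}

function flushBatch(batchId: string, send: (selection: number) => void): void {
	const pending = pendingBatches.get(batchId)
	if (!pending) return
	pendingBatches.delete(batchId)
	// One combined send instead of several competing ones.
	send(pending.selection)
}
```

The module would call `flushBatch` once it has seen the last action of the batch (or after a short timeout, which would also cover the dropped-message concern raised earlier in this thread).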
In the protocol it is possible to send multiple commands in one packet; doing this ensures that they will be executed in the same frame.
Batching would allow us to take advantage of this for the whole batch.
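As an illustration of that packet-level combining, here is a rough sketch with placeholder types standing in for the real ATEM protocol handling:

```ts
// Illustration only: collect serialized commands and send them in one packet.
interface SerializedCommand {
	bytes: Uint8Array
}

function sendAsSinglePacket(
	commands: SerializedCommand[],
	writePacket: (payload: Uint8Array) => void
): void {
	const total = commands.reduce((sum, cmd) => sum + cmd.bytes.length, 0)
	const payload = new Uint8Array(total)

	let offset = 0
	for (const cmd of commands) {
		payload.set(cmd.bytes, offset)
		offset += cmd.bytes.length
	}

	// All commands arrive together, so the device applies them in the same frame.
	writePacket(payload)
}
```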
This was actually a problem I saw at work in other software talking to an ATEM, where we would flood the ATEM with messages, resulting in a lot of retransmits and network usage spikes because of similar one-command-per-packet behaviour.