
Batching action execution #113

Open
Julusian opened this issue Feb 15, 2025 · 2 comments

Comments

@Julusian
Member

Maybe there should be a way to 'batch' the execution of actions. This would allow modules to break down complex actions into multiple actions, while still executing them in the same message to the device.

This would be performed by companion sending a new executeActionBatch message, which would be similar to executeAction, but would take an array of actions instead.

For simplicity, we should define that a batch should be assumed to be a series of actions on one button with the same start time of execution. (ie, in a concurrent group, with no waits between them.)
If different buttons/triggers are executing actions at the same time, each source will be a different batch.

In the future we could explore handling sequential groups, but that is more complex and wouldn't benefit from the same execution justification.

I would propose for the api, some new method for modules to implement on the module class:

// TBatchContext will be a new generic argument to the class

beginActionBatch?: () => TBatchContext

// If this is not defined and beginActionBatch is, that should throw an error 
executeActionBatch?: (batch: TBatchContext) => Promise<void>

The action definition would then be updated to:

callback: (action: CompanionActionEvent, context: CompanionActionContext, batch: TBatchContext | undefined) => Promise<void> | void

or should this be a new property on CompanionActionContext?
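To make the proposed shape concrete, here is a hedged sketch of what a module implementing the two new methods might look like. `MyBatch`, `sendCombined`, and `getSent` are illustrative names invented for this sketch, not part of any existing API; the real module class and generic wiring would come from companion-module-base.

```typescript
// Illustrative sketch only: TBatchContext is instantiated here as MyBatch.
interface MyBatch {
  commands: string[]
}

class MyModule {
  private sent: string[][] = []

  // Called by Companion at the start of a batch (proposed API)
  beginActionBatch(): MyBatch {
    return { commands: [] }
  }

  // Called by Companion after all action callbacks in the batch have run
  async executeActionBatch(batch: MyBatch): Promise<void> {
    // One combined message to the device instead of one per action
    await this.sendCombined(batch.commands)
  }

  // Stand-in for the real device transport
  private async sendCombined(commands: string[]): Promise<void> {
    this.sent.push(commands)
  }

  getSent(): string[][] {
    return this.sent
  }
}
```

Each action callback would push its command into `batch.commands` instead of sending it immediately, and the single `executeActionBatch` call at the end does the actual send.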

Use case

In bmd-atem, there would be a few benefits to this:

  1. The meTransitionSelection is a bitmask. So toggling one value in it means sending the whole value to the atem.
    This means that if the user toggles multiple using separate actions at the same time each value sent replaces the previous.
    Today we ensure predictable behaviour with our own debounce on the sending, which queues and delays the values being sent. This can result in timing/order issues that the user doesn't expect.

  2. Some actions are written with 20+ properties because that translates to a single command in the protocol. This ensures we don't send many large commands immediately following each other, and also ensures that all the changes to one feature apply as a single operation, rather than staggered.
    With batching, this could be done over multiple actions, with the batcher combining the operations back together

  3. In the protocol it is possible to send multiple commands in one packet; doing this ensures that they will be executed in the same frame.
    Batching would allow us to utilise this for the batch.
    This was actually a problem I saw at work in other software talking to an atem, where we would flood the atem with messages resulting in a lot of retransmits and network usage spikes because of similar 1 command per packet behaviour.
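The bitmask case in point 1 can be sketched as follows. This is an assumed shape, not the real bmd-atem code: each action callback flips one bit in the shared batch state, and only the final combined mask is sent once at the end of the batch, rather than each toggle overwriting the previous one.

```typescript
// Hypothetical batch state for a bitmask setting like meTransitionSelection
interface SelectionBatch {
  mask: number
}

// What an individual "toggle" action callback would do when given a batch:
// mutate the shared mask instead of sending the whole value immediately
function toggleBit(batch: SelectionBatch, bit: number): void {
  batch.mask ^= 1 << bit
}

// Three toggle actions executed concurrently in one batch
const batch: SelectionBatch = { mask: 0b0001 }
toggleBit(batch, 1)
toggleBit(batch, 2)
toggleBit(batch, 0)
// executeActionBatch then sends batch.mask (0b0110) to the device once
```

Without batching, each toggle sends the full mask and the last send wins, losing the other toggles; with batching, all three flips land in a single value.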

@dnmeid
Member

dnmeid commented Feb 15, 2025

Do I get you right, whenever Companion sees a bunch of actions from the same connection at the same time, it just adds the batch to the callback parameters and the module now knows at least it will get some more actions immediately.
So when the module doesn't implement that, nothing will change. When the module implements batch support for some actions, it is up to the module how to handle that, e.g. collect some commands and send them combined when the batch is over.

I want to throw in that there should be some measures to recover from faulty communication. E.g. what happens when the last action of a batch isn't transmitted correctly to the module? Will the batch stay unprocessed forever? It seems to me that this batch support would make our communication with the module stateful.

@Julusian
Member Author

Julusian commented Feb 15, 2025

Do I get you right, whenever Companion sees a bunch of actions from the same connection at the same time, it just adds the batch to the callback parameters and the module now knows at least it will get some more actions immediately.

Pretty much yes.

Some very rough code for how this would be implemented:

async function executeActions (actions) {
  if (actions.length > 1 && typeof self.beginActionBatch === 'function') {
    // Module opted in to batching: create the batch context once,
    // run every callback against it, then flush in one go
    const batcher = self.beginActionBatch()
    for (const action of actions) {
      try {
        await actionDefinition.callback(action, context, batcher)
      } catch (e) {
        self.log(.....)
      }
    }

    await self.executeActionBatch(batcher)
  } else {
    // No batching support, or a single action: behave exactly as today
    for (const action of actions) {
      await actionDefinition.callback(action, context, undefined)
    }
  }
}

An action callback can then vary its behaviour if it was given a batch; how it does that is entirely up to the module. beginActionBatch could return a class which each action then mutates, or just a number, with the state tracked elsewhere.
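Those two options for TBatchContext could look like this. Both shapes are hypothetical illustrations; none of these names exist in the current API.

```typescript
// (a) A mutable object: each action callback appends to it directly
class CommandBatch {
  readonly commands: string[] = []
  add(cmd: string): void {
    this.commands.push(cmd)
  }
}

// (b) An opaque id: the module tracks per-batch state in its own map
let nextBatchId = 0
const pendingBatches = new Map<number, string[]>()

function beginActionBatch(): number {
  const id = nextBatchId++
  pendingBatches.set(id, [])
  return id
}
```

Option (a) keeps the state self-contained in the context object; option (b) keeps callbacks simple but requires the module to clean up its map when `executeActionBatch` finishes.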

So when the module doesn't implement that, nothing will change. When the module implements batch support for some actions, it is up to the module how to handle that, e.g. collect some commands and send them combined when the batch is over.

Yes. If the module doesn't implement beginActionBatch then nothing will change. Even if it does implement that but ignores the batch parameter in an action callback, the action will still execute fine; it just won't be properly batched (which could cause some ordering issues for the user).

I want to throw in that there should be some measures to recover from faulty communication. E.g. what happens when the last action of a batch isn't transmitted correctly to the module? Will the batch stay unprocessed forever? It seems to me that this batch support would make our communication with the module stateful.

I would say that all of this is out of scope of what I am proposing to add. It is up to the module to handle the actual sending inside executeActionBatch, all we are doing is helping the module be aware that it could send these as a batch.
It won't make things any more stateful; Companion will be doing the same kind of fire and await as today.
