

Remove asyncio sleep from Worker threads #85

Open
wants to merge 1 commit into main

Conversation

jakesprouse

A 0.1 second sleep was introduced in #22 to reduce CPU usage caused by `RXWorker` busy-looping. However, this sleep prevents PyTAK from receiving more than 10 messages per second, and decreasing the sleep period drives CPU usage back up.

This commit removes the unneeded sleep from both `RXWorker.run()` and `Worker.run()`, allowing PyTAK to keep up with received messages in busy TAK environments without undue CPU usage.

CPU usage stays low because the worker coroutines already suspend at natural await points:

  • `RXWorker.run()` was already sleeping in `self.reader.readuntil()` or `self.reader.recv()`.
  • In `TXWorker.run()`, we replace `self.queue.get_nowait()` with `self.queue.get()`. The original reason for using `get_nowait()` (testability) is addressed by introducing a new method, `TXWorker.run_once()`, which `TXWorker.run()` calls in a loop (see the sketch after this list).
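As a rough illustration of the pattern described above (the `run_once()` name and `queue` attribute follow the PR description; the surrounding class structure here is a simplified sketch, not PyTAK's actual implementation):

```python
import asyncio


class TXWorker:
    """Simplified sketch of the TX worker change described above.

    The real PyTAK class does more (CoT serialization, protocol handling, etc.);
    this only illustrates the get_nowait() -> get() change and the run_once() loop.
    """

    def __init__(self, queue: asyncio.Queue, writer: asyncio.StreamWriter) -> None:
        self.queue = queue
        self.writer = writer

    async def run_once(self) -> None:
        # Await the next message; this suspends the coroutine until data is
        # available, so no explicit asyncio.sleep() is needed to avoid busy-looping.
        data = await self.queue.get()
        self.writer.write(data)
        await self.writer.drain()

    async def run(self) -> None:
        # run() is just run_once() in a loop; tests can drive run_once()
        # directly, which was the original motivation for get_nowait().
        while True:
            await self.run_once()
```

With this structure, a unit test can put one item on the queue and `await worker.run_once()` without needing the non-blocking `get_nowait()` call.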

@andrwca

andrwca commented Dec 9, 2024

This looks like an important PR to prioritize for scaled use-cases. LGTM.
