Async task with Celery #173
Comments
You could catch the completion of the Celery task using an intermediate message catch event. The Zeebe process would look like this:

[BPMN diagram: a service task followed by an intermediate message catch event]

Here is what the Zeebe worker would look like:

```python
worker = ZeebeWorker()


@worker.task("start-task")
def start_task():
    # Call the celery task
    celery.send_task("tasks.my_task")
```

And the Celery task:

```python
app = Celery()
zeebe_client = ZeebeClient()


@app.task
def my_task():
    do_stuff()
    zeebe_client.publish_message("message-name", "correlation-key")
```

You can read more about Zeebe messages here.
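If it helps to see the two halves together, here is a minimal end-to-end sketch of that pattern. It assumes a Redis broker URL, a hypothetical `order_id` process variable used as the correlation key, and the no-argument constructors from the snippets above (exact pyzeebe constructor and decorator signatures vary between versions):

```python
from celery import Celery
from pyzeebe import ZeebeClient, ZeebeWorker

app = Celery("tasks", broker="redis://localhost:6379/0")  # assumed broker URL
worker = ZeebeWorker()
zeebe_client = ZeebeClient()


@worker.task("start-task")
def start_task(order_id: str):
    # Hand the long-running work to Celery; this job completes immediately
    # and the process instance waits at the message catch event.
    app.send_task("tasks.my_task", args=[order_id])


@app.task(name="tasks.my_task")
def my_task(order_id: str):
    # ... do the long-running work here ...
    # Publish the message so the process instance waiting with this
    # correlation key continues past the catch event.
    zeebe_client.publish_message("message-name", order_id)
```

The message name and the correlation key expression configured on the catch event in the BPMN model have to match the values passed to `publish_message`.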
@JonatanMartens,
@JonatanMartens thanks again!!
@JonatanMartens hello,
@asrocha That's not how you'd typically run Zeebe workers, I would say. Can you give an example of a BPMN used for this? In general, you would have one worker that handles many different tasks. If different logic is to be performed, you can control that through the variables provided, e.g. building on the quickstart example:

```python
@worker.task(task_type="my_task")
def my_task(x: int, operation="addition"):
    if operation == "addition":
        return {"y": x + 1}
    elif operation == "subtraction":
        return {"y": x - 1}
    else:
        res = requests.get("https://httpbin.org/get").json()
        return {"http-result": res}
```
I'm closing this issue; if you have any more questions, feel free to start a new discussion.
Many Python applications (Django, Flask, and others) run tasks in the background (async) using Celery.
This architecture is very useful for long-running tasks.
I would like to take advantage of all the control (logs, monitoring, etc.) that already exists in Celery task code, so that the worker acts as the orchestrator sending status updates (job/task started, job/task finished) to Zeebe (maybe with a Kafka integration).
My bet is to use ZeebeTaskRouter to route tasks to Celery and use some kind of callback function to report job status back to the worker, or to do it through Kafka. Something along the lines of the rough sketch below.
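This is only an illustration of the idea, not an existing pyzeebe feature; the task names, broker URL, and status fields are placeholders:

```python
from celery import Celery
from pyzeebe import ZeebeTaskRouter, ZeebeWorker

celery_app = Celery("tasks", broker="redis://localhost:6379/0")  # placeholder broker
router = ZeebeTaskRouter()


@router.task("long-running-task")
def long_running_task(payload: dict):
    # Route the actual work to Celery and only report that it was started;
    # the "finished" status would come later from a callback (or a Kafka
    # event) published by the Celery task itself.
    async_result = celery_app.send_task("tasks.long_running", kwargs={"payload": payload})
    return {"celery_task_id": async_result.id, "status": "started"}


worker = ZeebeWorker()
worker.include_router(router)  # exact method name may differ between pyzeebe versions
```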
I'm opening this issue to discuss the subject, gather opinions on how to code it, and then create a pull request.
Thanks