How to store Celery task results from an external worker

  amazon-dynamodb, c++, celery, python, rabbitmq

It is not clear to me how Celery actually uses the result backend once a task is complete. My challenge is that I am using an external C++ worker/consumer that monitors a RabbitMQ queue for tasks, which works, but I do not know how to communicate the status or result of that task back to Celery.

Here is a nice blog post about how Python and C++ can produce/consume work:
https://blog.petrzemek.net/2017/06/25/consuming-and-publishing-celery-tasks-in-cpp-via-amqp/

A Python worker knows how to report the result back through the result backend configured in Celery. The C++ worker I made based on that blog post does not do this.
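
To make the setup concrete, the consuming side of my worker looks roughly like this. It is a simplified sketch using SimpleAmqpClient for illustration (the exact AMQP library, host, and queue names shouldn't matter):

    #include <SimpleAmqpClient/SimpleAmqpClient.h>
    #include <iostream>

    int main() {
        // Connect to the broker and consume from Celery's default "celery" queue.
        auto channel = AmqpClient::Channel::Create("localhost");
        std::string tag = channel->BasicConsume("celery", "", /*no_local=*/true,
                                                /*no_ack=*/false, /*exclusive=*/false);

        while (true) {
            auto envelope = channel->BasicConsumeMessage(tag);  // blocks until a task arrives
            auto message = envelope->Message();

            // With Celery's message protocol v2, the task id is in the correlation_id
            // property and the body is JSON; reply_to names the client's reply queue
            // (which I assume is relevant for reporting the result back).
            std::string task_id = message->CorrelationId();
            std::string reply_to = message->ReplyTo();
            std::cout << "got task " << task_id << ": " << message->Body() << "\n";

            // ... do the actual work here ...

            channel->BasicAck(envelope);
            // This is the part I'm missing: telling Celery that the task
            // succeeded and what its result was.
        }
    }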

Can anyone provide either of the following?

  1. (preferably) a C++ code example that consumes a task and reports the result to Celery via RabbitMQ
  2. any documentation on how Celery actually invokes the result backend, so I can work out the code myself

(I don’t think it matters at this point, but my results are stored in DynamoDB.)
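
For what it's worth, my current guess from skimming celery/backends/rpc.py is that, with the rpc result backend at least, the worker simply publishes a JSON "result meta" message to the queue named in the task message's reply_to property, using the task id as the correlation id. Something like the following untested sketch, where the body layout and routing are my assumptions rather than anything confirmed:

    #include <SimpleAmqpClient/SimpleAmqpClient.h>
    #include <string>

    // reply_to and task_id come from the consumed task message's properties;
    // result_json is the task's return value, already serialized as JSON.
    void publish_result(AmqpClient::Channel::ptr_t channel,
                        const std::string &reply_to,
                        const std::string &task_id,
                        const std::string &result_json) {
        // Guessed result-meta layout, based on what the Python backends store.
        std::string body =
            "{\"task_id\": \"" + task_id + "\","
            " \"status\": \"SUCCESS\","
            " \"result\": " + result_json + ","
            " \"traceback\": null,"
            " \"children\": []}";

        auto message = AmqpClient::BasicMessage::Create(body);
        message->ContentType("application/json");
        message->ContentEncoding("utf-8");
        message->CorrelationId(task_id);

        // Assumption: the rpc backend uses the default exchange with the
        // reply_to queue name as the routing key.
        channel->BasicPublish("", reply_to, message);
    }

I have no idea whether the same approach can work at all with the DynamoDB backend, or whether the C++ worker would instead have to write the result meta directly into the DynamoDB table in the format Celery expects.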

