I'm currently working on a Python/Django project that uses Redis as a cache. I'd like to deploy a worker environment to offload a few intensive/async tasks. The current description of the worker service seems to imply that SQS is set up automatically and that workers listen exclusively to SNS. Can I start a Celery service as a worker that uses Redis as the message queue, or can events be pushed directly to SQS? Can a worker service run with an arbitrary command (e.g. `celery`) and zero queues?
Yes, you can use Celery with Redis as the message broker for your Django project. However, the worker environment in AWS Elastic Beanstalk is designed around SQS and SNS, so you would need to set up and manage your own Redis instance and message queue. You can start a worker service with the `celery worker` command without attaching any SQS queues, and control which Redis queues it consumes with the `--queues` flag.
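A minimal sketch of that setup, assuming a Django project package named `myproject` and a Redis endpoint at `redis://localhost:6379/0` (both are placeholder names, not taken from the original question):

```python
# myproject/celery.py (hypothetical module path)
import os

from celery import Celery

# Make Django settings available to tasks (ORM, cache, etc.).
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery(
    "myproject",
    broker="redis://localhost:6379/0",   # Redis as the message broker
    backend="redis://localhost:6379/1",  # optional: Redis as the result backend
)

# Pick up CELERY_*-prefixed settings and tasks.py modules from installed apps.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()


@app.task
def crunch_numbers(payload):
    """Example of an intensive task offloaded to the worker."""
    return sum(payload)
```

The worker itself could then be launched in the worker environment with something like `celery -A myproject worker --loglevel=info --queues=celery`; note that Celery 5 expects the `-A`/`--app` option to come before the `worker` subcommand.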