Task queues are used as a mechanism to distribute work across threads or machines. A task queue's input is a unit of work called a task. Dedicated worker processes constantly monitor task queues for new work to perform.
Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task the client adds a message to the queue, the broker then delivers that message to a worker.
A Celery system can consist of multiple workers and brokers, giving way to high availability and horizontal scaling.
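Before introducing Celery itself, the core idea can be illustrated with nothing but the standard library: clients put tasks on a queue, and dedicated worker threads monitor it and execute each task as it arrives. This is a minimal, hypothetical sketch of the pattern, not how Celery is implemented.

```python
import queue
import threading

task_queue = queue.Queue()
results = []

def worker():
    # Dedicated worker: constantly monitor the queue for new work.
    while True:
        func, args = task_queue.get()
        if func is None:          # sentinel value: shut this worker down
            task_queue.task_done()
            break
        results.append(func(*args))
        task_queue.task_done()

# Start two worker threads that consume from the same queue.
threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

# The client adds tasks (messages) to the queue.
for n in range(5):
    task_queue.put((pow, (n, 2)))

# One sentinel per worker, then wait for all workers to finish.
for _ in threads:
    task_queue.put((None, ()))
for t in threads:
    t.join()

print(sorted(results))  # → [0, 1, 4, 9, 16]
```

Celery plays the same role, but the queue lives in an external message broker, so the workers can be separate processes on other machines.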
What does Celery need?
Celery requires a message transport to send and receive messages. The RabbitMQ and Redis broker transports are feature complete, but there’s also support for a myriad of other experimental solutions, including using SQLite for local development.
Celery can run on a single machine, on multiple machines, or even across data centers.
Choosing a Broker (RabbitMQ, Redis, or others)

RabbitMQ is feature-complete, stable, durable and easy to install. It's an excellent choice for a production environment. Detailed information about using RabbitMQ with Celery: Using RabbitMQ

Redis is also feature-complete, but is more susceptible to data loss in the event of abrupt termination or power failures. Detailed information about using Redis: Using Redis
Celery is on the Python Package Index, so you can install it with pip:

pip install celery
In this article, we keep everything contained in a single module, but for larger projects you'll want to create a dedicated module.
Let's create the file tasks.py:

from celery import Celery

app = Celery('tasks', broker='pyamqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y
The first argument to Celery is the name of the current module. This is only needed so that names can be automatically generated when the tasks are defined in the __main__ module.
The second argument is the broker keyword argument, specifying the URL of the message broker you want to use. Here we use RabbitMQ (also the default option).
See Choosing a Broker above for more choices – for RabbitMQ you can use amqp://localhost, or for Redis you can use redis://localhost.
You defined a single task, called add, returning the sum of two numbers.
Running the Celery worker server
You can now run the worker by pointing Celery at the tasks module:

celery -A tasks worker --loglevel=info

or, using the short option:

celery -A tasks worker -l info
Calling the task
To call our task you can use the delay() method:

>>> from tasks import add
>>> add.delay(4, 4)
The task has now been processed by the worker you started earlier. You can verify this by looking at the worker’s console output.
Calling a task returns an AsyncResult instance. This can be used to check the state of the task, wait for the task to finish, or get its return value (or, if the task failed, to get the exception and traceback).
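This pattern of submitting work and receiving a result handle is the same one the standard library's concurrent.futures.Future uses, which makes a useful local analogy; the sketch below mirrors delay()/AsyncResult with stdlib tools only and is an illustration, not Celery's API.

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor(max_workers=1) as pool:
    # pool.submit(...) is analogous to add.delay(4, 4): it returns
    # immediately with a handle to the eventual result.
    future = pool.submit(add, 4, 4)
    future.done()                   # check state (may still be False)
    print(future.result(timeout=5)) # wait for it to finish; prints 8
```

With Celery, retrieving return values this way additionally requires configuring a result backend; with only a broker configured, you can dispatch tasks but not read their results back.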
Advanced Celery usage will be covered in the next article…