Celery 101
Celery is the most popular task queue library for Python.
Table of Contents
- What is a Task Queue?
- What is a Task?
- How to Put a Task into a Queue
- Installing Celery
- How to Set Up a Queue
- Storing Results
- Launching Celery Workers
- Additional Resources
What is a Task Queue?
A task queue is simply a list of jobs that need to be performed by an application. Think of a task queue as a list of work that must be done by a team of workers. It all starts when an application puts a task onto a queue. This queue is read by a pool of workers who then execute the task.
One main benefit of using a task queue is the ability to scale a project horizontally by adding more workers. The main application can defer the execution of tasks to a gigantic pool of workers running on different machines.
Another benefit is that the application can start a task in the background and keep track of its progress. This improves the user experience: a user can kick off a long-running task and come back later to check how it is going.
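To make the idea concrete, here is a minimal sketch of the producer/worker pattern using nothing but Python's built-in queue and threading modules (purely illustrative; Celery handles all of this for you, and across many machines):

# toy_queue.py - an illustrative, single-process stand-in for a task queue
import queue
import threading

jobs = queue.Queue()

def worker():
    while True:
        func, args = jobs.get()   # pull the next job off the queue
        func(*args)               # execute it
        jobs.task_done()

# start a small pool of workers
for _ in range(2):
    threading.Thread(target=worker, daemon=True).start()

# the "application" puts work onto the queue
jobs.put((print, ('hello from a worker',)))
jobs.join()                       # wait for all queued work to finish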
What is a Task?
A task is simply a function. It can be as simple as this:
# tasks.py
from celery import Celery

app = Celery('tasks', broker='redis://localhost')

@app.task
def add(x, y):
    return x + y
Or, it can be something that is very slow:
# tasks.py
from time import sleep
from celery import Celery

app = Celery('tasks', broker='redis://localhost')

@app.task
def slow_work(seconds):
    sleep(seconds)
    return seconds
How to Put a Task into a Queue
Tasks can be started by an application with the following:
# main.py
from tasks import add

result = add.delay(25, 50)
print(result.state)
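A few things are worth knowing about this call (a quick sketch; the state names below are Celery's standard ones). Calling .delay() does not run the function in the calling process: it enqueues the task and immediately returns an AsyncResult handle, whose state will typically read 'PENDING' until a worker picks the task up.

# main.py (continued)
result = add.delay(25, 50)   # returns an AsyncResult immediately
print(result.id)             # unique id of the queued task
print(result.state)          # usually 'PENDING' until a worker starts it

# result.get() would block until the task finishes, but it needs a
# result backend to be configured -- see "Storing Results" below.
# print(result.get(timeout=10))  # -> 75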
Installing Celery
Celery is published on the Python Package Index (PyPI). Since the examples in this guide use Redis as the broker, install both Celery and the Redis client with the following command:
pip install celery redis
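If you prefer, Celery also ships optional dependency bundles, so the same thing can be installed in one shot (the [redis] extra pulls in the Redis client for you):

pip install "celery[redis]"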
How to Set Up a Queue
Celery can work with several different message queues, known as brokers. The three officially supported brokers are RabbitMQ, Redis, and Amazon SQS. We'll link to instructions on how to install each of these manually at the bottom of the page.
Setting up and operating a broker yourself can be time-consuming. The fastest way to get started is to use CeleryHost's hosted broker service.
You’ll get a broker connection string that looks like this:
'redis://:abcdef@12345.broker.celeryhost.com/'
Simply use this in your application, like so:
app = Celery(
    'tasks',
    broker='redis://:abcdef@12345.broker.celeryhost.com/'
)
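In a real project you probably don't want credentials hard-coded in source control. One common approach (just a sketch; the CELERY_BROKER_URL variable name here is our own choice, not something Celery requires) is to read the connection string from the environment:

# tasks.py
import os
from celery import Celery

app = Celery(
    'tasks',
    broker=os.environ.get('CELERY_BROKER_URL', 'redis://localhost')
)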
Storing Results
By default, Celery does not store task results at all. It can be configured with a variety of result backends (databases) in which to store them; you can find the supported backends linked at the bottom of this page.
Using CeleryHost’s hosted Redis storage backend is a quick way to get started. You will get a connection string, such as the following:
'redis://:abcdef@12345.broker.celeryhost.com/'
Simply drop this into your application:
app = Celery(
    'tasks',
    broker='redis://:abcdef@12345.broker.celeryhost.com/',
    backend='redis://:abcdef@12345.broker.celeryhost.com/'
)
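With a result backend in place, the AsyncResult returned by .delay() can now be used to wait for and read the task's return value (a small sketch, assuming the add task and the configuration above):

# main.py
from tasks import add

result = add.delay(25, 50)
print(result.get(timeout=10))   # blocks until a worker finishes -> 75
print(result.state)             # 'SUCCESS'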
Launching Celery Workers
After setting up the broker and result backend, you can launch the workers that will perform the tasks.
Celery workers can be launched with the following:
celery -A tasks worker --loglevel=INFO
Run this command from the same directory as your tasks.py file so that Celery can import the tasks module.
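A single worker command actually starts a pool of child processes, and you can run the same command on as many machines as you like, as long as they all point at the same broker. A couple of commonly used options (shown here as a sketch; see the Celery worker docs for the full list):

# run more worker processes on this machine
celery -A tasks worker --loglevel=INFO --concurrency=8

# give each worker a unique name if you run several on one host
celery -A tasks worker --loglevel=INFO -n worker1@%h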
And that’s it!
CeleryHost aims to take the pain out of Celery infrastructure, so you can focus on your tasks.