As you build and scale a Django app you'll inevitably need to run certain tasks periodically and automatically in the background.
Some examples:
- Generating periodic reports
- Clearing cache
- Sending batch e-mail notifications
- Running nightly maintenance jobs
This is one of the few pieces of functionality required for building and scaling a web app that isn't part of the Django core. Fortunately, Celery provides a powerful solution that's fairly easy to implement: Celery Beat.
In the following article, we'll show you how to set up Django, Celery, and Redis with Docker in order to run a custom Django Admin command periodically with Celery Beat.
Objectives
By the end of this article, you should be able to:
- Containerize Django, Celery, and Redis with Docker
- Integrate Celery into a Django app and create tasks
- Write a custom Django Admin command
- Schedule a custom Django Admin command to run periodically via Celery Beat
Project Setup
Clone down the base project from the django-celery-beat repo, and then check out the base branch:
$ git clone https://github.com/testdrivenio/django-celery-beat --branch base --single-branch
$ cd django-celery-beat
Since we'll need to manage four processes in total (Django, Redis, worker, and scheduler), we'll use Docker to simplify our workflow by wiring them up so that they can all be run from one terminal window with a single command.
From the project root, create the images and spin up the Docker containers:
$ docker compose up -d --build
Next, apply the migrations:
$ docker compose exec web python manage.py migrate
Once the containers are up and the migrations have been applied, navigate to http://localhost:1337 to ensure the app works as expected. You should see the following text:
Orders
No orders found!
Take a quick look at the project structure before moving on:
```
├── .gitignore
├── docker-compose.yml
└── project
    ├── Dockerfile
    ├── core
    │   ├── __init__.py
    │   ├── asgi.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── entrypoint.sh
    ├── manage.py
    ├── orders
    │   ├── __init__.py
    │   ├── admin.py
    │   ├── apps.py
    │   ├── migrations
    │   │   ├── 0001_initial.py
    │   │   └── __init__.py
    │   ├── models.py
    │   ├── tests.py
    │   ├── urls.py
    │   └── views.py
    ├── products.json
    ├── requirements.txt
    └── templates
        └── orders
            └── order_list.html
```
Want to learn how to build this project? Check out Dockerizing Django with Postgres, Gunicorn, and Nginx.
Celery and Redis
Now, we need to add containers for Celery, Celery Beat, and Redis.
We'll begin by adding the dependencies to the requirements.txt file:
```
Django==5.0.7
celery==5.4.0
redis==5.0.7
```
Next, add the following to the end of the docker-compose.yml file:
```yaml
  redis:
    image: redis:alpine
  celery:
    build: ./project
    command: celery -A core worker -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  celery-beat:
    build: ./project
    command: celery -A core beat -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
```
We also need to update the web service's `depends_on` section:
```yaml
  web:
    build: ./project
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project/:/usr/src/app/
    ports:
      - 1337:8000
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis  # NEW
```
The full docker-compose.yml file should now look like this:
```yaml
services:
  web:
    build: ./project
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project/:/usr/src/app/
    ports:
      - 1337:8000
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  redis:
    image: redis:alpine
  celery:
    build: ./project
    command: celery -A core worker -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  celery-beat:
    build: ./project
    command: celery -A core beat -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
```
Before building the new containers, we need to configure Celery in our Django app.
Celery Configuration
Setup
In the "core" directory, create a celery.py file and add the following code:
```python
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

app = Celery("core")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```
What's happening here?

- First, we set a default value for the `DJANGO_SETTINGS_MODULE` environment variable so that Celery knows how to find the Django project.
- Next, we created a new Celery instance named `core` and assigned it to a variable called `app`.
- We then loaded the Celery configuration values from the settings object in `django.conf`. We used `namespace="CELERY"` to prevent clashes with other Django settings; in other words, all configuration settings for Celery must be prefixed with `CELERY_`.
- Finally, `app.autodiscover_tasks()` tells Celery to look for tasks in the applications defined in `settings.INSTALLED_APPS`.
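One practical note on `namespace="CELERY"`: Celery strips the prefix and lowercases the rest when it reads the settings object, so `CELERY_BROKER_URL` becomes `broker_url`, `CELERY_BEAT_SCHEDULE` becomes `beat_schedule`, and so on. Once the Redis settings below are in place, you can verify this from the Django shell; a quick sketch:

```python
# The CELERY_-prefixed Django settings surface as lowercase
# options on the Celery app's config object.
from core.celery import app

print(app.conf.broker_url)      # "redis://redis:6379" (from CELERY_BROKER_URL)
print(app.conf.result_backend)  # "redis://redis:6379" (from CELERY_RESULT_BACKEND)
```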
Add the following code to core/__init__.py:
```python
from .celery import app as celery_app

__all__ = ("celery_app",)
```
Lastly, update the core/settings.py file with the following Celery settings so that it can connect to Redis:

```python
CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"
```

Note that the `redis` hostname works because it matches the name of the Redis service in docker-compose.yml; Docker Compose's networking resolves it to the Redis container.
Build the new containers to ensure that everything works:
$ docker compose up -d --build
Take a look at the logs for each service to see that they are ready, without errors:
$ docker compose logs 'web'
$ docker compose logs 'celery'
$ docker compose logs 'celery-beat'
$ docker compose logs 'redis'
If all went well, we now have four containers, each with different services.
Now we're ready to create a sample task to see that it works as it should.
Create a Task
Create a new file called core/tasks.py and add the following code for a sample task that just logs to the console:
```python
from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)


@shared_task
def sample_task():
    logger.info("The sample task just ran.")
```
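Once the worker has been rebuilt so it registers the new tasks module (we'll rebuild the containers shortly), you can also trigger the task by hand from the Django shell instead of waiting on a schedule. An illustrative session:

```
$ docker compose exec web python manage.py shell
>>> from core.tasks import sample_task
>>> sample_task.delay()
<AsyncResult: ...>
```

`delay()` queues the task in Redis, and the Celery worker picks it up; the log line should then appear in the `celery` service's output.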
Schedule the Task
At the end of your settings.py file, add the following code to schedule `sample_task` to run once per minute, using Celery Beat:
```python
CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
}
```
Here, we defined a periodic task using the `CELERY_BEAT_SCHEDULE` setting. We gave the task a name, `sample_task`, and then declared two settings:

- `task` declares which task to run.
- `schedule` sets the interval on which the task should run. This can be an integer, a timedelta, or a crontab (see the sketch after this list). We used a crontab pattern for our task to tell it to run once every minute. You can find more info on Celery's scheduling here.
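For illustration, here are all three forms a `schedule` value can take. The timedelta and numeric variants are alternatives to the crontab used in this article, not additional settings:

```python
from datetime import timedelta

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        # Pick exactly one -- all three of these mean "every minute":
        "schedule": crontab(minute="*/1"),   # crontab pattern
        # "schedule": timedelta(minutes=1),  # timedelta interval
        # "schedule": 60.0,                  # number of seconds
    },
}
```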
Make sure to add the imports:

```python
from celery.schedules import crontab

import core.tasks
```
Restart the containers to pull in the new settings:
$ docker compose up -d --build
Once done, take a look at the celery logs in the container:
$ docker compose logs -f 'celery'
You should see something similar to:
```
celery-1 | -------------- [queues]
celery-1 | .> celery exchange=celery(direct) key=celery
celery-1 |
celery-1 |
celery-1 | [tasks]
celery-1 | . core.tasks.sample_task
```
We can see that Celery picked up our sample task, `core.tasks.sample_task`.
Every minute you should see a row in the log that ends with "The sample task just ran.":
```
celery-1 | [2024-07-18 23:05:00,035: INFO/MainProcess] Task core.tasks.sample_task[a2dfb1a7-ea7b-48b0-9684-67633f6cd8a6] received
celery-1 | [2024-07-18 23:05:00,044: INFO/ForkPoolWorker-8] core.tasks.sample_task[a2dfb1a7-ea7b-48b0-9684-67633f6cd8a6]: The sample task just ran.
```
Custom Django Admin Command
Django provides a number of built-in django-admin commands, like:

- `migrate`
- `startproject`
- `startapp`
- `dumpdata`
- `makemigrations`
Along with the built-in commands, Django also gives us the option to create our own custom commands:
> Custom management commands are especially useful for running standalone scripts or for scripts that are periodically executed from the UNIX crontab or from Windows scheduled tasks control panel.
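For contrast, without Celery Beat the same kind of command could be wired up with a plain crontab entry on the host (the project path here is illustrative):

```
# Run my_custom_command at the top of every hour
0 * * * * cd /path/to/project && python manage.py my_custom_command
```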
So, we'll first configure a new command and then use Celery Beat to run it automatically.
Start by creating a new file called orders/management/commands/my_custom_command.py. Then, add the minimal required code for it to run:
```python
from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        pass
```
`BaseCommand` has a few methods that can be overridden, but the only one that's required is `handle`. `handle` is the entry point for custom commands; in other words, when we run the command, this method is called.
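For context, here's a hypothetical command that also overrides `add_arguments`, the other method you'll reach for most often; the `name` argument is invented for illustration:

```python
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Greet someone from the command line"

    def add_arguments(self, parser):
        # Optional override: declare CLI arguments (argparse under the hood)
        parser.add_argument("name", type=str)

    def handle(self, *args, **options):
        # Required override: parsed arguments arrive in the options dict
        self.stdout.write(f"Hello, {options['name']}!")
```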
To test, we'd normally just add a quick print statement. However, it's recommended to use `self.stdout.write` instead, per the Django documentation:
> When you are using management commands and wish to provide console output, you should write to self.stdout and self.stderr, instead of printing to stdout and stderr directly. By using these proxies, it becomes much easier to test your custom command. Note also that you don't need to end messages with a newline character, it will be added automatically, unless you specify the ending parameter.
So, add a `self.stdout.write` call:
```python
from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        self.stdout.write("My sample command just ran.")  # NEW
```
To test, from the command line, run:
$ docker compose exec web python manage.py my_custom_command
You should see:
My sample command just ran.
With that, let's tie everything together!
Schedule a Custom Command
Now that we've spun up the containers, tested that we can schedule a task to run periodically, and written a sample custom Django Admin command, it's time to configure Celery Beat to run the custom command periodically.
Setup
In the project we have a very basic app called orders. It contains two models, `Product` and `Order`. Let's create a custom command that sends an email report of the confirmed orders from the day.
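For reference, the two models look roughly like this. Treat it as a sketch: the exact field definitions live in the repo's orders/models.py, but the UUID identifier, `product` relation, and `confirmed_date` field can all be seen in the command and its output later in this article:

```python
import uuid

from django.db import models


class Product(models.Model):
    name = models.CharField(max_length=128)

    def __str__(self):
        return self.name


class Order(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    product = models.ForeignKey(Product, on_delete=models.CASCADE)
    confirmed_date = models.DateTimeField(blank=True, null=True)

    def __str__(self):
        return f"Order: {self.id} - product: {self.product.name}"
```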
To begin with, we'll add a few products and orders to the database via the fixture included in this project:
$ docker compose exec web python manage.py loaddata products.json
Next, add some sample orders via the Django Admin interface. To do so, first create a superuser:
$ docker compose exec web python manage.py createsuperuser
Fill in a username, email, and password when prompted. Then navigate to http://127.0.0.1:1337/admin in your web browser. Log in with the superuser you just created and create a couple of orders. Make sure at least one has a `confirmed_date` of today.
Let's create a new custom command for our e-mail report.
Create a file called orders/management/commands/email_report.py:
```python
from datetime import timedelta, time, datetime

from django.core.mail import mail_admins
from django.core.management import BaseCommand
from django.utils import timezone
from django.utils.timezone import make_aware

from orders.models import Order


class Command(BaseCommand):
    help = "Send Today's Orders Report to Admins"

    def handle(self, *args, **options):
        # Compute the date range inside handle() so it's evaluated on every
        # run rather than once at import time (module-level values would go
        # stale inside a long-running Celery worker).
        today = timezone.now()
        tomorrow = today + timedelta(1)
        today_start = make_aware(datetime.combine(today, time()))
        today_end = make_aware(datetime.combine(tomorrow, time()))

        orders = Order.objects.filter(confirmed_date__range=(today_start, today_end))

        if orders:
            message = ""
            for order in orders:
                message += f"{order} \n"

            subject = (
                f"Order Report for {today_start.strftime('%Y-%m-%d')} "
                f"to {today_end.strftime('%Y-%m-%d')}"
            )

            mail_admins(subject=subject, message=message, html_message=None)
            self.stdout.write("E-mail Report was sent.")
        else:
            self.stdout.write("No orders confirmed today.")
```
In the code, we queried the database for orders with a `confirmed_date` of today, combined the orders into a single message for the email body, and used Django's built-in `mail_admins()` helper to send the email to the admins.
In the settings file, add a dummy admin email and set `EMAIL_BACKEND` to use the console backend, so the email is written to stdout:
```python
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
DEFAULT_FROM_EMAIL = "[email protected]"
ADMINS = [("testuser", "[email protected]")]
```
You should now be able to run the new command from the terminal:
$ docker compose exec web python manage.py email_report
And the output should look similar to this:
```
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Subject: [Django] Order Report for 2024-07-18 to 2024-07-19
From: root@localhost
To: [email protected]
Date: Thu, 18 Jul 2024 23:14:17 -0000
Message-ID: <172134445723.51.2267396455700871586@d635c9578b68>

Order: 728763f1-372c-46d5-a29c-4fbbad622a4e - product: Potatoes
Order: 65e0ceb4-0f72-41fd-8717-3a49382af37c - product: Coffee
Order: 78458758-ab2f-4c89-9020-4ab2abca56c7 - product: Rice
-------------------------------------------------------------------------------
E-mail Report was sent.
```
Celery Beat
We now need to create a periodic task to run this command daily.
Add a new task to core/tasks.py:
from celery import shared_task
from celery.utils.log import get_task_logger
from django.core.management import call_command # NEW
logger = get_task_logger(__name__)
@shared_task
def sample_task():
logger.info("The sample task just ran.")
# NEW
@shared_task
def send_email_report():
call_command("email_report", )
So, first we added a `call_command` import, which is used for programmatically calling django-admin commands. In the new task, we then used `call_command` with the name of our custom command as an argument.
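`call_command` mirrors the command line: positional arguments and options pass straight through. A few illustrative examples using commands from this article:

```python
from django.core.management import call_command

call_command("email_report")               # same as: python manage.py email_report
call_command("loaddata", "products.json")  # positional args pass through
call_command("migrate", verbosity=0)       # options become keyword arguments
```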
To schedule this task, open the core/settings.py file and update the `CELERY_BEAT_SCHEDULE` setting to include the new task:
```python
CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
    "send_email_report": {
        "task": "core.tasks.send_email_report",
        "schedule": crontab(hour="*/1"),
    },
}
```
Here we added a new entry to `CELERY_BEAT_SCHEDULE` called `send_email_report`. As we did for our previous task, we declared which task it should run -- e.g., `core.tasks.send_email_report` -- and used a crontab pattern to set the recurrence. Keep in mind that any `crontab` field you don't specify defaults to `"*"`, so `crontab(hour="*/1")` still fires every minute of every hour; for a true hourly run you'd use `crontab(minute=0, hour="*/1")`.
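For comparison, a few crontab patterns, following Celery's documented defaults:

```python
from celery.schedules import crontab

crontab()                          # every minute (all fields default to "*")
crontab(minute=0, hour="*/1")      # at minute 0 of every hour (truly hourly)
crontab(minute=0, hour=7)          # daily at 7:00 a.m.
crontab(minute=0, hour=7, day_of_week="monday")  # Mondays at 7:00 a.m.
```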
Restart the containers to make sure the new settings become active:
$ docker compose up -d --build
Open the logs associated with the `celery` service:
$ docker compose logs -f 'celery'
You should see the `send_email_report` task listed:
```
celery-1 | -------------- [queues]
celery-1 | .> celery exchange=celery(direct) key=celery
celery-1 |
celery-1 |
celery-1 | [tasks]
celery-1 | . core.tasks.sample_task
celery-1 | . core.tasks.send_email_report
```
A minute or so later you should see that the e-mail report is sent:
```
celery-1 | [2024-07-18 23:16:00,104: WARNING/ForkPoolWorker-8] Content-Type: text/plain; charset="utf-8"
celery-1 | MIME-Version: 1.0
celery-1 | Content-Transfer-Encoding: 7bit
celery-1 | Subject: [Django] Order Report for 2024-07-18 to 2024-07-19
celery-1 | From: root@localhost
celery-1 | To: [email protected]
celery-1 | Date: Thu, 18 Jul 2024 23:16:00 -0000
celery-1 | Message-ID: <172134456009.17.2261486624363737199@de3d707b59a9>
celery-1 |
celery-1 | Order: 728763f1-372c-46d5-a29c-4fbbad622a4e - product: Potatoes
celery-1 | Order: 65e0ceb4-0f72-41fd-8717-3a49382af37c - product: Coffee
celery-1 | Order: 78458758-ab2f-4c89-9020-4ab2abca56c7 - product: Rice
celery-1 | [2024-07-18 23:16:00,104: WARNING/ForkPoolWorker-8] -------------------------------------------------------------------------------
celery-1 | [2024-07-18 23:16:00,104: WARNING/ForkPoolWorker-8] E-mail Report was sent.
```
Conclusion
In this article, we guided you through setting up Docker containers for Celery, Celery Beat, and Redis. We then showed how to create a custom Django Admin command and a periodic task with Celery Beat to run that command automatically.
Looking for more?
- Set up Flower to monitor and administer Celery jobs and workers
- Test a Celery task with both unit and integration tests
Grab the code from the repo.