Asynchronous and Periodic tasks with Django using Celery and Redis


Source: Celery

Hey readers, today I will be talking about running asynchronous tasks with Django, using Celery and Redis as our message broker.

Background story

So let's assume you have some time-consuming I/O operations that you can't afford to make the user wait on before a page loads. For example, sending an email, or updating the database with information not relevant to the user (such as analytics collected only for you). These are operations you can't afford to delay the user with, and that's where asynchronous tasks come to the rescue. Without further ado, let's get started.

Notes

  • This tutorial assumes you already have a working Django project; if not, head here to create your Django project.
  • We will be using celeryapp as the name of your Django project and testapp as our app name.

Inits

Install required dependencies
  • Redis: You need to install the Redis server.

If you are on Mac, run the commands below; otherwise, head on to the official Redis installation guide here.

brew install redis
brew services start redis

Confirm that the Redis server is active by running

redis-cli ping

PONG means you are good to go.

  • Pip dependencies
pip install celery redis
Setting up your Django app with Celery
  • Create a file celery.py in the same directory where your celeryapp/settings.py lives
# celeryapp/celery.py

import os
from celery import Celery
from django.conf import settings


os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'celeryapp.settings')

# Celery app
app = Celery('celeryapp')

app.config_from_object('django.conf:settings')

# Autodiscover <app>.tasks.py
app.autodiscover_tasks()


# This is a debug task
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
  • Open __init__.py located in your celeryapp folder
# celeryapp/__init__.py

from .celery import app as tasker
__all__ = ('tasker',)
  • Add the following to your celeryapp/settings.py

# celeryapp/settings.py
...

# Celery

BROKER_URL = 'redis://localhost:6379' # Your Redis default URL
CELERY_RESULT_BACKEND = 'redis://localhost:6379' # Same as above
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Africa/Lagos' # Your timezone
...
  • Then run
celery -A celeryapp worker -l info

If all goes well, you should see our debug_task listed:

...

[tasks]
  celeryapp.celery.debug_task
...

Then shut down celery using [CMD]+C or [CTRL]+C

Celery tasks

A Celery task is a normal Python function wrapped with a Celery task decorator.

Create a tasks.py file in your testapp folder

# testapp/tasks.py

from celeryapp import tasker

@tasker.task(bind=True)
def send_email(self, email):
    print(email)
    print('## sending Email')

Running tasks

There are a few ways of running your task; the two most common are shown below. You can read more here

Method 1
send_email.delay('myemail@email.com')
Method 2
send_email.apply_async(args=['myemail@email.com'], kwargs={})

Read more here on which one to use

Periodic tasks

Scenario: You operate a newsletter that sends emails on the first day of every month.

Open up your tasks.py

Note: You will need celery-beat to set this up. Follow the guide here

# testapp/tasks.py

from celery.schedules import crontab
from celery.task import periodic_task  # provided by older Celery versions
from django.contrib.auth.models import User


@periodic_task(
    run_every=crontab(minute=0, hour=0, day_of_month='1'),
    name="send_newsletter",
    ignore_result=True
)
def send_newsletter():
    for user in User.objects.all():
        ...

BONUS

I wrote a GitHub gist that makes Celery tasks easier to run and track. You can find the gist at gist.github.com/sirrobot01/091a9bd4d8441dc5..

Usage

# testapp/views.py

from django.http import JsonResponse


def home(request):
    data = {"result": {
        "a": [1, 2, 3, 4],
        "b": [5, 6, 7, 8]
    }}

    # Fire off the task in the background; the response returns immediately
    task = TaskRunner('send_email', send_email, args=["myemail@gmail.com"]).run()

    return JsonResponse(data=data)

Conclusion

You can read more about Celery in the official documentation.

You can also check out the repository used for this tutorial at github.com/sirrobot01/celery-tutorial

Feel free to drop your questions and I will be willing to answer them.

Follow me on Twitter @sirrobot01

Thank you.