Categories
django

Sending Emails Asynchronously with Django-Celery-Email and RabbitMQ

Update 2022: I think using celery directly, as opposed to having djcelery abstract the implementation away, is better. Check the celery docs, then look at how to implement celery with django.

I am using django-registration-redux to register and activate users on my website. It sends emails to users to activate and reset their passwords.
Unfortunately, emails are sent synchronously as part of the request by default – the main thread blocks until the email has been sent, so requests take quite a while.
To fix this we want all email tasks to be added to a queue to be sent later, and to respond to the frontend request immediately.
To do this we will make use of django-celery-email.

You can use celery as an interface to your task queue for any python task (tasks you want to do asynchronously). Learn more about celery standalone basics at that link.

Celery requires a message broker to send and receive messages, so you have a choice of what the actual technology backing the queue will be:

  • rabbitmq
  • redis
  • AmazonSQS

We will use rabbitmq. Installing this on ubuntu right off the bat looks tricky: you need a specific version of erlang and have to add an apt repository to get the latest version. I’m going to keep my life simple and:

sudo apt install rabbitmq-server

Apparently if you do this on ubuntu 16.04, rabbitmq-server starts and is enabled automatically:

Starting rabbitmq-server: SUCCESS

Creating the Celery File

Create a celery.py file in your project root. If you are using a special settings/config location you can put it there next to wsgi.py.

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

This code is taken from the Simple is Better than Complex post on RabbitMQ and Celery setup.

You then need to ensure that celery is imported when django starts, in the project's __init__.py:

from .celery import app as celery_app

__all__ = ['celery_app']

Install Django Celery Email

pip install django-celery-email

This will install the django-celery-email package along with celery as a dependency. Then add this to your settings file, e.g. settings/staging.py:

# Celery email sending

CELERY_BROKER_URL = 'amqp://localhost'

INSTALLED_APPS += (
    'djcelery_email',
    'django_celery_results'
)

CELERY_EMAIL_TASK_CONFIG = {
    'name': 'djcelery_email_send',
    'ignore_result': False,
}

CELERY_RESULT_BACKEND = 'django-db'

EMAIL_BACKEND = 'djcelery_email.backends.CeleryEmailBackend'

If you want the results (AsyncResult) of tasks to be stored in your database, then set the backend: CELERY_RESULT_BACKEND = 'django-db'

Then migrate: ./manage.py migrate

Getting Things to Work

For the messaging queue system to work, both rabbitmq-server and celery need to be running. rabbitmq-server usually creates a systemd service and starts on boot. Celery, which was installed with pip, will have to be daemonised. You can test it out by running a worker directly (where config is the project name):

celery -A config worker -l info

On a production system with ubuntu 16.04 you can create a daemon:

Create a .celery_env file in the project root:

DJANGO_SETTINGS_MODULE=config.settings.staging

# Name of nodes to start
# here we have a single node
CELERYD_NODES=w1

# Absolute or relative path to the 'celery' command:
CELERY_BIN=/var/www/django_project/env/bin/celery

# App instance to use
# comment out this line if you don't use an app
CELERY_APP=config

# How to call manage.py
CELERYD_MULTI=multi

# Extra command-line arguments to the worker
CELERYD_OPTS=--time-limit=300 --concurrency=8

# - %n will be replaced with the first part of the nodename.
# - %I will be replaced with the current child process index
#   and is important when using the prefork pool to avoid race conditions.
CELERYD_LOG_FILE=/var/www/django_project/log/celery.log
CELERYD_LOG_LEVEL=INFO

Then create the systemd unit:

sudo vim /etc/systemd/system/celery.service

With the following content:


[Unit]
Description=celery service
After=network.target

[Service]
Type=forking
User=staging
Group=www-data
EnvironmentFile=-/var/www/kid_hr/.celery_env
WorkingDirectory=/var/www/kid_hr
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
  -A ${CELERY_APP} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
  -A ${CELERY_APP} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'

[Install]
WantedBy=multi-user.target

Finally we need to ensure that the service starts after reboot:

sudo systemctl enable celery.service

And start the service:

sudo systemctl start celery.service

Issues

If things aren’t working, try sending an email from the django shell. If you get this error:

Task is waiting for execution or unknown.
Any task id that is not known is implied to be in the pending state

You need to ensure the celery service has started; also try restarting the entire box. Otherwise check the log files for any issues.

Permission Issues

    PermissionError: [Errno 13] Permission denied: '/var/run/celery'
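This usually means the worker's user cannot write its pid files. One possible fix, assuming the User=staging / Group=www-data from the unit above (the paths and ownership here are assumptions for this setup), is to create the directories and hand them to that user:

```shell
# Create the pid and log directories and give them to the worker's user
sudo mkdir -p /var/run/celery /var/log/celery
sudo chown staging:www-data /var/run/celery /var/log/celery
```

Note that /var/run is typically a tmpfs that is wiped on reboot, so a more durable option is adding RuntimeDirectory=celery to the [Service] section of the unit, which has systemd recreate /run/celery with the right ownership on each start.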
