Sending Emails Asynchronously with Django-Celery-Email and RabbitMQ

I am using django-registration-redux to register and activate users. It sends activation and password-reset emails. Unfortunately it sends email synchronously: the request blocks until the message has been handed off to the mail server, so requests take quite a while. We want all email to go through Celery, and we don't want to have to write a specific task for every message, so we make use of django-celery-email.

Celery lets us send the messages asynchronously: the view hands the email off to a background worker and returns immediately.
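The pattern Celery implements can be sketched with nothing but the standard library: the request thread only enqueues the message and returns, while a worker thread does the slow send in the background. This is a toy illustration, not how Celery is implemented; slow_smtp_send and handle_request are hypothetical stand-ins for the mail backend and the Django view:

```python
import queue
import threading
import time

outbox = queue.Queue()
sent = []

def slow_smtp_send(message):
    # Hypothetical stand-in for a real SMTP call that takes a while.
    time.sleep(0.1)
    sent.append(message)

def worker():
    # Drains the queue in the background, like a celery worker process.
    while True:
        message = outbox.get()
        if message is None:  # shutdown sentinel
            break
        slow_smtp_send(message)
        outbox.task_done()

threading.Thread(target=worker, daemon=True).start()

def handle_request(message):
    # The "view": enqueue and return immediately instead of blocking.
    outbox.put(message)
    return "202 Accepted"

status = handle_request("activation email for alice")
outbox.join()  # only this demo waits; a real view would not
print(status, sent)
```

The point is that handle_request returns as soon as the message is on the queue; the 0.1 s "SMTP" delay happens on the worker thread, not the request thread.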

Celery requires a broker to send and receive messages, so you have a choice:

  • RabbitMQ
  • Redis
  • Amazon SQS

We will use RabbitMQ. Installing the latest version on Ubuntu looks tricky right off the bat: you need a specific version of Erlang and you have to add apt repositories. I'm going to keep my life simple and install the distribution package:


sudo apt install rabbitmq-server

Apparently if you do this on Ubuntu 16.04, RabbitMQ is already running in the background after the install, although I didn't see this message:


Starting rabbitmq-server: SUCCESS

Creating the Celery File

Create a celery.py file in your project package, next to wsgi.py. If you are using a special settings/config layout you can put it there, alongside wsgi.py.


import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

This code is taken from the Simple is Better than Complex guide to RabbitMQ and Celery setup. The namespace='CELERY' argument means Celery reads any Django setting prefixed with CELERY_. You then need to ensure that celery is imported when Django starts, in the project's __init__.py:


from .celery import app as celery_app

__all__ = ['celery_app']

Install Django Celery Email

pip install django-celery-email

This will install the django-celery-email package along with the celery dependencies.

Then add this to your settings file (here settings/staging.py):


# Celery email sending

CELERY_BROKER_URL = 'amqp://localhost'

INSTALLED_APPS += (
    'djcelery_email',
    'django_celery_results',
)

CELERY_EMAIL_TASK_CONFIG = {
    'name': 'djcelery_email_send',
    'ignore_result': False,
}

CELERY_RESULT_BACKEND = 'django-db'

EMAIL_BACKEND = 'djcelery_email.backends.CeleryEmailBackend'
I've also set CELERY_RESULT_BACKEND to 'django-db' (provided by django-celery-results) so that you can view the result of an AsyncResult. That backend stores results in the database, so migrate to create its tables:

./manage.py migrate
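A note on the broker URL above: 'amqp://localhost' names only the host, and the missing pieces fall back to RabbitMQ's defaults, which as far as I know are the guest user, port 5672 and the '/' vhost. A quick stdlib check of what the URL itself actually carries:

```python
from urllib.parse import urlsplit

broker_url = 'amqp://localhost'
parts = urlsplit(broker_url)

# Only the scheme and host are present in the URL itself.
print(parts.scheme, parts.hostname, parts.port, parts.username)

# The rest is filled in by the broker's conventional defaults:
defaults = {
    'port': parts.port or 5672,          # standard AMQP port
    'user': parts.username or 'guest',   # RabbitMQ's default local user
    'vhost': parts.path.lstrip('/') or '/',
}
print(defaults)
```

On a production box you would normally create a dedicated user and vhost and spell them out in the URL rather than rely on guest, which RabbitMQ only allows from localhost anyway.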

Getting Things to Work

For the messaging queue system to work rabbitmq-server and celery need to be running.

rabbitmq-server usually installs a systemd unit and starts on boot. Celery, which was installed with pip, will have to be daemonized separately.

You can test it out by running a worker directly, where config is the project name:

celery -A config worker -l info

On a production system running Ubuntu 16.04 you can create a daemon.

Create a .celery_env file in the project root:


DJANGO_SETTINGS_MODULE=config.settings.staging

# Name of nodes to start
# here we have a single node
CELERYD_NODES="w1"

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/var/www/django_project/env/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="config"

# Which celery sub-command manages the workers
CELERYD_MULTI="multi"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# - %n will be replaced with the first part of the nodename.
# - %I will be replaced with the current child process index
#   and is important when using the prefork pool to avoid race conditions.
CELERYD_LOG_FILE="/var/www/django_project/log/celery.log"
CELERYD_LOG_LEVEL="INFO"

Then create the systemd unit:

sudo vim /etc/systemd/system/celery.service

With the following content:


[Unit]
Description=celery service
After=network.target

[Service]
Type=forking
User=staging
Group=www-data
EnvironmentFile=-/var/www/django_project/.celery_env
WorkingDirectory=/var/www/django_project
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
  -A ${CELERY_APP} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
  -A ${CELERY_APP} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} ${CELERYD_OPTS}'

[Install]
WantedBy=multi-user.target

Finally, we need to ensure that the service starts after a reboot:

sudo systemctl enable celery.service

Issues

If things aren't working, try sending an email from the Django shell. If you get this error:


Task is waiting for execution or unknown. Any task id that is not known is implied to be in the pending state

you need to ensure that the celery service has started; it may also be worth restarting the entire box so everything comes up in order.
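That error message reflects how Celery reports state: a result backend with no record of a task id simply reports it as PENDING, so "pending" can equally mean "queued", "no worker ever picked it up", or "the id is wrong". A toy illustration of that lookup, using a plain dict as a stand-in for the result store:

```python
# Toy result store: unknown ids are indistinguishable from queued tasks.
results = {}

def task_state(task_id):
    # Mirrors celery's behaviour: ids with no stored result default to PENDING.
    return results.get(task_id, 'PENDING')

results['abc-123'] = 'SUCCESS'   # a task the worker actually ran

print(task_state('abc-123'))     # known id reports its real state
print(task_state('no-such-id'))  # unknown id still looks PENDING
```

So a task stuck in PENDING forever is usually a worker that never started, not a task that is genuinely still queued.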

Otherwise check the log files for any issues.
