Introduction
In this tutorial I will provide a general understanding of why Celery task queues are valuable, along with how to use Celery in conjunction with Redis in a Django application. To demonstrate implementation specifics I will build a minimal image processing application that generates thumbnails of images submitted by users.
The following topics will be covered:
- Background on Message Queues with Celery and Redis
- Local Dev Setup with Django, Celery, and Redis
- Creating Image Thumbnails within a Celery Task
- Deploying to an Ubuntu server
The code for this example can be found on GitHub, along with installation and setup instructions, if you just want to jump right into a functionally complete application; otherwise, for the remainder of the article I will be taking you through how to build everything from scratch.
Background on Message Queues with Celery and Redis
Celery is a Python-based task queuing software package that enables execution of asynchronous computational workloads driven by information contained in messages that are produced in application code (Django in this example) and destined for a Celery task queue. Celery can also be used to execute repeatable, periodic (i.e., scheduled) tasks, but that will not be the focus of this article.
Celery is best used in conjunction with a storage solution that is often referred to as a message broker. A common message broker used with Celery is Redis, which is a performant, in-memory, key-value data store. Specifically, Redis is used to store messages produced by the application code describing the work to be done in the Celery task queue. Redis also serves as storage for results coming off the Celery queues, which are then retrieved by consumers of the queue.
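To make the broker's role a bit more concrete, here is a minimal sketch, using the redis-py client that gets installed later in this tutorial, that peeks at the broker. It assumes a local Redis server and Celery's default queue name of "celery"; pending task messages only sit in that list until a worker pops them off.
# peek_at_broker.py -- illustrative sketch only, assumes the defaults described above
import redis

r = redis.Redis(host='localhost', port=6379)

# pending task messages live in a Redis list named after the queue ('celery' by default)
print(r.llen('celery'))           # number of messages waiting for a worker
print(r.lrange('celery', 0, 0))   # raw JSON payload of the next message, if any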
Local Dev Setup with Django, Celery, and Redis
I will start off with the hardest part first which is installing Redis.
Installing Redis on Windows
- Download the Redis zip file and unzip it into some directory
- Find the file named redis-server.exe and double click it to launch the server in a command window
- Similarly, find the file named redis-cli.exe and double click it to open the program in a separate command window
- Within the command window running the CLI client, test that the client can talk to the server by issuing the command ping; if all goes well a response of PONG should be returned
Installing Redis on Mac OSX / Linux
- Download the Redis tarball file and extract it in some directory
- Run the makefile with make install to build the program
- Open up a terminal window and run the redis-server command
- In another terminal window run redis-cli
- Within the terminal window running the CLI client, test that the client can talk to the server by issuing the command ping; if all goes well a response of PONG should be returned
Install Python Virtual Env and Dependencies
I can now move on to creating a Python3 virtual environment and installing the dependency packages necessary for this project.
To begin I will create a directory called image_parroter to house things, then inside it I will create my virtual environment. All commands from here forward are the UNIX flavor, but most if not all will be the same in a Windows environment.
$ mkdir image_parroter
$ cd image_parroter
$ python3 -m venv venv
$ source venv/bin/activate
With the virtual environment now activated I can install the Python packages.
(venv) $ pip install Django Celery redis Pillow django-widget-tweaks
(venv) $ pip freeze > requirements.txt
- Pillow is a non-Celery-related Python package for image processing that I will use later in this tutorial to demonstrate a real-world use case for Celery tasks.
- Django Widget Tweaks is a Django plugin for providing flexibility in how form inputs are rendered.
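Before wiring anything into Django, a quick sanity check can confirm that the redis client package installed above can reach the Redis server from the previous section. This is a minimal throwaway sketch (not part of the project), assuming Redis is running locally on its default port:
# quick_redis_check.py -- hypothetical throwaway script, not part of the project
import redis

# connect to the locally running redis-server on its default port
r = redis.Redis(host='localhost', port=6379)

# ping() returns True when the server is reachable, mirroring the PONG test above
print(r.ping())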
Setting Up the Django Project
Moving on, I create a Django project named image_parroter then a Django app named thumbnailer.
(venv) $ django-admin startproject image_parroter
(venv) $ cd image_parroter
(venv) $ python manage.py startapp thumbnailer
At this point the directory structure looks as follows:
$ tree -I venv
.
└── image_parroter
├── image_parroter
│ ├── __init__.py
│ ├── settings.py
│ ├── urls.py
│ └── wsgi.py
├── manage.py
└── thumbnailer
├── __init__.py
├── admin.py
├── apps.py
├── migrations
│ └── __init__.py
├── models.py
├── tests.py
└── views.py
To integrate Celery within this Django project I add a new module, image_parroter/image_parroter/celery.py, following the conventions described in the Celery docs. Within this new Python module I import the os module and the Celery class from the celery package.
The os module is used to set an environment variable named DJANGO_SETTINGS_MODULE pointing to the Django project's settings module. Following that I instantiate the Celery class to create the celery_app instance. I then update the Celery application's configuration with settings I will soon place in the Django project's settings file, identifiable by a 'CELERY_' prefix. Finally, I tell the newly created celery_app instance to auto-discover tasks within the project.
The completed celery.py module is shown below:
# image_parroter/image_parroter/celery.py
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'image_parroter.settings')
celery_app = Celery('image_parroter')
celery_app.config_from_object('django.conf:settings', namespace='CELERY')
celery_app.autodiscover_tasks()
Now over in the project's settings.py module, at the very bottom, I define a section for Celery settings and add the settings you see below. These settings tell Celery to use Redis as the message broker and where to connect to it. They also tell Celery to expect the messages passed between the Celery task queues and the Redis message broker to be of the mime type application/json.
# image_parroter/image_parroter/settings.py
... skipping to the bottom
# celery
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
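As an aside, if the Redis server ever requires a password, or you want to keep Celery's broker and result data in specific Redis databases, the standard Redis URL format accepts both. A hypothetical variant (not used in this tutorial) would look like this:
# image_parroter/image_parroter/settings.py (hypothetical variant, not used in this tutorial)
CELERY_BROKER_URL = 'redis://:some-password@localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://:some-password@localhost:6379/1'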
Next, I need to ensure that the previously created and configured Celery application gets injected into the Django application when it runs. This is done by importing the Celery application within the Django project's main __init__.py script and explicitly registering it as a namespaced symbol within the image_parroter Django package.
# image_parroter/image_parroter/__init__.py
from .celery import celery_app
__all__ = ('celery_app',)
I continue to follow the suggested conventions by adding a new module named tasks.py within the thumbnailer application. Inside the tasks.py module I import the shared_task decorator and use it to define a Celery task function called adding_task, as shown below.
# image_parroter/thumbnailer/tasks.py
from celery import shared_task
@shared_task
def adding_task(x, y):
    return x + y
Lastly, I need to add the thumbnailer app to the list of INSTALLED_APPS in the image_parroter project's settings.py module. While I'm in there I also add the widget_tweaks application, which will be used to control the rendering of the form input I'll use later to allow users to upload files.
# image_parroter/image_parroter/settings.py
... skipping to the INSTALLED_APPS
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'thumbnailer.apps.ThumbnailerConfig',
    'widget_tweaks',
]
I can now test things out using a few simple commands across three terminals.
In one terminal I need to have the redis-server running, like so:
$ redis-server
48621:C 21 May 21:55:23.706 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
48621:C 21 May 21:55:23.707 # Redis version=4.0.8, bits=64, commit=00000000, modified=0, pid=48621, just started
48621:C 21 May 21:55:23.707 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
48621:M 21 May 21:55:23.708 * Increased maximum number of open files to 10032 (it was originally set to 2560).
_._
_.-``__ ''-._
_.-`` `. `_. ''-._ Redis 4.0.8 (00000000/0) 64 bit
.-`` .-```. ```\/ _.,_ ''-._
( ' , .-` | `, ) Running in standalone mode
|`-._`-...-` __...-.``-._|'` _.-'| Port: 6379
| `-._ `._ / _.-' | PID: 48621
`-._ `-._ `-./ _.-' _.-'
|`-._`-._ `-.__.-' _.-'_.-'|
| `-._`-._ _.-'_.-' | http://redis.io
`-._ `-._`-.__.-'_.-' _.-'
|`-._`-._ `-.__.-' _.-'_.-'|
| `-._`-._ _.-'_.-' |
`-._ `-._`-.__.-'_.-' _.-'
`-._ `-.__.-' _.-'
`-._ _.-'
`-.__.-'
48621:M 21 May 21:55:23.712 # Server initialized
48621:M 21 May 21:55:23.712 * Ready to accept connections
In a second terminal, with the previously created Python virtual environment active and from the project's root directory (the same one that contains the manage.py script), I launch the Celery worker program.
(venv) $ celery worker -A image_parroter --loglevel=info
-------------- [email protected] v4.3.0 (rhubarb)
---- **** -----
--- * *** * -- Darwin-18.5.0-x86_64-i386-64bit 2019-05-22 03:01:38
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: image_parroter:0x110b18eb8
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. thumbnailer.tasks.adding_task
In the third and final terminal, again with the Python virtual environment active, I can launch the Django Python shell and test out my adding_task, like so:
(venv) $ python manage.py shell
Python 3.6.6 |Anaconda, Inc.| (default, Jun 28 2018, 11:07:29)
>>> from thumbnailer.tasks import adding_task
>>> task = adding_task.delay(2, 5)
>>> print(f"id={task.id}, state={task.state}, status={task.status}")
id=86167f65-1256-497e-b5d9-0819f24e95bc, state=SUCCESS, status=SUCCESS
>>> task.get()
7
Note the use of the .delay(...) method on the adding_task object. This is the common way to pass any necessary parameters to the task being worked with, as well as to initiate sending it off to the message broker and task queue. The result of calling the .delay(...) method is a promise-like return value of the type celery.result.AsyncResult. This return value holds information such as the id of the task, its execution state, and its status, along with the ability to access any results produced by the task via the .get() method, as shown in the example.
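For a bit more context on this API (a sketch of standard Celery calls, not something the thumbnailer app depends on), the same task can also be sent with apply_async, which accepts extra options such as a countdown delay, and the returned AsyncResult can be polled without blocking:
>>> from thumbnailer.tasks import adding_task
>>> task = adding_task.apply_async(args=(2, 5), countdown=10)  # run roughly 10 seconds from now
>>> task.ready()          # False until the worker has finished the task
>>> task.successful()     # True once it finished without raising an exception
>>> task.get(timeout=30)  # blocks for up to 30 seconds waiting on the result
7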
Creating Image Thumbnails within a Celery Task
Now that the boilerplate setup to integrate a Redis-backed Celery instance into the Django application is out of the way, I can move on to demonstrating some more useful functionality with the previously mentioned thumbnailer application.
Back in the tasks.py module, I import the Image class from the PIL package, then add a new task called make_thumbnails, which accepts an image file path and a list of 2-tuple width and height dimensions to create thumbnails of.
# image_parroter/thumbnailer/tasks.py
import os
from zipfile import ZipFile
from celery import shared_task
from PIL import Image
from django.conf import settings
@shared_task
def make_thumbnails(file_path, thumbnails=[]):
    # work inside the media images directory so zip entries use bare file names
    os.chdir(settings.IMAGES_DIR)
    path, file = os.path.split(file_path)
    file_name, ext = os.path.splitext(file)  # ext keeps its leading dot (e.g. '.jpg')

    zip_file = f"{file_name}.zip"
    results = {'archive_path': f"{settings.MEDIA_URL}images/{zip_file}"}
    try:
        img = Image.open(file_path)
        zipper = ZipFile(zip_file, 'w')
        zipper.write(file)
        os.remove(file_path)
        for w, h in thumbnails:
            img_copy = img.copy()
            img_copy.thumbnail((w, h))
            thumbnail_file = f'{file_name}_{w}x{h}{ext}'
            img_copy.save(thumbnail_file)
            zipper.write(thumbnail_file)
            os.remove(thumbnail_file)

        img.close()
        zipper.close()
    except IOError as e:
        print(e)

    return results
The above make_thumbnails task simply loads the input image file into a Pillow Image instance, loops over the dimensions list passed to the task creating a thumbnail for each, and adds each thumbnail to a zip archive while cleaning up the intermediate files. A simple dictionary is returned specifying the URL from which the zip archive of thumbnails can be downloaded.
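The task can be exercised from the Django shell the same way adding_task was earlier. The file path below is purely a hypothetical example and must point at an image that actually exists inside IMAGES_DIR (remember the task deletes the original after zipping it):
>>> from thumbnailer.tasks import make_thumbnails
>>> task = make_thumbnails.delay('/path/to/media/images/example.jpg', thumbnails=[(128, 128), (64, 64)])
>>> task.get()
{'archive_path': '/media/images/example.zip'}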
With the celery task defined I move on to building out the Django views to serve up a template with a file upload form.
To start I give the Django project a MEDIA_ROOT location where image files and zip archives can reside (I used this in the example task above), as well as a MEDIA_URL where the content can be served from. In the image_parroter/settings.py module, I add the MEDIA_ROOT, MEDIA_URL, and IMAGES_DIR settings, then provide the logic to create these locations if they do not exist.
# image_parroter/settings.py
... skipping down to the static files section
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.2/howto/static-files/
STATIC_URL = '/static/'
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.abspath(os.path.join(BASE_DIR, 'media'))
IMAGES_DIR = os.path.join(MEDIA_ROOT, 'images')
if not os.path.exists(MEDIA_ROOT) or not os.path.exists(IMAGES_DIR):
    os.makedirs(IMAGES_DIR)
Inside the thumbnailer/views.py module, I import the django.views.View class and use it to create a HomeView class containing get and post methods, as shown below.
The get method simply returns the home.html template, to be created shortly, and hands it a FileUploadForm comprising an ImageField field, as seen above the HomeView class.
The post method constructs the FileUploadForm object using the data sent in the request and checks its validity. If valid, it saves the uploaded file to the IMAGES_DIR and kicks off a make_thumbnails task, grabbing the task id and status to pass to the template; otherwise it returns the form with its errors to the home.html template.
# thumbnailer/views.py
import os
from celery import current_app
from django import forms
from django.conf import settings
from django.http import JsonResponse
from django.shortcuts import render
from django.views import View
from .tasks import make_thumbnails
class FileUploadForm(forms.Form):
    image_file = forms.ImageField(required=True)

class HomeView(View):
    def get(self, request):
        form = FileUploadForm()
        return render(request, 'thumbnailer/home.html', { 'form': form })

    def post(self, request):
        form = FileUploadForm(request.POST, request.FILES)
        context = {}

        if form.is_valid():
            # save the uploaded image into the media images directory
            file_path = os.path.join(settings.IMAGES_DIR, request.FILES['image_file'].name)

            with open(file_path, 'wb+') as fp:
                for chunk in request.FILES['image_file'].chunks():
                    fp.write(chunk)

            # queue the thumbnail task and hand its id / status to the template
            task = make_thumbnails.delay(file_path, thumbnails=[(128, 128)])

            context['task_id'] = task.id
            context['task_status'] = task.status

            return render(request, 'thumbnailer/home.html', context)

        context['form'] = form

        return render(request, 'thumbnailer/home.html', context)

class TaskView(View):
    def get(self, request, task_id):
        # look up the AsyncResult for the given task id and report its status
        task = current_app.AsyncResult(task_id)
        response_data = {'task_status': task.status, 'task_id': task.id}

        if task.status == 'SUCCESS':
            response_data['results'] = task.get()

        return JsonResponse(response_data)
Below the HomeView class I have placed a TaskView class, which will be used via an AJAX request to check the status of the make_thumbnails task. Here you will notice that I have imported the current_app object from the celery package and used it to retrieve the task's AsyncResult object associated with the task_id from the request. I create a response_data dictionary of the task's status and id, then, if the status indicates the task has executed successfully, I fetch the results by calling the get() method of the AsyncResult object, assigning it to the results key of response_data to be returned as JSON to the HTTP requester.
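For clarity, once the make_thumbnails task has finished, the JSON this endpoint returns looks roughly like the following; the task id and archive file name shown here are just example values derived from the code above.
{
  "task_status": "SUCCESS",
  "task_id": "86167f65-1256-497e-b5d9-0819f24e95bc",
  "results": {"archive_path": "/media/images/example.zip"}
}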
Before I can make the template UI I need to map the above Django view classes to some sensible URLs. I start by adding a urls.py module inside the thumbnailer application and define the following URLs:
# thumbnailer/urls.py
from django.urls import path
from . import views
urlpatterns = [
    path('', views.HomeView.as_view(), name='home'),
    path('task/<str:task_id>/', views.TaskView.as_view(), name='task'),
]
Then over in the project's main URL config I need to include the application level URLs as well as make it aware of the media URL, like so:
# image_parroter/urls.py
from django.contrib import admin
from django.urls import path, include
from django.conf import settings
from django.conf.urls.static import static
urlpatterns = [
    path('admin/', admin.site.urls),
    path('', include('thumbnailer.urls')),
] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
Next I begin building out a simple template view for a user to submit an image file, check the status of the submitted make_thumbnails task, and initiate a download of the resulting thumbnails. To start off, I need to create a directory to house this single template within the thumbnailer directory, as follows:
(venv) $ mkdir -p thumbnailer/templates/thumbnailer
Then within this templates/thumbnailer directory I add a template named home.html. Inside home.html I start off by loading the widget_tweaks template tags, then define the HTML, pulling in a CSS framework called Bulma as well as a JavaScript library called Axios. In the body of the HTML page I provide a title, a placeholder for displaying an in-progress message, and the file upload form.
<!-- templates/thumbnailer/home.html -->
{% load widget_tweaks %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="ie=edge">
<title>Thumbnailer</title>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/bulma/0.7.5/css/bulma.min.css">
<script src="https://cdn.jsdelivr.net/npm/vue"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.18.0/axios.min.js"></script>
<script defer src="https://use.fontawesome.com/releases/v5.0.7/js/all.js"></script>
</head>
<body>
<nav class="navbar" role="navigation" aria-label="main navigation">
<div class="navbar-brand">
<a class="navbar-item" href="/">
Thumbnailer
</a>
</div>
</nav>
<section class="hero is-primary is-fullheight-with-navbar">
<div class="hero-body">
<div class="container">
<h1 class="title is-size-1 has-text-centered">Thumbnail Generator</h1>
<p class="subtitle has-text-centered" id="progress-title"></p>
<div class="columns is-centered">
<div class="column is-8">
<form action="{% url 'home' %}" method="POST" enctype="multipart/form-data">
{% csrf_token %}
<div class="file is-large has-name">
<label class="file-label">
{{ form.image_file|add_class:"file-input" }}
<span class="file-cta">
<span class="file-icon"><i class="fas fa-upload"></i></span>
<span class="file-label">Browse image</span>
</span>
<span id="file-name" class="file-name"
style="background-color: white; color: black; min-width: 450px;">
</span>
</label>
<input class="button is-link is-large" type="submit" value="Submit">
</div>
</form>
</div>
</div>
</div>
</div>
</section>
<script>
var file = document.getElementById('{{form.image_file.id_for_label}}');
file.onchange = function() {
if(file.files.length > 0) {
document.getElementById('file-name').innerHTML = file.files[0].name;
}
};
</script>
{% if task_id %}
<script>
var taskUrl = "{% url 'task' task_id=task_id %}";
var dots = 1;
var progressTitle = document.getElementById('progress-title');
updateProgressTitle();
var timer = setInterval(function() {
updateProgressTitle();
axios.get(taskUrl)
.then(function(response){
var taskStatus = response.data.task_status
if (taskStatus === 'SUCCESS') {
clearTimer('Check downloads for results');
var url = window.location.protocol + '//' + window.location.host + response.data.results.archive_path;
var a = document.createElement("a");
a.target = '_BLANK';
document.body.appendChild(a);
a.style = "display: none";
a.href = url;
a.download = 'results.zip';
a.click();
document.body.removeChild(a);
} else if (taskStatus === 'FAILURE') {
clearTimer('An error occurred');
}
})
.catch(function(err){
console.log('err', err);
clearTimer('An error occurred');
});
}, 800);
function updateProgressTitle() {
dots++;
if (dots > 3) {
dots = 1;
}
progressTitle.innerHTML = 'processing images ';
for (var i = 0; i < dots; i++) {
progressTitle.innerHTML += '.';
}
}
function clearTimer(message) {
clearInterval(timer);
progressTitle.innerHTML = message;
}
</script>
{% endif %}
</body>
</html>
At the bottom of the body element I have added JavaScript to provide some additional behavior. First I create a reference to the file input field and register a change listener, which simply adds the name of the selected file to the UI once it is selected.
Next comes the more relevant part. I use Django's template if tag to check for the presence of a task_id being handed down from the HomeView class, which indicates a response after a make_thumbnails task has been submitted. I then use the Django url template tag to construct an appropriate task status check URL and begin an interval-timed AJAX request to that URL using the Axios library I mentioned earlier.
If a task's status is reported as "SUCCESS" I inject a download link into the DOM and cause it to fire, triggering the download and clearing the interval timer. If the status is "FAILURE" I simply clear the interval, and if the status is neither "SUCCESS" nor "FAILURE" I do nothing until the next interval fires.
At this point I can open yet another terminal, once again with the Python virtual environment active, and start the Django dev server, as shown below:
(venv) $ python manage.py runserver
- The redis-server and Celery worker terminals described earlier need to be running also, and if you have not restarted the Celery worker since adding the make_thumbnails task you will want to Ctrl+C to stop the worker and then issue celery worker -A image_parroter --loglevel=info again to restart it. Celery workers must be restarted each time a Celery task-related code change is made.
Now I can load up the home.html view in my browser at http://localhost:8000, submit an image file, and the application should respond with a results.zip archive containing the original image and a 128x128 pixel thumbnail.
Deploying to an Ubuntu Server
To complete this article I will demonstrate how to install and configure this Django application, which utilizes Redis and Celery for asynchronous background tasks, on an Ubuntu 18.04 LTS server.
Once SSH'd onto the server I update it then install the necessary packages.
# apt-get update
# apt-get install python3-pip python3-dev python3-venv nginx redis-server -y
I also make a user named webapp, which gives me a home directory to install the Django project into.
# adduser webapp
After inputting the user data I then add the webapp user to the sudo and www-data groups, switch to the webapp user, then cd into its home directory.
# usermod -aG sudo webapp
# usermod -aG www-data webapp
$ su webapp
$ cd
Inside the webapp user's home directory I can clone the image_parroter GitHub repo, cd into the repo, create a Python virtual environment, activate it, then install the dependencies from the requirements.txt file.
$ git clone https://github.com/amcquistan/image_parroter.git
$ cd image_parroter
$ python3 -m venv venv
$ . venv/bin/activate
(venv) $ pip install -r requirements.txt
In addition to the requirements I just installed I want to add a new one for the uWSGI web application container that will serve the Django application.
(venv) $ pip install uWSGI
Before moving on any further, now would be a good time to update the settings.py file: flip the DEBUG value to False and add the server's IP address to the list of ALLOWED_HOSTS.
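A minimal sketch of those two changes is shown below; the IP address is just a placeholder, so substitute the server's actual public IP address or domain name.
# image_parroter/image_parroter/settings.py (production tweaks; the IP below is a placeholder)
DEBUG = False

ALLOWED_HOSTS = ['203.0.113.10']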
After that, move into the Django image_parroter project directory (the one containing the wsgi.py module), add a new file named uwsgi.ini for holding the uWSGI configuration settings, and place the following in it:
# uwsgi.ini
[uwsgi]
chdir=/home/webapp/image_parroter/image_parroter
module=image_parroter.wsgi:application
master=True
processes=4
harakiri=20
socket=/home/webapp/image_parroter/image_parroter/image_parroter/webapp.sock
chmod-socket=660
vacuum=True
logto=/var/log/uwsgi/uwsgi.log
die-on-term=True
Before I forget I should go ahead and add the logging directory and give it appropriate permissions and ownership.
(venv) $ sudo mkdir /var/log/uwsgi
(venv) $ sudo chown webapp:www-data /var/log/uwsgi
Next I make a systemd service file to manage the uWSGI application server, which is located at /etc/systemd/system/uwsgi.service and contains the following:
# uwsgi.service
[Unit]
Description=uWSGI Python container server
After=network.target
[Service]
User=webapp
Group=www-data
WorkingDirectory=/home/webapp/image_parroter/image_parroter
Environment="/home/webapp/image_parroter/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin"
ExecStart=/home/webapp/image_parroter/venv/bin/uwsgi --ini image_parroter/uwsgi.ini
[Install]
WantedBy=multi-user.target
Now I can start the uWSGI service, check that its status is ok, and enable it so it starts automatically at boot.
(venv) $ sudo systemctl start uwsgi.service
(venv) $ sudo systemctl status uwsgi.service
(venv) $ sudo systemctl enable uwsgi.service
At this point the Django application and uWSGI service are set up, and I can move on to configuring the redis-server.
I personally prefer to manage services with systemd, so I edit the /etc/redis/redis.conf config file and set the supervised parameter equal to systemd. After that I restart redis-server, check its status, and enable it to start at boot.
(venv) $ sudo systemctl restart redis-server
(venv) $ sudo systemctl status redis-server
(venv) $ sudo systemctl enable redis-server
Next up is configuring Celery. I begin this process by creating a logging location for Celery and giving this location appropriate permissions and ownership, like so:
(venv) $ sudo mkdir /var/log/celery
(venv) $ sudo chown webapp:www-data /var/log/celery
Following that, I add a Celery configuration file named celery.conf in the same directory as the uwsgi.ini file described earlier, placing the following in it:
# celery.conf
CELERYD_NODES="worker1 worker2"
CELERY_BIN="/home/webapp/image_parroter/venv/bin/celery"
CELERY_APP="image_parroter"
CELERYD_MULTI="multi"
CELERYD_PID_FILE="/home/webapp/image_parroter/image_parroter/image_parroter/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="INFO"
To finish up configuring Celery, I add its own systemd service file at /etc/systemd/system/celery.service and place the following in it:
# celery.service
[Unit]
Description=Celery Service
After=network.target
[Service]
Type=forking
User=webapp
Group=webapp
EnvironmentFile=/home/webapp/image_parroter/image_parroter/image_parroter/celery.conf
WorkingDirectory=/home/webapp/image_parroter/image_parroter
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
-A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
--logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL}'
ExecStop=/bin/sh -c '${CELERY_BIN} multi stopwait ${CELERYD_NODES} \
--pidfile=${CELERYD_PID_FILE}'
ExecReload=/bin/sh -c '${CELERY_BIN} multi restart ${CELERYD_NODES} \
-A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
--logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL}'
[Install]
WantedBy=multi-user.target
The final thing to do is configure Nginx to work as a reverse proxy for the uWSGI/Django application, as well as serve up the content in the media directory. I do this by adding an Nginx config at /etc/nginx/sites-available/image_parroter, which contains the following:
server {
    listen 80;
    server_name _;

    location /favicon.ico { access_log off; log_not_found off; }

    location /media/ {
        root /home/webapp/image_parroter/image_parroter;
    }

    location / {
        include uwsgi_params;
        uwsgi_pass unix:/home/webapp/image_parroter/image_parroter/image_parroter/webapp.sock;
    }
}
Next up I remove the default Nginx config, which allows me to use server_name _; to catch all HTTP traffic on port 80, then I create a symbolic link from the config I just added in the sites-available directory to the adjacent sites-enabled directory.
$ sudo rm /etc/nginx/sites-enabled/default
$ sudo ln -s /etc/nginx/sites-available/image_parroter /etc/nginx/sites-enabled/image_parroter
Once that is done I can restart nginx, check its status, and enable it to start at boot.
$ sudo systemctl restart nginx
$ sudo systemctl status nginx
$ sudo systemctl enable nginx
At this point I can point my browser to the IP address of this Ubuntu server and test out the thumbnailer application.
Conclusion
This article described why, as well as how, to use Celery for the common purpose of kicking off an asynchronous task that runs to completion in the background. This leads to a significant improvement in user experience, reducing the impact of long-running code paths that would otherwise block the web application server from handling further requests.
I have done my best to provide a detailed explanation of the start-to-finish process: setting up a development environment, implementing Celery tasks, producing tasks in Django application code, and consuming results via Django and some simple JavaScript.
Thanks for reading and as always don't be shy about commenting or critiquing below.