Python + Celery + Flask: Dockerizing our solution

Hey everyone! This is part III of our series "Asynchronous tasks with Python + Flask + Celery + Docker". In this post, we are going to learn how to run our simple application in a Docker container.

1. Create the WSGI server configuration

We are not going to run Flask's built-in server directly, since it is not suitable for production. Instead, we are going to use uWSGI. We only need to add two files:

app/src/wsgi.ini

[uwsgi]
module = app.src.wsgi:app
master = true
processes = 5
http = $(WSGI_HOST):$(PORT)
die-on-term = true

(we are going to configure PORT and WSGI_HOST as environment variables)

app/src/wsgi.py

import os

from app.src.app import app

if __name__ == "__main__":
    port = int(os.environ.get('PORT', 5000))
    app.run(host=os.environ.get('WSGI_HOST', 'localhost'), port=port)
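The fallback logic above can be checked with nothing but the standard library. This is a small sketch (the helper name `resolve_bind_address` is ours, not part of the project) that mimics the defaults used in wsgi.py:

```python
import os

def resolve_bind_address(environ=os.environ):
    """Mimic the fallbacks in wsgi.py: PORT defaults to 5000,
    WSGI_HOST defaults to 'localhost'."""
    port = int(environ.get('PORT', 5000))
    host = environ.get('WSGI_HOST', 'localhost')
    return host, port

# With no variables set, we get the development defaults:
print(resolve_bind_address({}))  # ('localhost', 5000)
# With the values we will set in docker-compose.yml:
print(resolve_bind_address({'WSGI_HOST': '0.0.0.0', 'PORT': '5000'}))
```

Note that environment variables are always strings, which is why `PORT` is converted with `int()`.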

2. Create the requirements file

Let's add the flask, celery, uwsgi and redis dependencies:

requirements.txt

flask==1.1.2
celery==5.0.5
redis==3.5.3
uwsgi==2.0.19.1

3. Create the Dockerfile

This Dockerfile is quite simple. We are using the python:3.9-slim-buster image, which is a light and reliable option. We install gcc, which is needed to build uWSGI, and then install our requirements.

Dockerfile

FROM python:3.9-slim-buster
RUN apt-get update \
&& apt-get install -y gcc \
&& apt-get clean
WORKDIR /
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

EXPOSE 5000
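Since `COPY . .` copies the whole build context into the image, it is worth adding a .dockerignore file so local artifacts stay out of the image. The entries below are typical examples, not requirements of this project; adjust them to your own tree:

.dockerignore

.git
__pycache__/
*.pyc
.venv/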

4. Create a docker-compose

We need to configure three services: web, which contains our application; redis, which we use as a message broker; and worker, which runs our Celery worker:

docker-compose.yml

version: "3.4"
services:
  web:
    container_name: main-app
    build:
      context: "."
    command: uwsgi --ini app/src/wsgi.ini
    ports:
      - "5000:5000"
    environment:
      - REDIS_URL=redis://redis:6379
      - WSGI_HOST=0.0.0.0
      - PORT=5000

  redis:
    image: "redis:6.0.5-buster"
    volumes:
      - "redis:/data"

  worker:
    build:
      context: "."
    command: celery -A app.src.worker worker -l info
    depends_on:
      - "redis"
    environment:
      - REDIS_URL=redis://redis:6379

volumes:
  redis: {}
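One optional improvement, sketched below: the web service also talks to Redis, so it can declare the same dependency as the worker. docker-compose will then start the redis container first (note that this only waits for the container to start, not for Redis itself to be ready to accept connections):

  web:
    # ...same configuration as shown above...
    depends_on:
      - "redis"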

5. And that's all!

Now, we only need to run our docker-compose to see our application running:

docker-compose build && docker-compose up -d

Testing again:

 curl -X POST http://localhost:5000/send_email
>> {"message":"Email sent."}
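If curl is not available, the same request can be issued from Python with only the standard library. The URL assumes the compose stack above is running on localhost:5000, and we assume the endpoint needs no request payload:

```python
import urllib.request

# Build a POST request equivalent to the curl call above.
req = urllib.request.Request(
    "http://localhost:5000/send_email",
    data=b"",  # an empty body; we assume the endpoint takes no payload
    method="POST",
)
print(req.get_method(), req.full_url)

# Uncomment once the stack is up:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())  # {"message":"Email sent."}
```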

This was a very basic tutorial and I hope it was useful! There is much more to explore, both in functionality and configuration, so this was only a lightweight introduction.

The resulting code is here.