PYTHON FLASK CELERY + DOCKER
In this article, we will cover how you can use Docker Compose to run Celery with a Python Flask application on a target machine.
Requirements on our end are pretty simple and straightforward.
- Control over configuration
- Set up the Flask app
- Set up the RabbitMQ server
- Ability to run multiple Celery workers
Furthermore, we will explore how we can manage our application on Docker:
- Inspect status of running containers
- Start or stop the services
- Inspect logs of individual services
Let's Code
We start by creating our base directory, flask-celery.
Within that directory we will create the following files and directories.
- requirements.txt
- workerA.py
- workerB.py
- app.py
- docker-compose.yml
- Dockerfile
Let’s define our first file requirements.txt
These are the Python modules we need to install for the Python Flask Celery setup.
Flask
amqp
celery
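If you want reproducible builds, you may also want to pin versions in requirements.txt. The versions below are purely illustrative; use whichever versions you have tested against:
Flask==2.0.3
amqp==5.1.1
celery==5.2.7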
Next, we create our Celery workers. First, add the following code in workerA.py:
from celery import Celery

# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'

# Initialize Celery
celery = Celery('workerA', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)

# Route this module's tasks to the "workerA" queue consumed by worker_1 (see docker-compose.yml)
celery.conf.task_routes = {'workerA.*': {'queue': 'workerA'}}

@celery.task()
def add_nums(a, b):
    return a + b
Now, let's add workerB.py:
from celery import Celery

# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'

# Initialize Celery
celery = Celery('workerB', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)

# Route this module's tasks to the "workerB" queue consumed by worker_2 (see docker-compose.yml)
celery.conf.task_routes = {'workerB.*': {'queue': 'workerB'}}

@celery.task()
def sub_nums(a, b):
    return a - b
Now let's add the API endpoints to our Flask application in app.py:
from workerA import add_nums
from workerB import sub_nums

from flask import (
    Flask,
    request,
    jsonify,
)

app = Flask(__name__)

@app.route("/add")
def add():
    first_num = request.args.get('first_num', type=int)
    second_num = request.args.get('second_num', type=int)
    # Dispatch the task to worker_1 and wait for the result
    result = add_nums.delay(first_num, second_num)
    return jsonify({'result': result.get(timeout=10)}), 200

@app.route("/subtract")
def subtract():
    first_num = request.args.get('first_num', type=int)
    second_num = request.args.get('second_num', type=int)
    # Dispatch the task to worker_2 and wait for the result
    result = sub_nums.delay(first_num, second_num)
    return jsonify({'result': result.get(timeout=10)}), 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
Docker requires our app to be packaged in a container image. Let's do that by adding the Dockerfile:
FROM python:3

# Install the Python dependencies first so this layer is cached
COPY requirements.txt /app/requirements.txt
WORKDIR /app/
RUN pip install -r requirements.txt

# Copy the application code into the image
COPY . /app/

ENTRYPOINT ["python"]
CMD ["app.py"]
Once we have a Dockerfile we can build the image using the docker build command, but we need RabbitMQ to be available for our Flask application to work.
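For reference, building the image by hand would look something like this (the flask-celery tag is just an illustrative name):
docker build -t flask-celery .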
Let's solve that by setting up the service definitions in the docker-compose.yml file.
version: "3"
services:
web:
build:
context: .
dockerfile: Dockerfile
restart: always
ports:
- "5000:5000"
depends_on:
- rabbit
volumes:
- .:/app
rabbit:
hostname: rabbit
image: rabbitmq:management
environment:
- RABBITMQ_DEFAULT_USER=rabbitmq
- RABBITMQ_DEFAULT_PASS=rabbitmq
ports:
- "5673:5672"
- "15672:15672"
worker_1:
build:
context: .
hostname: worker_1
entrypoint: celery
command: -A workerA worker --loglevel=info -Q workerA
volumes:
- .:/app
links:
- rabbit
depends_on:
- rabbit
worker_2:
build:
context: .
hostname: worker_2
entrypoint: celery
command: -A workerB worker --loglevel=info -Q workerB
volumes:
- .:/app
links:
- rabbit
depends_on:
- rabbit
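As a quick sanity check before building, docker-compose can validate the file and print the resolved configuration:
docker-compose config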
Managing Python Flask Services
In this section, we will cover how we can control our Docker services.
Creating Build
We can create a build using the following command:
docker-compose build
Starting Services
We can start the services using the following command:
docker-compose up -d
The -d flag instructs Docker Compose to run the services in detached mode, as daemons.
After starting the services, we can access our Python Flask app server at:
http://localhost:5000
And the RabbitMQ management UI at:
http://localhost:15672
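To quickly verify that the endpoints are wired up to the workers, you can hit them with curl. The numbers here are arbitrary sample values; each call should return a small JSON payload with the computed result:
curl "http://localhost:5000/add?first_num=5&second_num=10"
curl "http://localhost:5000/subtract?first_num=10&second_num=3"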
Inspecting Services
We can inspect the running services using the following command:
docker-compose ps
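You can also pass a service name to limit the output to a single service, for example:
docker-compose ps web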
Inspecting Logs
There are many approaches you can take to inspect the logs of running services.
We will start with the simplest and most primitive approach:
docker-compose logs
The above command will dump the logs of all running services, although I have seldom found this form useful.
We will now cover how we can inspect an individual service's logs:
docker-compose logs -f --tail=10 [service_name]
In the above command, the -f flag follows the logs and --tail fetches the last 10 lines; you can always increase this number to your liking.
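For example, to follow the last ten lines of the first Celery worker's logs:
docker-compose logs -f --tail=10 worker_1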
Interacting with the Python Flask container
We will use the following command to attach a shell to the Python Flask container.
docker-compose exec web /bin/bash
Once the shell is attached you can run any command within the Python Flask container's environment; it is pretty much like running a remote shell over SSH.
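For instance, assuming the whole stack is up, you can start a Python REPL inside the container and dispatch a task by hand to confirm the broker and workers are wired up correctly (a minimal sketch; the numbers are arbitrary):
python
>>> from workerA import add_nums
>>> add_nums.delay(2, 3).get(timeout=10)
5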
Stopping containers
Finally, we will cover how we can stop all the running services:
docker-compose down
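If you only want to stop the containers without removing them (so they can be brought back quickly with docker-compose start), you can use:
docker-compose stop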
If you face any problems following the tutorial, you can refer to the Carbonteq GitLab repository. Simply clone the repository and use the "Managing Python Flask Services" section above for operations.