PYTHON FLASK CELERY + ELK

In this article, we will discuss how to handle logging in a Python Celery environment with the ELK stack.

The requirements on our side are simple.

  • Set up the Python Flask app and Dockerize it.
  • Set up Celery with the Flask app.
  • Dockerize the Celery workers.
  • Dockerize RabbitMQ.
  • Dockerize Elasticsearch.
  • Integrate celstash.

Furthermore, we will discuss how to manage our application on Docker:

  • Inspect the status of running containers
  • Start or stop the services
  • Inspect the logs of the different services

Setting up the Python Flask app, wiring Celery into it, Dockerizing the Flask app together with Celery, running the Celery workers in separate containers, and Dockerizing RabbitMQ are all covered in our PYTHON FLASK CELERY DOCKER article. We will set all of that up first. In this article we will then discuss how to Dockerize Elasticsearch and integrate celstash into our app.

Let’s Code

We start from our previously created base directory flask-celery. Within that directory we will change a few files and also create a subdirectory.

The files to change are as follows.

  • requirements.txt
  • docker-compose.yml
  • workerA.py
  • workerB.py

Directory to create

  • logstash

In the logstash directory, create a logstash.conf file; the resulting layout is shown below.
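
Putting this together, the flask-celery directory should now look roughly like this (app.py and the Dockerfile come from the previous article, so their exact names here are assumptions):

flask-celery/
├── app.py              # Flask app from the previous article
├── Dockerfile          # from the previous article
├── requirements.txt
├── docker-compose.yml
├── workerA.py
├── workerB.py
└── logstash/
    └── logstash.conf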

Let’s edit our first file, requirements.txt.

Flask
amqp
celery
elasticsearch
elasticsearch_dsl
celstash

Now, add the following services to the docker-compose.yml file, alongside the services from the previous article.

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.4.0
    container_name: elasticsearch
    ports:
      - "9200:9200"
      - "9300:9300"
  logstash:
    image: docker.elastic.co/logstash/logstash:6.4.0
    container_name: logstash
    volumes:
      - ./logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on: ['elasticsearch']

Next, add the following code to logstash/logstash.conf. It tells Logstash to listen for JSON messages over UDP on port 9999 and forward them to Elasticsearch.

input {
    udp {
        codec => json
        port => 9999
    }
}
output {
    elasticsearch {
        hosts => ['elasticsearch']
    }
}
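
Before wiring the workers in, you can sanity-check this pipeline with a small hand-rolled Python script (not part of the app) that sends one JSON event to the UDP input. The host and port are the ones from logstash.conf; localhost only works if you publish 9999/udp on the logstash service, otherwise run the script from a container on the same Docker network:

import json
import socket

# One test event, shaped like the JSON documents Logstash will index
message = {'name': 'flask-celery', 'message': 'hello from the udp smoke test'}

# Fire-and-forget UDP send to the Logstash input defined in logstash.conf
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(json.dumps(message).encode('utf-8'), ('localhost', 9999))
sock.close()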

Now we edit our worker files workerA.py and workerB.py, and add the following code to both of them.

from celery import Celery
import celstash
import logging


# Celstash Initialization
celstash.configure(logstash_host='logstash', logstash_port=9999)
logger = celstash.new_logger('flask-celery')
logger.setLevel(logging.INFO)

# Celery configuration
.....
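
For reference, here is a minimal sketch of what a complete worker file could look like after this change. The broker URL and the example task are assumptions standing in for the Celery configuration from the previous article, not code taken from it:

from celery import Celery
import celstash
import logging

# Celstash Initialization
celstash.configure(logstash_host='logstash', logstash_port=9999)
logger = celstash.new_logger('flask-celery')
logger.setLevel(logging.INFO)

# Celery configuration (broker URL is an assumption; use the rabbitmq
# service name from your docker-compose.yml)
celery = Celery('workerA', broker='amqp://rabbit:5672/')

@celery.task
def add_nums(a, b):
    result = a + b
    # This record travels over UDP to Logstash and on to Elasticsearch
    logger.info('add_nums(%s, %s) = %s', a, b, result)
    return result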

Managing the Python Flask Services

After editing and writing all this code, stop the running containers.

docker-compose down

Then rebuild the images and start all the containers again.

docker-compose build
docker-compose up -d

To check the newly added containers:

docker-compose ps

And to view the logs:

docker-compose logs

Elasticsearch will be available on localhost port 9200, and Logstash will listen for the workers' log messages on UDP port 9999 inside the Docker network. (If you also want to send test messages from the host, publish 9999/udp on the logstash service in docker-compose.yml.)
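
To confirm that the workers' log messages are actually landing in Elasticsearch, a quick check with the elasticsearch client from requirements.txt will do. The logstash-* pattern matches Logstash's default daily indices; the query itself is just an illustration:

from elasticsearch import Elasticsearch

es = Elasticsearch(['http://localhost:9200'])

# Logstash writes to daily indices named logstash-YYYY.MM.DD by default
res = es.search(index='logstash-*', body={
    'query': {'match_all': {}},
    'size': 5,
    'sort': [{'@timestamp': {'order': 'desc'}}],
})

for hit in res['hits']['hits']:
    print(hit['_source'])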