
Configuring Kibana and ElasticSearch for Log Analysis with Fluentd on Docker Swarm


Introduction

In my previous post, I talked about how to configure Fluentd for logging from multiple Docker containers. That post explained how to create a single log file for each microservice, irrespective of how many instances it has.
However, log files have limitations: it is not easy to extract analytics or spot trends in them.

Elasticsearch and Splunk have become very popular in recent years because they allow you to view events in real time, visualise trends and search through logs.

Elasticsearch and Kibana

Elasticsearch is an open-source search engine based on Apache Lucene. It is extremely fast and is commonly used for log analytics, full-text search and much more.
Along with Kibana, a visualisation tool, Elasticsearch can be used for real-time analytics. With Kibana you can create intuitive charts and reports, and apply filters, aggregations and trends to your data.

Changing the fluent.conf

Since this post is a continuation of the previous one, I will show you how to modify fluent.conf to send logs to Elasticsearch.

All we need to do is add another “store” block, like the one below.
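A sketch of such a store block, assuming the fluent-plugin-elasticsearch output; only the host, port and flush interval are specified in this post, so the remaining values are illustrative:

```
<store>
  @type elasticsearch
  host elasticsearch        # Docker container name of the Elasticsearch service
  port 9200
  logstash_format true      # write logstash-style, date-based indices
  logstash_dateformat %Y%m%d
  flush_interval 1s         # push buffered records to Elasticsearch every second
</store>
```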

In the above config, we are telling Fluentd that Elasticsearch is running on port 9200 on the host elasticsearch (which is the Docker container name). We have also defined the general date format, and flush_interval has been set to 1s, which tells Fluentd to send records to Elasticsearch every second.

This is how the complete configuration looks:
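Roughly reconstructed, the full file would look like the sketch below; the source and file-store sections are assumptions carried over from the previous post's setup (Fluentd receiving records from the Docker fluentd logging driver and writing one file per service), while the elasticsearch store is the new part:

```
# Accept records forwarded by the Docker fluentd logging driver
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

# Send every record both to a local log file and to Elasticsearch
<match **>
  @type copy
  <store>
    @type file
    path /fluentd/log/docker.log
  </store>
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_dateformat %Y%m%d
    flush_interval 1s
  </store>
</match>
```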

Create Dockerfile with our custom configuration

So the next step is to create a custom Fluentd image containing the above configuration file.
Save the file above as fluent.conf in a folder named conf, then create a file called Dockerfile at the same level as the conf folder.
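The Dockerfile would look something like this; the base-image tag is an assumption, and the plugin is installed with gem as the official fluentd image documentation describes:

```dockerfile
FROM fluent/fluentd:v0.12-debian
USER root

# Install the Elasticsearch output plugin
RUN gem install fluent-plugin-elasticsearch

# Replace the default configuration in the base image with our version
COPY conf/fluent.conf /fluentd/etc/fluent.conf

USER fluent
```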

As you can see, in this Dockerfile we replace the fluent.conf in the base image with our version and also install the Elasticsearch plugin.

Now let us create the Docker image:

```shell
docker build -t ##YourREPOname##/myfluentd:latest .
```

and then push it to the Docker Hub repository:

```shell
docker push ##YourREPOname##/myfluentd
```


Elasticsearch and Kibana Docker Images

This is how the YAML configuration for Elasticsearch looks:
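A sketch of such a service definition; the image tag, heap size and log-rotation limits are assumptions, since the post only specifies the container name, port 9200, the memory settings, the json-file driver and the esdata directory:

```yaml
  elasticsearch:
    image: elasticsearch:5.4        # tag is an assumption
    container_name: elasticsearch   # must match the host in fluent.conf
    ports:
      - "9200:9200"
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"   # cap the JVM heap
      - "bootstrap.memory_lock=true"       # stop ES memory being swapped out
    ulimits:
      memlock:
        soft: -1
        hard: -1
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    volumes:
      - ./esdata:/usr/share/elasticsearch/data   # persist data across restarts
```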

Things to note in this Elasticsearch configuration:

  • The container name is set to elasticsearch, matching the host name used in fluent.conf.
  • Port 9200 has been exposed so Elasticsearch can also be accessed locally.
  • The environment variables are important here: they cap the heap space the container can use and prevent any Elasticsearch memory from being swapped out.
  • The json-file logging driver is used because I want to restrict the log file size.
  • Elasticsearch data is stored in a directory called “esdata” so that it persists between container restarts.

This is how the YAML configuration for Kibana looks:
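A minimal sketch, assuming the stock Kibana image pointed at the elasticsearch container (the image tag and environment variable are assumptions; the post only specifies port 5601):

```yaml
  kibana:
    image: kibana:5.4   # tag is an assumption
    ports:
      - "5601:5601"     # expose the Kibana dashboard locally
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
```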

Things to note in this Kibana configuration:

  • Port 5601 has been exposed locally to access the Kibana dashboard.

Complete Config file

So now we need a complete docker-compose file containing the whoami service with multiple instances and the Docker visualizer service, along with the elasticsearch, kibana and fluentd services.
This is how the complete file looks:
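A compact sketch of the full stack file; the whoami and visualizer images, replica count, host paths and version tags are assumptions based on the services described above:

```yaml
version: "3"

services:

  whoami:
    image: emilevauge/whoami    # the demo service from the previous post
    ports:
      - "80:80"
    deploy:
      replicas: 3
    logging:
      driver: fluentd           # ship container logs to the fluentd service
      options:
        fluentd-address: "localhost:24224"
        tag: "whoami"

  visualizer:
    image: dockersamples/visualizer
    ports:
      - "8080:8080"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    deploy:
      placement:
        constraints: [node.role == manager]

  fluentd:
    image: ##YourREPOname##/myfluentd:latest
    ports:
      - "24224:24224"
    volumes:
      - ./logs:/fluentd/log

  elasticsearch:
    image: elasticsearch:5.4
    ports:
      - "9200:9200"
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - "bootstrap.memory_lock=true"
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    volumes:
      - ./esdata:/usr/share/elasticsearch/data

  kibana:
    image: kibana:5.4
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
```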

Make sure you save the file as docker-swarm.yml; you can then run the above services on Docker Swarm using the command below.
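Assuming the file is saved as docker-swarm.yml, deploying the stack would look something like this (the stack name logging is arbitrary):

```shell
docker stack deploy -c docker-swarm.yml logging
```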

Accessing Kibana and Elasticsearch

Once you have made sure that all services are up and running,
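On a swarm manager node, a typical way to verify this is:

```shell
docker service ls
```

Each service should report its expected number of replicas before you continue.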

you can access Kibana at
http://##domain#Or#IP##:5601/
Once you see a screen similar to the one below, click the Create button and then Discover in the top left; you should see some bars indicating logs.

Kibana Dashboard on Docker Swarm
Now if you click on http://##domain#Or#IP##/hello, the whoami container will generate some logs, which should appear in Kibana provided the right time range has been chosen and auto-refresh has been enabled. See the screenshot below.

Kibana offers many ways to create visualisations such as charts and graphs for log analysis, which can be explored further.

Logs in Elastic search + Kibana dashboard

Drop in suggestions or comments for any feedback.

Watch the video below for a demonstration.
