Why ELK stack

Problem statement:

You have different applications feeding into each other. Debugging an issue requires logging into each individual box to look at the logs. With a small number of apps/boxes this isn't an issue, but it quickly becomes tedious as the number of apps/boxes increases!


It would be awesome to have all of your logs aggregated in one place, so you can see the process flow and run queries against the logs from all applications at once.

Enter the ELK stack. It will also improve your heart health, because there's no need to log into 100 different boxes to follow the logs!

What is ELK stack?

ELK stands for Elasticsearch, Logstash and Kibana.

Brief definitions:

Logstash: a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use (such as searching). Speaking of searching, Logstash comes with a web interface for searching and drilling into all of your logs. It is free and open source.
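To give a taste of what "collect, parse, store" looks like in practice: a Logstash pipeline is described by a small configuration file with input, filter, and output sections. The sketch below is illustrative only; the log path is hypothetical, and exact plugin option names vary between Logstash versions:

```
# Read new lines from an application log file (path is an example)
input {
  file {
    path => "/var/log/myapp/*.log"
  }
}

# Parse each raw line into structured fields using a grok pattern
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

# Store the parsed events in Elasticsearch for later searching
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```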

Elasticsearch: Elasticsearch is a search server based on Lucene. It provides a distributed, multitenant-capable full-text search engine with a RESTful web interface and schema-free JSON documents.
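Because the interface is RESTful and documents are schema-free JSON, you can query Elasticsearch with nothing more than an HTTP client. For instance, assuming logs are indexed under a hypothetical index named `logs`, a search for events mentioning "error" could POST a body like this to http://localhost:9200/logs/_search (the index and field names here are illustrative):

```json
{
  "query": {
    "match": { "message": "error" }
  }
}
```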

Kibana: a nifty tool for visualizing logs and other timestamped data.

How do we use it?

Here's one possible (and recommended) architecture diagram that helps visualize the flow of log information:

As shown in the diagram: App 1...App n are generating logs. A Logstash shipper reads the logs from a log file on each box and sends them to a Redis instance, which acts as a buffer. The Logstash indexer in the ELK stack reads events off of Redis and feeds them into Elasticsearch. Kibana is a configurable web dashboard that queries Elasticsearch for log information and presents it to the user. Logstash can, of course, be configured to read logs and register events from a myriad of sources, and to send log events to many different destinations. We will cover Logstash in more depth in a different post.
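The shipper/indexer split above can be sketched as two tiny Logstash configurations, one on each app box and one in the ELK stack. The hostnames and the Redis list key are placeholders, and option names may differ across Logstash versions:

```
# Shipper (runs on each app box): tail the app's log and push events to Redis
input {
  file { path => "/var/log/app1/app1.log" }
}
output {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
```

```
# Indexer (runs in the ELK stack): pull events off Redis, feed Elasticsearch
input {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

With Redis in the middle, the app boxes stay responsive even if the indexer falls behind: events simply queue up in the Redis list until the indexer catches up.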

You can follow this post to learn how to install ELK stack on your server.