Configuring ELK stack to analyse Apache Tomcat logs
In this post, we will set up ElasticSearch, Logstash, and Kibana to analyse Apache Tomcat server logs. Before setting up the ELK stack, let's take a brief look at each component.
ElasticSearch
A schema-less datastore with powerful search capabilities that is easy to scale horizontally. It indexes every single field and can aggregate and group data.
Logstash
Written in Ruby, Logstash lets us pipeline data to and from anywhere: an ETL pipeline that fetches, transforms, and stores events into ElasticSearch. The packaged version runs on JRuby and takes advantage of the JVM's threading capabilities, spawning dozens of threads to parallelize data processing.
Kibana
A web-based data analysis and dashboarding tool for ElasticSearch. It leverages ElasticSearch's search capabilities to visualise data in seconds, and supports the Lucene query string syntax as well as ElasticSearch's filter capabilities.
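For example, once Tomcat events are flowing in, the Kibana search bar accepts Lucene query strings such as the ones below (the field and tag names here assume the Logstash configuration shown later in this post):

```
type:my_log AND tags:MY_LOG
message:*Exception* AND @timestamp:[now-1h TO now]
```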
Next, we will install each component of the stack separately, following the steps below:
Step 1: Download and extract the ElasticSearch .tar.gz file into a directory; for me it's elasticsearch-2.1.0.tar.gz extracted into a directory named elasticsearch under /Users/ArpitAggarwal/
Step 2: Start the ElasticSearch server by moving to the bin folder and executing ./elasticsearch, as follows:
$ cd /Users/ArpitAggarwal/elasticsearch/elasticsearch-2.1.0/bin
$ ./elasticsearch
The above command starts ElasticSearch, accessible at http://localhost:9201/, with the default indices listed at http://localhost:9201/_cat/indices?v
To delete all indices, issue a curl request from the command line as follows:
curl -XDELETE 'http://localhost:9201/*/'
Step 3: Next, we will install and configure Kibana to point to our ElasticSearch instance. To do so, download and extract the .tar.gz file into a directory; for me it's kibana-4.3.0-darwin-x64.tar.gz extracted into a directory named kibana under /Users/ArpitAggarwal/
Step 4: Modify kibana.yml under /Users/ArpitAggarwal/kibana/kibana-4.3.0-darwin-x64/config/ to point to our local ElasticSearch instance, replacing the existing elasticsearch.url value with http://localhost:9201
Step 5: Start Kibana by moving to the bin folder and executing ./kibana, as follows:
$ cd /Users/ArpitAggarwal/kibana/kibana-4.3.0-darwin-x64/bin
$ ./kibana
The above command starts Kibana, accessible at http://localhost:5601/
Step 6: Next, we will install and configure Nginx to point to our local Kibana instance. To do so, download Nginx into a directory (for me it's nginx under /Users/ArpitAggarwal/), unzip the nginx-*.tar.gz, and install it using the commands:
$ cd nginx-1.9.6
$ ./configure
$ make
$ make install
By default, Nginx will be installed in /usr/local/nginx, but Nginx lets you specify the directory where it is to be installed; you can do so by passing the additional configure option --prefix, as follows:
./configure --prefix=/Users/ArpitAggarwal/nginx
Next, open the nginx configuration file at /Users/ArpitAggarwal/nginx/conf/nginx.conf and replace location block under server with below content:
location / {
    # point to the local Kibana instance
    proxy_pass http://localhost:5601;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}
Step 7: Start Nginx, as follows:
cd /Users/ArpitAggarwal/nginx/sbin
./nginx
The above command starts the Nginx server, accessible at http://localhost
Step 8: Next, we will install Logstash by executing the commands below:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" < /dev/null 2> /dev/null
brew install logstash
The above commands install Logstash under /usr/local/opt/
Step 9: Now we will configure Logstash to push data from the Tomcat server's logs directory to ElasticSearch. To do so, create a directory to hold our Logstash configuration file; for me it's logstash, created under /Users/ArpitAggarwal/, as follows:
cd /Users/ArpitAggarwal/
mkdir logstash patterns
cd logstash
touch logstash.conf
cd ../patterns
touch grok-patterns.txt
Copy the below content to logstash.conf:
input {
  file {
    path => "/Users/ArpitAggarwal/tomcat/logs/*.log*"
    start_position => beginning
    type => "my_log"
  }
}

filter {
  multiline {
    patterns_dir => "/Users/ArpitAggarwal/logstash/patterns"
    pattern => "\[%{TOMCAT_DATESTAMP}"
    what => "previous"
  }

  if [type] == "my_log" and "com.test.controller.log.LogController" in [message] {
    mutate {
      add_tag => [ "MY_LOG" ]
    }
    if "_grokparsefailure" in [tags] {
      drop { }
    }
    date {
      match => [ "timestamp", "UNIX_MS" ]
      target => "@timestamp"
    }
  } else {
    drop { }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  if [type] == "my_log" {
    elasticsearch {
      manage_template => false
      host => localhost
      protocol => http
      port => "9201"
    }
  }
}
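To see what the multiline filter's "\[%{TOMCAT_DATESTAMP}" anchor is doing, here is a small sketch using a made-up log file and a plain regex that approximates TOMCAT_DATESTAMP: lines that start with a bracketed timestamp open a new event, while continuation lines (such as stack-trace frames) get folded into the previous event.

```shell
# Two hypothetical Tomcat log lines: the first opens a new event,
# the second (a stack-trace frame) would be merged into the previous event.
printf '[2015-12-01 10:15:30,123 +0530] INFO com.test.controller.log.LogController - request in\n\tat com.test.Foo.bar(Foo.java:42)\n' > /tmp/tomcat-sample.log

# Count lines matching an approximation of the "\[%{TOMCAT_DATESTAMP}" anchor;
# only the first line matches, so the second is treated as a continuation.
grep -c -E '^\[20[0-9]{2}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}(:[0-9]{2}(,[0-9]{3})?)? [+-][0-9]{4}\]' /tmp/tomcat-sample.log
# prints 1
```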
Next, copy the contents of https://github.com/elastic/logstash/blob/v1.2.2/patterns/grok-patterns into patterns/grok-patterns.txt
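One caveat: TOMCAT_DATESTAMP, referenced by the multiline filter above, is not defined in that base grok-patterns file; it ships in Logstash's separate "java" patterns set. If your install does not pick it up, add a line like the following to patterns/grok-patterns.txt (shown here as it appears in Logstash's java patterns; verify against your version):

```
TOMCAT_DATESTAMP 20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}
```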
Step 10: Validate the Logstash configuration file using the command below:
$ cd /usr/local/opt/
$ logstash -f /Users/ArpitAggarwal/logstash/logstash.conf --configtest --verbose --debug
Step 11: Push data to ElasticSearch using Logstash as follows:
$ cd /usr/local/opt/
$ logstash -f /Users/ArpitAggarwal/logstash/logstash.conf
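Note that this is a continuous process, not a one-off import: Logstash's file input tails the files matched by path (much like tail -f), recording how far it has read in a sincedb checkpoint file, so new Tomcat log lines stream into ElasticSearch, and thus Kibana, as they are written. A minimal sketch of the relevant input options (the sincedb_path location below is a hypothetical choice, not something the original setup specifies):

```
input {
  file {
    path => "/Users/ArpitAggarwal/tomcat/logs/*.log*"
    # "beginning" reads existing content on the first run; afterwards the
    # sincedb checkpoint makes Logstash pick up only newly written lines
    start_position => beginning
    # hypothetical explicit checkpoint location
    sincedb_path => "/Users/ArpitAggarwal/logstash/sincedb"
    type => "my_log"
  }
}
```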
Reference: Configuring ELK stack to analyse Apache Tomcat logs from our SCG partner Arpit Aggarwal at the Arpit Aggarwal blog.