1. Overview

In this quick tutorial, we’ll discuss, step by step, how to send out application logs to the Elastic Stack (ELK).

In an earlier article, we focused on setting up the Elastic Stack and sending JMX data into it.

2. Configure Logback

Let's start by configuring Logback to write application logs to a file using a RollingFileAppender:

<appender name="STASH" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>logback/redditApp.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logback/redditApp.%d{yyyy-MM-dd}.log</fileNamePattern>
        <maxHistory>7</maxHistory>
    </rollingPolicy>  
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
<root level="DEBUG">
    <appender-ref ref="STASH" />        
</root>

Note that:

  • We keep each day's logs in a separate file by using RollingFileAppender with a TimeBasedRollingPolicy
  • We keep old logs for only a week (7 days) by setting maxHistory to 7

Also, notice how we’re using the LogstashEncoder to do the encoding into a JSON format – which is easier to use with Logstash.
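To make the JSON encoding concrete, here is a rough plain-Java sketch of the shape of a line the LogstashEncoder writes to the log file. The field set is illustrative only; the real encoder emits additional fields such as thread_name and level_value:

```java
import java.time.Instant;

public class EncodedLineSketch {

    // Builds a line shaped like LogstashEncoder's default JSON output.
    // Illustrative only: the real encoder adds more fields and handles escaping.
    static String encode(String level, String logger, String message) {
        return String.format(
            "{\"@timestamp\":\"%s\",\"message\":\"%s\",\"logger_name\":\"%s\",\"level\":\"%s\"}",
            Instant.now(), message, logger, level);
    }

    public static void main(String[] args) {
        System.out.println(encode("INFO", "org.baeldung.web.RedditController", "Session started"));
    }
}
```

One JSON document per line is exactly what Logstash's json codec expects, which is why no grok parsing is needed later.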

To make use of this encoder, we need to add the following dependency to our pom.xml:

<dependency> 
    <groupId>net.logstash.logback</groupId> 
    <artifactId>logstash-logback-encoder</artifactId> 
    <version>4.11</version> 
</dependency>

Finally, let's make sure the app has permission to access the logging directory:

sudo chmod a+rwx /var/lib/tomcat8/logback

3. Configure Logstash

Now, we need to configure Logstash to read data from the log files created by our app and send it to Elasticsearch.

Here is our configuration file logback.conf:

input {
    file {
        path => "/var/lib/tomcat8/logback/*.log"
        codec => "json"
        type => "logback"
    }
}

output {
    if [type]=="logback" {
         elasticsearch {
             hosts => [ "localhost:9200" ]
             index => "logback-%{+YYYY.MM.dd}"
        }
    }
}

Note that:

  • The file input is used because Logstash will now read logs from log files
  • path points to our logging directory, and all files with a .log extension will be processed
  • index is set to a new index "logback-%{+YYYY.MM.dd}" instead of the default "logstash-%{+YYYY.MM.dd}"
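The %{+YYYY.MM.dd} sprintf reference in the index name tells Logstash to write to one index per calendar day. The resulting names can be sketched in plain Java (note that Java's DateTimeFormatter uses yyyy where Logstash's Joda-style pattern uses YYYY):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexNameSketch {

    // Mirrors the "logback-%{+YYYY.MM.dd}" naming: one index per day.
    static String indexFor(LocalDate day) {
        return "logback-" + day.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor(LocalDate.of(2017, 4, 18))); // logback-2017.04.18
    }
}
```

Daily indices make it cheap to expire old log data: deleting a whole index is far faster than deleting individual documents.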

To run Logstash with this new configuration, we'll use:

bin/logstash -f logback.conf

4. Visualize Logs using Kibana

We can now see our Logback data in the 'logback-*' index.

We'll create a new saved search 'Logback logs' to separate the Logback data, using the following query:

type:logback

Finally, we can create a simple visualization of our Logback data:

  • Navigate to the 'Visualize' tab
  • Choose 'Vertical Bar Chart'
  • Choose 'From Saved Search'
  • Choose the 'Logback logs' search we just created

For the Y-axis, make sure to choose Aggregation: Count

For the X-axis, choose:

  • Aggregation: Terms
  • Field: level

After running the visualization, you should see multiple bars representing the count of logs per level (DEBUG, INFO, ERROR, …).

5. Conclusion

In this article, we learned the basics of setting up Logstash to push the log data our application generates into Elasticsearch, and how to visualize that data with the help of Kibana.
