1. Overview

In this quick tutorial, we’ll discuss, step by step, how to send out application logs to the Elastic Stack (ELK).

In an earlier article, we focused on setting up the Elastic Stack and sending JMX data into it.

2. Configure Logback

Let's start by configuring Logback to write application logs to a file using RollingFileAppender:

<appender name="STASH" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>logback/redditApp.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logback/redditApp.%d{yyyy-MM-dd}.log</fileNamePattern>
        <maxHistory>7</maxHistory>
    </rollingPolicy>  
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
<root level="DEBUG">
    <appender-ref ref="STASH" />        
</root>

Note that:

  • We keep each day's logs in a separate file by using RollingFileAppender with a TimeBasedRollingPolicy
  • We keep old logs for only a week (7 days) by setting maxHistory to 7

Also, notice that we're using the LogstashEncoder to encode the logs in JSON format, which is easier for Logstash to consume.

To make use of this encoder, we need to add the following dependency to our pom.xml:

<dependency> 
    <groupId>net.logstash.logback</groupId> 
    <artifactId>logstash-logback-encoder</artifactId> 
    <version>4.11</version> 
</dependency>

Finally, let's make sure the application has permission to write to the logging directory:

sudo chmod a+rwx /var/lib/tomcat8/logback
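
With this configuration in place, any standard SLF4J log statement in the application ends up as a JSON line in the rolling file. Here's a minimal sketch of what that looks like on the application side (the class and messages are purely illustrative):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RedditService {

    private static final Logger logger = LoggerFactory.getLogger(RedditService.class);

    public void fetchPosts() {
        // each statement below is written to the rolling file as a single JSON line,
        // with fields such as @timestamp, message, logger_name, thread_name and level
        logger.debug("Fetching latest posts");
        logger.info("Fetched {} posts", 42);
        logger.error("Failed to reach the Reddit API");
    }
}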

3. Configure Logstash

Now we need to configure Logstash to read the log files created by our application and send the data to Elasticsearch.

Here is our configuration file logback.conf:

input {
    file {
        path => "/var/lib/tomcat8/logback/*.log"
        codec => "json"
        type => "logback"
    }
}

output {
    if [type]=="logback" {
         elasticsearch {
             hosts => [ "localhost:9200" ]
             index => "logback-%{+YYYY.MM.dd}"
        }
    }
}

Note that:

  • The file input is used since Logstash will read from log files this time
  • path points to our logging directory; all files with the .log extension will be processed
  • index is set to a new index, “logback-%{+YYYY.MM.dd}”, instead of the default “logstash-%{+YYYY.MM.dd}”

To run Logstash with this new configuration, we'll use:

bin/logstash -f logback.conf
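
Once Logstash is running and the application has produced some logs, we can quickly verify that documents are reaching Elasticsearch. Here's a minimal sketch using Java 11's HttpClient against the standard _count API, assuming Elasticsearch is on localhost:9200 as configured above (the class name is just illustrative):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogCountCheck {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // count the documents Logstash has indexed under the logback-* indices
        HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("http://localhost:9200/logback-*/_count"))
          .GET()
          .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // expect a body along the lines of {"count":123,...}
        System.out.println(response.body());
    }
}

If the count stays at zero, it's worth double-checking the log file path and the directory permissions we set earlier.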

4. Visualize Logs Using Kibana

We can now see our Logback data in the ‘logback-*’ index.

We'll create a new saved search, ‘Logback logs’, to separate out the Logback data, using the following query:

type:logback

Finally, we can create a simple visualization of our Logback data:

  • Navigate to the ‘Visualize’ tab
  • Choose ‘Vertical Bar Chart’
  • Choose ‘From a Saved Search’
  • Choose the ‘Logback logs’ search we just created

For the Y-axis, make sure to choose Aggregation: Count

For the X-axis, choose:

  • Aggregation: Terms
  • Field: level

After running the visualization, you should see multiple bars representing the count of logs per level (DEBUG, INFO, ERROR, …)

5. Conclusion

In this article, we learned the basics of setting up Logstash to push the log data our application generates into Elasticsearch, and how to visualize that data with the help of Kibana.
