1. Overview

In this quick tutorial, we'll discuss, step by step, how to send application logs to the Elastic Stack (ELK).

In an earlier article, we focused on setting up the Elastic Stack and sending JMX data into it.

2. Configure Logback

Let's start by configuring Logback to write application logs to a file using a RollingFileAppender:

<appender name="STASH" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>logback/redditApp.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logback/redditApp.%d{yyyy-MM-dd}.log</fileNamePattern>
        <maxHistory>7</maxHistory>
    </rollingPolicy>  
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
<root level="DEBUG">
    <appender-ref ref="STASH" />        
</root>

Note that:

  • We keep each day's logs in a separate file by using RollingFileAppender with TimeBasedRollingPolicy (more about this appender here)
  • We'll keep old logs for only a week (7 days) by setting maxHistory to 7

Also, notice how we're using the LogstashEncoder to encode the logs in JSON format, which is easier to consume with Logstash.

To make use of this encoder, we need to add the following dependency to our pom.xml:

<dependency> 
    <groupId>net.logstash.logback</groupId> 
    <artifactId>logstash-logback-encoder</artifactId> 
    <version>4.11</version> 
</dependency>
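
With the encoder on the classpath, any standard SLF4J logging call ends up as a single JSON line in the file. Here's a minimal sketch of such a call; the class name and messages are just placeholders:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RedditService {

    private static final Logger logger = LoggerFactory.getLogger(RedditService.class);

    public void fetchPosts() {
        // with the LogstashEncoder, each call below becomes one JSON line in
        // logback/redditApp.log, e.g. {"@timestamp":"...","message":"Fetching posts","level":"INFO",...}
        logger.info("Fetching posts");
        logger.debug("Fetched {} posts", 42);
    }
}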

Finally, let's make sure the app has permission to access the logging directory:

sudo chmod a+rwx /var/lib/tomcat8/logback

3. Configure Logstash

Now, we need to configure Logstash to read the data from the log files created by our app and send it to Elasticsearch.

Here is our configuration file logback.conf:

input {
    file {
        path => "/var/lib/tomcat8/logback/*.log"
        codec => "json"
        type => "logback"
    }
}

output {
    if [type] == "logback" {
        elasticsearch {
            hosts => [ "localhost:9200" ]
            index => "logback-%{+YYYY.MM.dd}"
        }
    }
}

Note that:

  • We use the file input, since this time Logstash will read the logs from log files
  • path is set to our logging directory, so all files with the .log extension will be processed
  • index is set to a new index, "logback-%{+YYYY.MM.dd}", instead of the default "logstash-%{+YYYY.MM.dd}"

To run Logstash with the new configuration, we'll use:

bin/logstash -f logback.conf
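
Once Logstash is running, a quick way to confirm that documents are reaching the new index is Elasticsearch's _count endpoint. Here's a rough sketch using the JDK 11+ HttpClient, assuming the same localhost:9200 host and logback-* index pattern as in the configuration above (a plain curl call works just as well):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogbackIndexCheck {

    public static void main(String[] args) throws Exception {
        // ask Elasticsearch how many documents the logback-* indices hold
        HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("http://localhost:9200/logback-*/_count"))
          .GET()
          .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString());

        // prints something like {"count":123, ...}
        System.out.println(response.body());
    }
}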

4. Visualize Logs Using Kibana

We can now see our Logback data in the 'logback-*' index.

We'll create a new saved search, 'Logback logs', to keep the Logback data separate, using the following query:

type:logback

Finally, we can create a simple visualization of our Logback data:

  • Navigate to the 'Visualize' tab
  • Choose 'Vertical Bar Chart'
  • Choose 'From Saved Search'
  • Choose the 'Logback logs' search we just created

For the Y-axis, make sure to choose Aggregation: Count

For the X-axis, choose:

  • Aggregation: Terms
  • Field: level

After running the visualization, you should see multiple bars representing the count of logs per level (DEBUG, INFO, ERROR, …).
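
The chart is essentially a terms aggregation on the level field. If we want the same numbers programmatically, a sketch like the one below queries the _search endpoint directly; depending on the field mapping, the field may need to be level.keyword instead of level:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogLevelCounts {

    public static void main(String[] args) throws Exception {
        // same terms aggregation that backs the X-axis of the Kibana chart
        String query = "{ \"size\": 0, \"aggs\": { \"levels\": { \"terms\": { \"field\": \"level\" } } } }";

        HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("http://localhost:9200/logback-*/_search"))
          .header("Content-Type", "application/json")
          .POST(HttpRequest.BodyPublishers.ofString(query))
          .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString());

        // the "buckets" array in the response holds one entry per level with its doc_count
        System.out.println(response.body());
    }
}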

5. Conclusion

In this article, we learned the basics of setting up Logstash to push the log data our application generates into Elasticsearch, and of visualizing that data with the help of Kibana.
