Tuesday, 25 February 2020

ELK Demo on Windows with Simple Java and Spring Boot Services


Why Centralized Logging (ELK) in a Distributed System?

Log consolidation or log streaming: which approach have you chosen for your microservices architecture?

(In my project, I chose log consolidation: every service has its own log file, Logstash is pointed at that file, reads it, and puts the entries into Elasticsearch.)

(In log streaming, services do not maintain log files; they push the data immediately into a stream, for example Apache Kafka.)

1) In a monolithic application we generally restrict logging in production/development because of the file size it generates.

2) In a distributed system, assume transaction T1 involves 3 microservices:
    1) 1st microservice writes its own logs - India data center
    2) 2nd microservice writes its own logs - American data center
    3) 3rd microservice writes its own logs - Germany data center

How will you trace a transaction across those logs? It's very difficult, so the solution is centralized logging.
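A common way to make such cross-data-center traces workable, even before centralizing the logs, is to stamp every log line of a transaction with a shared correlation ID. A minimal sketch, assuming nothing from the demo itself (the `log` helper and the service names are purely illustrative):

```java
import java.util.UUID;

public class CorrelationDemo {

    // Illustrative helper: prefix every log line with the transaction id,
    // so all lines of one transaction can be found with a single search.
    static String log(String txId, String service, String message) {
        return String.format("[tx=%s] [%s] %s", txId, service, message);
    }

    public static void main(String[] args) {
        // One id is generated at the edge and passed to every downstream service.
        String txId = UUID.randomUUID().toString();
        System.out.println(log(txId, "order-service", "order received"));
        System.out.println(log(txId, "payment-service", "payment authorized"));
        System.out.println(log(txId, "shipping-service", "shipment created"));
    }
}
```

With such a prefix in place, searching Kibana for one `tx=` value returns the full transaction across all three data centers.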




The components of the ELK stack are:
Elasticsearch (2.3.5) – search and analyze data in real time.
Logstash (2.3.4) – collect, enrich, and transport data.
Kibana (4.5.3) – explore and visualize data.

Install JDK 8 and set the environment variable JAVA_HOME to the JDK directory.

Create a folder, D:\ELK, to keep all the ELK components grouped in one place.

Configure Elasticsearch
Elasticsearch Folder : D:\ELK\elasticsearch-2.3.5
To make the Elasticsearch REST API available only within the ELK machine, we need one modification in the “D:\ELK\elasticsearch-2.3.5\config\elasticsearch.yml” file:
Find the line:
 #network.host: 192.168.0.1
Replace it with:
network.host: 127.0.0.1

Open a Command Prompt in the bin folder and run:   D:\ELK\elasticsearch-2.3.5\bin>elasticsearch.bat

Now the Elasticsearch service is running on localhost:9200.
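To verify that Elasticsearch (and later the Logstash listener) is actually up, a quick port check is enough. A small self-contained sketch (the class and method names are my own, not part of the demo):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isOpen(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            // connection refused, timeout, or unresolvable host
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("elasticsearch 9200 open: " + isOpen("127.0.0.1", 9200, 500));
    }
}
```

The same check with port 5050 can be reused after starting Logstash below.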

Configure Logstash
Logstash folder: “D:\ELK\logstash-2.3.4”.
Now create a config file named logback-listener.conf and put it under “D:\ELK\logstash-2.3.4\bin”.
logback-listener.conf:

input {
  tcp {
    port => 5050
    codec => "json"
    mode => "server"
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout { }
}
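The `index => "logstash-%{+YYYY.MM.dd}"` setting means Logstash writes each day's events into a separate, date-stamped Elasticsearch index. The resulting index name can be sketched in plain Java (the class is illustrative; only the naming pattern comes from the config above):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexName {

    // Mirrors Logstash's "logstash-%{+YYYY.MM.dd}" date pattern:
    // events from a given day land in an index named after that day.
    static String indexFor(LocalDate date) {
        return "logstash-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor(LocalDate.of(2020, 2, 25))); // prints logstash-2020.02.25
    }
}
```

Daily indices make it cheap to delete or archive old logs, one index at a time, instead of deleting individual documents.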

There are two use cases here:
Simple Java program – using logback-listener.conf
Spring Boot microservice – using logstash.conf

Open a Command Prompt and run:
D:\ELK\logstash-2.3.4\bin> logstash -f logback-listener.conf

The Logstash TCP listener is now on port 5050.
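You can smoke-test the listener without any Logback setup by writing one JSON object per line to the socket, which is what the `json` codec expects. A hedged sketch (the class and the event fields are my own, chosen for illustration):

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class ManualLogEvent {

    // Builds a single-line JSON event; the tcp input with codec => "json"
    // treats each newline-terminated object as one log event.
    static String jsonEvent(String level, String message) {
        return "{\"level\":\"" + level + "\",\"message\":\"" + message + "\"}\n";
    }

    public static void main(String[] args) {
        String event = jsonEvent("INFO", "manual test event");
        try (Socket socket = new Socket("127.0.0.1", 5050);
             OutputStream out = socket.getOutputStream()) {
            out.write(event.getBytes(StandardCharsets.UTF_8));
            System.out.println("event sent");
        } catch (Exception e) {
            System.out.println("logstash not reachable on 5050: " + e.getMessage());
        }
    }
}
```

If Logstash is running, the event appears on its stdout and in Elasticsearch under the current day's logstash-* index.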

Configure Kibana
 Kibana folder:  “D:\ELK\kibana-4.5.3-windows”.
 Modify the “D:\ELK\kibana-4.5.3-windows\config\kibana.yml” file as follows.
Find the line:
server.host: "0.0.0.0"
Replace it with:
server.host: "127.0.0.1"


Simple Java Program Example:

Required JARs:
cal10n-api-0.8.1.jar
logback-classic-1.1.7.jar
logback-core-1.1.7.jar
jackson-core-2.10.2.jar
jackson-databind-2.10.2.jar
jackson-annotations-2.10.2.jar
logstash-logback-encoder-4.7.jar – provides LogstashTcpSocketAppender, a specialized log appender created by the Logstash team and distributed as a Maven dependency. This library has all the necessary logic to send your log messages, formatted as JSON, to a remote server over the TCP protocol.


Once the JARs above are available on our classpath, we can configure Logback using logback.xml.


<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="true">
    <appender name="stash"
              class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:5050</destination>
        <!-- encoder is required -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>
    <root level="DEBUG">
        <appender-ref ref="stash" />
    </root>
</configuration>

The example below is taken from eloquentdeveloper.com.


import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * @author eloquentdeveloper.com
 */
public class MyClass {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyClass.class);

    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 10; i++) {
            LOGGER.info("New customer successfully registered");
            LOGGER.warn("User password will expire in two days");
            LOGGER.error("Billing system is not available");
            Thread.sleep(200);
        }
    }
}

Ramesh's Analysis:
Here logback.xml plays the key role: it appends all logs to the Logstash TCP listener at 127.0.0.1:5050.
Once the user runs the application, the logs are appended to the Logstash TCP input, and Logstash then puts them into Elasticsearch.
Kibana then connects to Elasticsearch to show the results.


Check the log entries using Kibana




Simple Java Class Summary:
The Java application sends log events to Logstash; Logstash listens on TCP port 5050, accepts the input, and redirects it to the Elasticsearch output. Kibana then queries Elasticsearch.


Use case 2: Spring Boot Microservices (Refer: https://www.javainuse.com/spring/springboot-microservice-elk)

Please refer to javainuse for the Spring Boot project used in this demo; there is no need to change our ELK stack.


  • Now stop Logstash
  • Now create the new logstash.conf:
input {
  file {
    type => "java"
    path => "C:/elk/spring-boot-elk.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  #If log line contains tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }

}

output {
  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
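The `multiline` codec glues stack-trace lines onto the log event that precedes them, and the `grok` filter tags any line containing a tab followed by "at" as a stacktrace. Both rely on the shape Java gives stack traces: every frame line starts with a tab and "at". A small sketch that reproduces that shape (the class and `render` helper are illustrative, mimicking what `printStackTrace` emits):

```java
public class StacktraceShape {

    // Renders a throwable the way printStackTrace does: the first line is the
    // exception itself, and every following frame line starts with "\tat ".
    static String render(Throwable t) {
        StringBuilder sb = new StringBuilder(t.toString()).append('\n');
        for (StackTraceElement frame : t.getStackTrace()) {
            sb.append("\tat ").append(frame).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String formatted = render(new IllegalStateException("Billing system is not available"));
        System.out.print(formatted);
    }
}
```

Because only the first line matches the `^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}` timestamp pattern of a normal log entry, the `negate`/`what => "previous"` settings fold all the `\tat` frame lines into the preceding event instead of indexing each as a separate document.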


  • Keep logstash.conf under D:\ELK\logstash-2.3.4\bin
  • Open a Command Prompt and run: D:\ELK\logstash-2.3.4\bin> logstash -f logstash.conf
  • Now Logstash takes its input from C:/elk/spring-boot-elk.log and outputs to Elasticsearch


Testing the Spring Boot Services and Checking Kibana

Open the browser at http://localhost:8080/elk; the logs for this request are stored in C:/elk/spring-boot-elk.log.
Logstash takes C:/elk/spring-boot-elk.log as its input, then outputs to Elasticsearch.
Kibana then connects to Elasticsearch.









