This is the setup:
I'm running ELK locally with a Java Spring Boot app that writes logs to a file called elk-logs.log.
I can see the logs in Kibana when I first run the Spring Boot app, but after the initial load I don't see any new logs. The logs are being written to the log file (elk-logs.log) in real time, but I believe Logstash is not reading them properly.
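As a sanity check on the app side, I can confirm the file really is growing and keeps the same inode while new lines are appended (a quick sketch, assuming macOS/BSD stat since this is a Homebrew install; the file input tracks files by inode in its sincedb):

# Watch the file grow while hitting the endpoints; the inode should stay the same
stat -f "inode=%i size=%z" /Users/coinflex/sandbox/elk-demo/elk-logs.log
tail -f /Users/coinflex/sandbox/elk-demo/elk-logs.log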
logstash.conf:
input {
  file {
    type => "java"
    path => "/Users/coinflex/sandbox/elk-demo/elk-logs.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}
filter {
  # If log line contains tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
  grok {
    match => [ "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
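For reference, a minimal debugging variant of the input block (not what I'm running above) would disable sincedb tracking, read the file from the beginning, and flush a buffered multiline event after a few seconds of inactivity; these are all standard file-input/multiline-codec options:

input {
  file {
    type => "java"
    path => "/Users/coinflex/sandbox/elk-demo/elk-logs.log"
    start_position => "beginning"   # only applies to files Logstash has not seen before
    sincedb_path => "/dev/null"     # don't persist read offsets between runs (debugging only)
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
      auto_flush_interval => 5      # emit a pending multiline event after 5s without a new line
    }
  }
}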
Logstash log:
[elk-demo]$ logstash -f logstash.conf
Sending Logstash logs to /usr/local/Cellar/logstash-full/7.7.1/libexec/logs which is now configured via log4j2.properties
[2020-06-17T13:35:25,645][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-06-17T13:35:25,773][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.7.1"}
[2020-06-17T13:35:28,114][INFO ][org.reflections.Reflections] Reflections took 36 ms to scan 1 urls, producing 21 keys and 41 values
[2020-06-17T13:35:30,917][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-06-17T13:35:31,160][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-06-17T13:35:31,252][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-06-17T13:35:31,257][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-06-17T13:35:31,351][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2020-06-17T13:35:31,402][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-06-17T13:35:31,555][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-06-17T13:35:31,714][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-06-17T13:35:31,720][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/Users/coinflex/sandbox/elk-demo/logstash.conf"], :thread=>"#<Thread:0x6408d58d run>"}
[2020-06-17T13:35:33,055][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/local/Cellar/logstash-full/7.7.1/libexec/data/plugins/inputs/file/.sincedb_78ed3a686c174383ea40710c53eb335c", :path=>["/Users/coinflex/sandbox/elk-demo/elk-logs2.log"]}
[2020-06-17T13:35:33,086][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-06-17T13:35:33,180][INFO ][filewatch.observingtail ][main][04f5a2d419a3a9933b66a645795c68846c8574668ff5844da1e11f62e4874972] START, creating Discoverer, Watch with file and sincedb collections
[2020-06-17T13:35:33,180][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-06-17T13:35:33,533][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
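Since the last line says the Logstash API endpoint is up on port 9600, one thing I can check while hitting the app is whether the pipeline's event counters actually move (standard Logstash node stats API):

# in/out counters per plugin for the "main" pipeline; they should grow as new log lines arrive
curl -s "localhost:9600/_node/stats/pipelines/main?pretty"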
The Spring Boot app is really simple: just a main ElkDemoApplication class and an ELKController class with a few routes that I use to trigger the logger after visiting localhost:8080/
ElkDemoApplication.java:
package com.elk.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.security.servlet.SecurityAutoConfiguration;

@SpringBootApplication(exclude = SecurityAutoConfiguration.class)
public class ElkDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(ElkDemoApplication.class, args);
    }
}
ElkController.java:
package com.elk.demo.controller;

import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Date;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpMethod;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
class ELKController {

    private static final Logger LOG = Logger.getLogger(ELKController.class.getName());

    @Autowired
    RestTemplate restTemplete;

    @Bean
    RestTemplate restTemplate() {
        return new RestTemplate();
    }

    @RequestMapping(value = "/elkdemo")
    public String helloWorld() {
        String response = "Hello user ! " + new Date();
        LOG.log(Level.INFO, "/elkdemo - > " + response);
        return response;
    }

    @RequestMapping(value = "/elk")
    public String helloWorld1() {
        String response = restTemplete.exchange("http://localhost:8080/elkdemo", HttpMethod.GET, null,
                new ParameterizedTypeReference<String>() {
                }).getBody();
        LOG.log(Level.INFO, "/elk - > " + response);
        try {
            String exceptionrsp = restTemplete.exchange("http://localhost:8080/exception", HttpMethod.GET, null,
                    new ParameterizedTypeReference<String>() {
                    }).getBody();
            LOG.log(Level.INFO, "/elk trying to print exception - > " + exceptionrsp);
            response = response + " === " + exceptionrsp;
        } catch (Exception e) {
            // exception should not reach here. Really bad practice :)
        }
        return response;
    }

    @RequestMapping(value = "/exception")
    public String exception() {
        String rsp = "";
        try {
            int i = 1 / 0;
            // should get exception
        } catch (Exception e) {
            e.printStackTrace();
            LOG.error(e);
            StringWriter sw = new StringWriter();
            PrintWriter pw = new PrintWriter(sw);
            e.printStackTrace(pw);
            String sStackTrace = sw.toString(); // stack trace as a string
            LOG.error("Exception As String :: - > " + sStackTrace);
            rsp = sStackTrace;
        }
        return rsp;
    }
}
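To trigger the logger I just hit the routes, e.g. like this (assuming the app is started from the elk-demo directory, so the relative fileAppender path below ends up at /Users/coinflex/sandbox/elk-demo/elk-logs.log):

# each call appends new lines to elk-logs.log right away
curl "http://localhost:8080/elkdemo"
curl "http://localhost:8080/elk"
curl "http://localhost:8080/exception"

The new lines appear in the file immediately, which is what makes me think the problem is on the Logstash side.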
log4j.properties:
log4j.rootLogger=DEBUG, consoleAppender, fileAppender
log4j.appender.consoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.consoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.consoleAppender.layout.ConversionPattern=[%t] %-5p %c %x - %m%n
log4j.appender.fileAppender=org.apache.log4j.RollingFileAppender
log4j.appender.fileAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.fileAppender.layout.ConversionPattern=[%t] %-5p %c %x - %m%n
log4j.appender.fileAppender.File=elk-logs.log
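For comparison, the multiline codec and grok patterns in logstash.conf are anchored on lines that start with a yyyy-MM-dd HH:mm:ss.SSS timestamp (they seem written for the Spring Boot default log layout), while the ConversionPattern above starts with [%t]. A log4j pattern that would at least produce that timestamp prefix would look roughly like this (illustrative only; the grok patterns additionally expect the pid, ---, and [thread] pieces):

log4j.appender.fileAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p %c %x - %m%n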