Hi there,
I am trying to get a proof of concept up and running with ELK to see whether it could replace our Splunk installation. There is a particular log line in our Tomcat application that gets logged for each "transaction": the text "LOG:: SOMEMORETEXT" followed by a JSON string with many key/value pairs. The pairs vary between transactions, i.e. some transactions log a given key and others do not.
I need the timestamp, the thread name, the GUID, and every key/value pair inside the braces to be indexed. I was able to get the JSON extraction working when I changed the app to log only the JSON, using the filter configuration below, but I am finding it very difficult to make progress on my end goal.
Could anyone please provide me with some assistance?
== Sample Log Line ==
LOG::2016-03-09 18:00:18,721 [http-nio-8080-exec-2] INFO [3aec689cf6e74c88b1ae0c40887cdfd5 StandardLifeCycle] - JSONDOC{"GUID":"3aec689cf6e74c88b1ae0c40887cdfd5","timestamp":"20160309180056","host":"server1","category":"5","txn_time":"10","customer":"test"}
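For reference, this is how I read the line's structure as a grok pattern. The field names (log_timestamp, thread, guid, lifecycle, json_payload) are just my own guesses, and my full attempt is sketched at the end of this post:

LOG::%{TIMESTAMP_ISO8601:log_timestamp} \[%{DATA:thread}\] %{LOGLEVEL:loglevel} \[%{WORD:guid} %{WORD:lifecycle}\] - JSONDOC%{GREEDYDATA:json_payload}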
== Sample Conf ==
input
{
  beats {
    port => 5044
  }
}
filter
{
  mutate
  {
    # Strip embedded newlines so a multi-line JSON document arrives as one string.
    gsub => [ 'message', '\n', '' ]
  }
  # Only attempt JSON parsing when the whole message is a JSON object.
  if [message] =~ /^{.*}$/
  {
    json { source => "message" }
  }
}
output
{
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "json"
  }
  stdout { codec => rubydebug }
}
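Below is the direction I have been experimenting with for the full log line, though I have not got it working yet. It is only a sketch: the field names match my guesses above, and the date pattern assumes every line uses the comma-millisecond format from the sample.

== Sketch Conf (full line) ==
filter
{
  # Split the fixed prefix into fields and capture the JSON document.
  grok
  {
    match => { "message" => "LOG::%{TIMESTAMP_ISO8601:log_timestamp} \[%{DATA:thread}\] %{LOGLEVEL:loglevel} \[%{WORD:guid} %{WORD:lifecycle}\] - JSONDOC%{GREEDYDATA:json_payload}" }
  }
  # Parse the captured JSON so each key/value pair becomes its own field,
  # whichever keys a given transaction happens to log.
  if [json_payload]
  {
    json
    {
      source => "json_payload"
      remove_field => [ "json_payload" ]
    }
  }
  # Use the log line's own timestamp as @timestamp.
  date
  {
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}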