Logs are not displaying in order

Hi All,

I have configured ELK locally with Layer 7 logs taken from the gateway. I can see the logs in the Kibana dashboard, but they are not displayed in order. The log below is one complete entry, but Kibana is splitting it into multiple lines.

2017-03-28T10:23:59.877+0200 INFO 339 com.l7tech.log.custom.Policy: -4: LmyZhzUf0KcAAA5tHjsAAAGI, Auth request validation succeeded: {
"service" : "/servicefunctions/a/v1",
"resource" : "/TGD53579841A014354/acx/userprofiles?correlationid=fw57238rf-9eb0-4c60-bf12-fskfyhasf&trackingid=fslifysfyeofy&TrackingId=falfyeqlfyqeofyqI",
"requestMethod" : "GET",
"ciamid" : "",
"connectid" : "f361289bb98c4821aa8523c32d60740f",
"caller" : "xxx"
}.

How can I display the above log in Kibana the same way it appears on my gateway?
Please suggest any filters that may be required in Logstash.

Regards
Raja

Sounds like you want to use the multiline codec to join multiple physical lines into a single logical event.
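As a minimal sketch (the path is just a placeholder, and this assumes your events start with an ISO8601 timestamp, as your sample does):

input {
  file {
    path => "/var/log/gateway/*.log"   # placeholder path
    codec => multiline {
      # any line that does NOT begin with a timestamp is joined
      # onto the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}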

Thanks Magnus,

Could you please provide some guidance on how to make the configuration?

Regards
Raja

See https://www.elastic.co/guide/en/logstash/current/multiline.html.

Thank you Magnus,

I have gone through the linked page and made the configuration, but I am still not seeing the logs as a single event. Please find the configuration below.

input {
  file {
    path => "/etc/logstash/*.log"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {"messge" => "%COMBINEDAPACHELOG}"}
  }
  date {
    match => ["timestamp","dd/MM/yyyy:HH:mm:ss Z"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "logstash_internal"
    password => "changeme"
  }
  stdout { codec => rebydebug }
}

Regards
Raja

Your multiline codec looks reasonable. I don't know what's up here.

Okay Magnus, thanks for the confirmation.

Could someone help me with this?

Regards
Raja

I can see a couple of spelling mistakes in the file (messge and rebydebug). Your log is also not a COMBINEDAPACHELOG, so I doubt your grok expression will work. What output are you seeing? What do the resulting events look like when output to stdout with a rubydebug codec?
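For comparison, with those two typos fixed the relevant parts would read like this (the grok pattern itself is left unchanged here, even though it most likely won't match your log):

filter {
  grok {
    # "message" (not "messge") is the field that holds the raw log line
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  stdout {
    # "rubydebug" (not "rebydebug") prints each event as a readable hash
    codec => rubydebug
  }
}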

Hi Christian,

Thanks for your reply,
The configuration file has the correct spellings; some characters got lost while copying and pasting.
Yes, my log is not a COMBINEDAPACHELOG, but I don't know what to put in the grok expression. Please guide me on this.

I am seeing the logs in the Kibana dashboard, but not event-wise; they are shown row-wise. As I said above, I need the complete event to be displayed in each entry.

Regards
Raja

Have you read through the documentation? When building grok expressions it is often recommended to do so gradually. Start with a simple expression that captures the first field and uses a GREEDYDATA pattern to capture the rest. Test it, then gradually expand the pattern until all data has been captured.
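As an illustrative sketch (the field names here are my own choices, not anything your log mandates), a first iteration could be just:

grok {
  # capture the leading timestamp, dump everything else into "rest"
  match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:rest}" }
}

Once that matches, extend it one field at a time, for example:

grok {
  # timestamp, log level, a number, the Java logger class, then the remainder
  match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{NUMBER:thread} %{JAVACLASS:logger}: %{GREEDYDATA:rest}" }
}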

As you have a JSON object at the end, you may want to capture all fields up to that point and store the whole JSON object in a field of its own, so that you can apply the json filter to it after the grok filter has completed.
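A rough sketch of that idea, assuming the JSON always starts at the first "{" of the event ("json_part" and "auth_request" are hypothetical names):

grok {
  # (?m) lets "." span the newlines the multiline codec joined together;
  # the gateway's trailing "." after the closing brace is left outside the capture
  match => { "message" => "(?m)(?<json_part>\{.*\})\.$" }
}
json {
  # parse the captured object into its own fields
  source => "json_part"
  target => "auth_request"   # hypothetical target field
}

Events without a JSON payload will get a _grokparsefailure tag from this second grok, so you may want to guard it with a conditional.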

Thank you Christian,

I have gone through the docs and made the configuration, but while building the grok and multiline codec configuration I do not understand how to proceed. So I posted the question here to get a working configuration, but it still isn't working out.

My main goal is to display the logs per event, not per row.

Regards
Raja

If you have gone through the documentation and have started building your grok pattern(s), you must surely have something other than the configuration file above that you can share?

Hi Christian,

Below is the grok pattern I have made now, but the logs are still displayed the same way; they are not being shown as multiline events.

input {
  file {
    path => "/etc/logstash/*.log"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {"message" => "%{SPACE}%{NOTSPACE}%{SPACE}%{GREEDYDATA}"}
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "logstash_internal"
    password => "changeme"
  }
  stdout { codec => rebydebug }
}

The above grok pattern is meant to reflect how the logs appear on the gateway. Please let me know if anything is wrong.

Regards
Raja

When debugging a Logstash config it often helps to remove the Elasticsearch output and work with the representation written to stdout using the rubydebug codec (still not spelled correctly in your config). Please show us an example event written to stdout so we can see exactly what the events you are talking about look like.
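That is, while debugging, the whole output section can be reduced to just:

output {
  # print every event to the console in a readable form
  stdout { codec => rubydebug }
}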

Sorry Christian,

That was copied over from the previous configuration. Please find the exact configuration below.

input {
  file {
    path => "/etc/logstash/*.log"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => previous
    }
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {"message" => "%{SPACE}%{NOTSPACE}%{SPACE}%{GREEDYDATA}"}
  }
  date {
    match => ["timestamp","dd/MMM/yyyy:HH:mm:ss Z"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "logstash_internal"
    password => "changeme"
  }
}

My log looks like below.

2017-03-28T10:23:59.876+0200 INFO 339 com.l7tech.log.custom.PepPolicy: -4: , LmyZhzUf0KcAAA5tHjsAAAGI, Common headers/url parameters are valid. TrackingId: LmyZhzUf0KcAAA5tHjsAAAGI, ApplicationName: ABC.
2017-03-28T10:23:59.877+0200 INFO 339 com.l7tech.log.custom.PepPolicy: -4: , LmyZhzUf0KcAAA5tHjsAAAGI, Generated cacheKey from request.
2017-03-28T10:23:59.877+0200 INFO 339 com.l7tech.log.custom.PepPolicy: -4: , LmyZhzUf0KcAAA5tHjsAAAGI, Auth request validation succeeded: {
"service" : "/int/mototrs/service/va/v1",
"resource" : "/motors/WDD2132641A014354/adx/userprofiles?correlationid=806ca322-9eb0-4c60-bf12-ae2edf24f533&trackingid=LmyZh5afjahfhaGI&TrackingId=LmyZhfagfiafhyaiAAGI",
"requestMethod" : "GET",
"ciamid" : "",
"connectid" : "f361289bb98c4821aa8523c32d60740f",
"caller" : "xxx"
}.
2017-03-28T10:23:59.877+0200 INFO 339 com.l7tech.log.custom.PepPolicy: -4: , LmyZhzUf0KcAAA5tHjsAAAGI, Found cache entry with value '{"granted":true,"reason":"User is whitelisted. - Grant Context: [request_method=GET], [service=/int/motor/service/va/v1], [resource=/motors/WDD2132641A014354/idx/userprofiles?correlationid=e5h52w5-5ed3-81c1-83db-e97hif8b77d1&trackingid=LmyZhzUf0KcAAA5tHjsAAAGI&TrackingId=LmyZhzUf0KcAAA5tHjsAAAGI], [ciam_id=], [connect_id=[value=f361289bb98cfa683hajdf32d60740f,age=0ms]], [tracking_id=LmyZhzUf0fwlkffuyGI], [application_name=ABC], [drd_uid=ABC_PERS_00]","obligations":{},"advices":{"cacheable":true,"cacheTimeToLiveSeconds":120}}'

Regards
Raja

And what do you get from a stdout { codec => rubydebug } output?

The pattern in your date filter doesn't match reality so the @timestamp field (used by Kibana for sorting) won't get the correct value from the log. Try "ISO8601" instead of "dd/MMM/yyyy:HH:mm:ss Z".
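In other words, assuming your grok filter actually captures the leading timestamp into a field named "timestamp" (e.g. with %{TIMESTAMP_ISO8601:timestamp}; your current pattern does not), the date filter would be:

date {
  # parse the ISO8601 timestamp into @timestamp
  match => ["timestamp", "ISO8601"]
}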

Please replace the elasticsearch output with stdout as described earlier and show us the output from this. This makes debugging a lot quicker and easier.

Hi Christian,

My output looks like below in Kibana.

and the output configuration is as below:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "logstash_internal"
    password => "changeme"
  }
  stdout { codec => rubydebug }
}

Regards
Raja

Please copy and paste the output from the stdout output you have configured. This will show us the structure and content of your events and make it all much easier to debug.
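For a file input, each event printed by rubydebug has roughly this shape (the values here are placeholders, not your actual data):

{
       "message" => "2017-03-28T10:23:59.876+0200 INFO 339 ...the whole event...",
      "@version" => "1",
    "@timestamp" => "...",
          "path" => "/etc/logstash/your.log",
          "host" => "..."
}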

Hi Magnus,

I have added the timestamp like below:

date {
  match => {"timestamp" , "ISO8601"}
}

I don't see any difference.

Regards
Raja