Not displaying fields specified in the Logstash filter

Hey,

I just created an index using Elasticsearch and Logstash, but the fields I've specified in the Logstash grok filter aren't showing up in Kibana.

Here's my filter -

input {
  file {
    path => "C:/logstash-1.5.4/slt.log"
    type => "sample"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP_RFC2822:time} %{NUMBER:temp} %{NUMBER:light} %{USERNAME:node}" }
  }
}

output {
  elasticsearch {
    protocol => "http"
  }
  stdout {}
}

Please see the attached screenshot, where all the fields end up condensed into the "message" field.

Any help would be appreciated!

I am not sure I understand the problem. The message field has a timestamp, two numbers, and a username. Could you provide an example of what you want the message field to look like?

I want the fields 'time', 'temp', 'light', and 'node' to be extracted from the message, just as I've specified in the grok filter. Any idea how that can be done?

You didn't specify that in the grok filter.

Try something like this:

filter {
  grok { match => { "message" => [ "Duration: %{NUMBER:duration}", "Speed: %{NUMBER:speed}" ] } }
}

For more, refer to the documentation.
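Note that when you give grok an array of patterns like this, it tries each expression in turn and keeps the captures from the first one that matches (break_on_match defaults to true), so each pattern has to match the event on its own rather than being stitched together.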

Hey,

Still getting the same output, with no splitting up of the 'message' field. See the attached image.

Here's my new config -

input {
  file {
    path => "C:/logstash-1.5.4/TempLight.log"
    type => "sample"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => [ "Time: %{DATESTAMP_RFC2822:time}", "Temp: %{NUMBER:temp}", "Light: %{NUMBER:light}", "Room: %{USERNAME:room}" ] }
  }
}

output {
  elasticsearch {
    protocol => "http"
  }
  stdout {}
}

Any suggestions?

If you would post copy/pasteable text instead of screenshots, it would be easier to help you. While debugging, I strongly recommend that you use a simple stdout { codec => rubydebug } output. Adding ES and Kibana prematurely is a source of confusion and a general waste of time.
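For example, something like this while you iterate on the filter (swap the elasticsearch output back in once the fields look right):

output {
  # Print every event in full so you can see exactly which fields grok produces.
  stdout { codec => rubydebug }
}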

But as the _grokparsefailure tag indicates, your grok expression isn't working, and comparing your expression to the original message makes it easy to see why. If we pretend for a moment that DATESTAMP_RFC2822 matches your timestamp, you'd want something like

%{DATESTAMP_RFC2822:time} %{NUMBER:temp} %{NUMBER:light} %{GREEDYDATA:room}

but before that actually works you have to find an expression that matches your timestamp. DATESTAMP_RFC2822 is close, but the timezone and the comma throw it off. Perhaps

%{DAY} %{MONTH:month} %{MONTHDAY:monthday} %{YEAR:year} %{TIME:time} GMT(?<tzoffset>[+-]\d\d:\d\d) \([^)]+\)

would work? Afterwards you'll have to assemble the separate fields into a single field that you can feed to the date filter. Or maybe the whole timestamp would be acceptable to the date filter, come to think of it.
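An untested sketch of the first approach, reusing the field names captured above ("timestamp" is just an arbitrary name for the reassembled field):

filter {
  mutate {
    # Stitch the captured pieces back into a single field...
    add_field => { "timestamp" => "%{month} %{monthday} %{year} %{time} %{tzoffset}" }
  }
  date {
    # ...and parse it with a matching Joda-Time pattern ("ZZ" accepts an offset like +05:30).
    match => [ "timestamp", "MMM dd yyyy HH:mm:ss ZZ" ]
  }
}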

Moving this to #logstash.

Still showing grok parse failure.

My filter is -

filter {
  grok {
    match => { "message" => "%{DAY} %{MONTH:month} %{MONTHDAY:monthday} %{YEAR:year} %{TIME:time} GMT(?<tzoffset>[+-]\d\d:\d\d) \([^)]+\)" }
  }
}

Sample log -

Mon Feb 22 2016 14:56:38 GMT+0530 (IST) 24.519828605774933 7.023411371237459 SERVER ROOM

Please help!

Update: The grok filter is now the following -

filter {
  grok {
    match => { "message" => "%{DAY} %{MONTH:month} %{MONTHDAY:monthday} %{YEAR:year} %{TIME:time} GMT(?<tzoffset>[+-]\d\d:\d\d) \([^)]+\) %{NUMBER:temp} %{NUMBER:light} %{GREEDYDATA:room}" }
  }
}

Still no improvement.

The issue has been resolved. I had to add a '\s' after every parsed parameter in the grok syntax, which accounts for the whitespace between fields.
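The pattern ended up looking roughly like this (a sketch; note that the offset in the sample line has no colon, hence [+-]\d{4}):

filter {
  grok {
    match => { "message" => "%{DAY}\s%{MONTH:month}\s%{MONTHDAY:monthday}\s%{YEAR:year}\s%{TIME:time}\sGMT(?<tzoffset>[+-]\d{4})\s\([^)]+\)\s%{NUMBER:temp}\s%{NUMBER:light}\s%{GREEDYDATA:room}" }
  }
}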

I've attached a screenshot of the Kibana 'Discover' tab for these logs.

Can you please share the corrected grok syntax?