Creating a Logstash Filter Using the Grok Plugin

I'm trying to build a grok filter for my Logstash config.

This is a sample of a standard event I'm trying to tag.

[2016-01-05T23:59:23.352-0700] [glassfish 4.1] [INFO] [] [edu.utah.acs.student.classinformation.model.ClassScheduleBean] [tid: _ThreadID=113 _ThreadName=http-listener-2(26)] [timeMillis: 1452063563352] [levelValue: 800] [[
Attempting to pull classes for instructor with emplid 06004130 and terms (1158, 1164)]]

I want to create certain tags for parts of this log event.
Tags:
  • Timestamp: [2016-01-05T23:59:23.352-0700]
  • Server Version: [glassfish 4.1]
  • Event Level: [INFO]
  • [] (an empty field)
  • Message: [[edu.utah.acs.student.classinformation.model.ClassScheduleBean] [tid: _ThreadID=113 _ThreadName=http-listener-2(26)] [timeMillis: 1452063563352] [levelValue: 800] [[
Attempting to pull classes for instructor with emplid 06004130 and terms (1158, 1164)]]

What have you got so far?

Use "field". "tag" has a another meaning in the Logstash context.

I'm trying to use the grok debugger tool but I can only get the DATE_US pattern to work.

Here is what I have so far...

filter {
  grok {
    type => syslog
    pattern => "%{DATE_US} %{WORD:Server_Version} %{WORD:SEVERITY} %{WORD:Unknown} %{WORD:application_name} %{WORD:message}"
  }
}

That is to be applied to the following sample log event:

[2015-12-31T23:56:56.654-0700] [glassfish 4.1] [FINE] [] [edu.utah.acs.student.coupledapps.tuitionbill.ejb.TuitionBillServicesBean] [tid: _ThreadID=375 _ThreadName=http-listener-2(139)] [timeMillis: 1451631416654] [levelValue: 500] [CLASSNAME: edu.utah.acs.student.coupledapps.tuitionbill.ejb.TuitionBillServicesBean] [METHODNAME: getTuitionTotal] [[
Amount=0.0]]

  • "2015-12-31T23:56:56.654-0700" simply doesn't match DATE_US. Try TIMESTAMP_ISO8601 instead.
  • "glassfish 4.1" won't match WORD because it's two words and not one. I don't think "4.1" counts as WORD either because of the period.
  • There are lots of square brackets in the log message but it doesn't seem like you're including any of them in your expression. Don't forget that they need to be escaped with backslashes; see the sketch after this list.
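Putting those three points together, a rough sketch of a pattern for the leading fields might look like this (the field names server_version, log_level, and message_id are illustrative choices, and the pattern would still need to be extended to cover the rest of the line):

filter {
  grok {
    # WORD is used for the level because GlassFish levels like FINE
    # aren't covered by the stock LOGLEVEL pattern.
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:server_version}\] \[%{WORD:log_level}\] \[%{DATA:message_id}\] \[%{JAVACLASS:class}\]" }
  }
}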

Here is the grok filter I currently have in place, passing events to Elasticsearch.

My question now is: how do I get those fields to populate in Kibana? I don't see any of the fields I matched in my message pattern.

filter {
  if [type] == "redis" {
    grok {
      match => { "message" => "^[%{TIMESTAMP_ISO8601:timestamp}] [%{DATA:server_version}] [%{DATA:log_level}] [%{DATA:unknown}] [%{JAVACLASS:class}] [%{DATA:thread}] [%{DATA:category}] [%{DATA:loglevel}] [%{DATA:classname}] [%{DATA:methodname}] [[$" }
      add_field => [ "received_at", "%{@timestamp}" ]
    }
  }
}

Never mind Kibana for now. Use a stdout { codec => rubydebug } output first to make sure your messages look as expected.
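For example, a minimal output section along those lines (alongside, or temporarily instead of, the elasticsearch output):

output {
  # Print every event with all of its fields to the console,
  # so you can verify the grok captures before involving Kibana.
  stdout { codec => rubydebug }
}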

Please format your configuration as code (there's a toolbar button for it) so that we can see exactly what your configuration looks like.

While I don't think it's the cause of your problems, having more than one DATA or GREEDYDATA pattern in the same expression is asking for trouble.
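A sketch of one way around that (illustrative, not taken from your config): replace the inner DATA patterns with a custom capture that matches anything except a closing bracket, so each field is clearly delimited:

filter {
  grok {
    # (?<name>[^\]]*) matches up to, but not including, the next "]",
    # which is far less ambiguous than several DATA patterns in a row.
    match => { "message" => "^\[%{TIMESTAMP_ISO8601:timestamp}\] \[(?<server_version>[^\]]*)\] \[(?<log_level>[^\]]*)\]" }
  }
}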

So it is working now.

I had inputs/outputs in three different locations, each labeled with a different type in the config. Some were syslog, some were redis, and one was log.

I changed all of them to log.
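In other words, the type set on each input has to match the type tested in the filter conditional. A minimal sketch of the consistent version (the file path is illustrative):

input {
  file {
    path => "/var/log/glassfish/server.log"  # illustrative path
    type => "log"                            # must match the conditional below
  }
}

filter {
  if [type] == "log" {
    grok {
      match => { "message" => "^\[%{TIMESTAMP_ISO8601:timestamp}\]" }
    }
  }
}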