Multiple Grok Filters

Hey there, I've got a question about my grok filter...
I'm dealing with log files that look like this:

2017-05-23 14:41:09 DEBUG [09-exec-38] PreparedStatement (27 ) - {pstm-100637} Executing Statement: SELECT wert1 AS value FROM z_var WHERE varnr=?

They are collected by Filebeat and sent to Elasticsearch after filtering.
My Logstash config file looks like this, and the grok filter works fine at the moment:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:Thread}\] %{DATA:Method} \(%{DATA:Line}\) \- %{GREEDYDATA:logmsg}" }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

Now I'm dealing with 2 issues:

  1. Sometimes the message area of the log file contains something like "Login 52632" when a student tries to log in to the managed platform. Is it possible to create a field like "Matrikelnummer" (student ID number) or "Login" only if there is a match in the given event? As you can see in the filter, I'm already capturing the whole message area with GREEDYDATA.

  2. When I view the events in Kibana, the timestamp shown is the one created by Filebeat when the event was indexed. The timestamp I'm extracting from the original event is only mapped as a string. How can I change it to the type "date" so I can analyze and work with the timestamp?

I hope you understand my problems, and I'm sorry for the bad English!
Best regards from Germany

Is it possible to only create a field like "Matrikelnummer" or "Login" if there is a match in the given event?

You can use a second grok filter that matches against the logmsg field. You'll want to set tag_on_failure to an empty list to avoid getting a _grokparsefailure tag on many or most events.
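As a rough sketch, something like this; the Matrikelnummer field name and the Login %{NUMBER} pattern are guesses based on your sample message, so adjust them to the real format:

filter {
  grok {
    # Runs against the logmsg field produced by the first grok filter.
    match => { "logmsg" => "Login %{NUMBER:Matrikelnummer}" }
    # Empty list: don't tag events that simply don't contain a login line.
    tag_on_failure => []
  }
}

Events whose logmsg doesn't match just pass through without the Matrikelnummer field and without a failure tag.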

When I view the events in Kibana, the timestamp shown is the one created by Filebeat when the event was indexed. The timestamp I'm extracting from the original event is only mapped as a string. How can I change it to the type "date" so I can analyze and work with the timestamp?

Use a date filter to parse the timestamp field and store it in the @timestamp field. The @timestamp field should always be a date field.
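For your sample line that would look roughly like this (untested, and it assumes every timestamp field really has the yyyy-MM-dd HH:mm:ss shape):

date {
  # Parse the string extracted by grok and overwrite @timestamp with it.
  match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
  target => "@timestamp"
}

If parsing fails, the event gets a _dateparsefailure tag, which makes bad timestamps easy to spot.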

Thanks for the fast response. First I tried to solve the timestamp issue, but unfortunately now I'm getting a _grokparsefailure tag on every event. Here is my filter config:

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:Thread}\] %{DATA:Method} \(%{DATA:Line}\) \- %{GREEDYDATA:logmsg}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}

Shouldn't it be a _dateparsefailure? If I remove the date filter again, everything works fine. I'm a bit confused.

Date filters never add _grokparsefailure tags, so I don't know what's up. I suggest you verify that you're really using the configuration you think you're using. Do you have any extra files in /etc/logstash/conf.d, for example? Logstash reads all files in that directory.

No, there's only one config file in conf.d... This behaviour is very strange. I thought there could be a syntax error because of misplaced brackets, but I can't figure out where. The date filter should be inside the filter section, parallel to the grok filter.
Could there be an error because I use a different timestamp syntax? TIMESTAMP_ISO8601 versus yyyy-MM-dd HH:mm:ss? But technically they should describe the same format...
