Logs not being sent if multiple fields in grok pattern

Hi! I have the following filter, which works fine:

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}" }
  }

  date {
    match => [ "syslog_timestamp",  "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
    target => "@timestamp"
  }
}

But if I change the grok filter to this:

grok {
  match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}: %{GREEDYDATA:syslog_message}" }
}

It does not work anymore: the logs no longer show up in Kibana. Why does this happen? I checked the pattern in a Grok Debugger and it matches. These are some sample logs:

Feb  7 12:35:56 hostnamexxxxxxxxxx systemd: Created slice User Slice of root.
Feb  7 12:30:01 hostnamexxxxxxxxxx systemd: Started Session 9520 of user root.
Feb  7 12:12:59 hostnamexxxxxxxxxx systemd-logind: Removed session 3893.

What could be the issue here?

If Logstash is processing events but they are not reaching Elasticsearch, then I cannot think of any case in which there would NOT be an error in the Logstash logs.

I would start by using output { stdout { codec => rubydebug } } to verify that grok is working, then verify that the events are reaching Elasticsearch using the document counts, then figure out why the documents do not show up in your current Kibana search window.
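As a sketch, a debug output section could look like this (the elasticsearch settings are placeholders; keep whatever your pipeline already uses):

output {
  # Print every event to the console. If grok did not match, the
  # event's tags array will contain "_grokparsefailure".
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder, use your real hosts
  }
}

If the printed events carry the _grokparsefailure tag, the pattern is not matching the live messages even though it matched in the debugger. For the document counts, running GET _cat/indices?v in Kibana's Dev Tools console shows docs.count per index.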
