Parsing Syslog with Logstash Grok Filter isn't working with Kibana

Hi,

I have created a very basic grok filter to parse Cisco syslogs:

input {
  # Listen for router syslog messages on UDP port 5140
  udp {
    port => 5140
    type => syslog
  }
}

filter {
  # Pull a timestamp, a source IP, and the remaining text out of the raw message
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:Timestamp_Local} %{IPV4:IP_Address} %{GREEDYDATA:Event}" }
  }
}

output {
  # Write events to a daily index in the local Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    index => "ciscologs-%{+YYYY.MM.dd}"
  }
}

After reloading Logstash and verifying that the logs showed no major issues, I reloaded Kibana and refreshed the index pattern.
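
(For reference, the pipeline file can also be checked for syntax errors before reloading with Logstash's built-in config test; the file path below is just an example:)

bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/cisco-syslog.conf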

When I opened the Discover section, I saw that the index had indeed been created. But looking at the fields, they were the default ones and not the ones defined in the grok filter.
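
To double-check which fields actually made it into the index, the mapping can also be queried directly, for example:

curl -s 'localhost:9200/ciscologs-*/_mapping?pretty'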

The logs received after adding the filter show the _grokparsefailure tag in Kibana.

Before adding the filter I made sure the pattern worked using Kibana's Grok Debugger.
The tag indicates that there was a problem parsing the logs, but at this point I'm not sure where the issue might be, so any help would be appreciated.
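
For completeness, this is the kind of line the pattern does match in the Grok Debugger (a made-up sample, not one of my actual router logs):

2023-01-15T09:30:00 192.168.1.1 Interface GigabitEthernet0/1, changed state to up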

You specify an IPV4 in the grok filter, but your log doesn't contain an IP address.
You also specify TIMESTAMP_ISO8601, but your log doesn't contain an ISO 8601 timestamp.
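
A raw Cisco IOS syslog line arriving over UDP typically looks more like this (a made-up example; the exact layout depends on the router's service timestamps and sequence-number settings):

<189>45: *Jan 15 09:30:00.123: %LINK-3-UPDOWN: Interface GigabitEthernet0/1, changed state to up

So a pattern built from the CISCOTIMESTAMP and CISCOTAG patterns that ship with Logstash would be a better starting point. A sketch, assuming a line shaped like the sample above:

filter {
  grok {
    # Sketch only: drop or adjust the priority and sequence parts to
    # match what actually arrives in the "message" field.
    match => { "message" => "<%{NONNEGINT:priority}>%{NONNEGINT:sequence}: \*?%{CISCOTIMESTAMP:timestamp}: %%{CISCOTAG:tag}: %{GREEDYDATA:event}" }
  }
}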

Thanks, in fact I found the problem. I was trying to match the syslog format as shown on the router, not the actual content of the "message" field. After modifying that, the filter worked just fine.
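
In case it helps anyone else: the quickest way to see what actually lands in "message" is to temporarily print events to stdout and write the pattern against that output, e.g.:

output {
  # Temporary debug output: prints each event, including the raw
  # "message" field that the grok pattern has to match.
  stdout { codec => rubydebug }
}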
