Still problems with grok parsing

Hi there,

I'm still having problems parsing with grok. This is the raw data coming in:

1: date=2018-10-17 time=03:19:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=52120 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22161176 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

2: date=2018-10-17 time=03:14:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=49390 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160929 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

3: date=2018-10-17 time=03:13:28 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.160 srcport=35729 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160852 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

4: date=2018-10-17 time=03:09:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=33107 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160715 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

5: date=2018-10-17 time=03:04:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=47123 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160462 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

6: date=2018-10-17 time=02:59:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=40293 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160240 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

7: date=2018-10-17 time=02:58:00 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.10 srcport=47648 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160098 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

8: date=2018-10-17 time=02:54:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=51858 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159984 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

9: date=2018-10-17 time=02:49:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=49577 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159743 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

10: date=2018-10-17 time=02:45:58 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.160 srcport=60164 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159509 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

These are the error messages:

failure_grok_fortigate, _dateparsefailure
host:

I would recommend using a kv filter instead of grok if all your logs follow the same format.
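For reference, a minimal kv filter for this format might look like the sketch below (untested; the messages are space-separated `key=value` pairs, and the kv filter handles quoted values such as `srcintf="internal3"` on its own):

```
filter {
  kv {
    source      => "message"   # parse the raw event text
    value_split => "="         # key and value are separated by "="
    field_split => " "         # pairs are separated by spaces
  }
}
```

`value_split => "="` and `field_split => " "` are also the plugin defaults, so a bare `kv { }` behaves the same way.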

Thanks for your answer! What would the kv filter look like for this raw data?
I now have this configured:

input {
  udp {
    port => 5514
    type => syslog
  }

  tcp {
    port => 5514
    type => syslog
  }
}


filter {
  grok {
    patterns_dir   => ["/etc/logstash/patterns"]
    match          => ["message", "%{SYSLOG5424LINE}"]
    overwrite      => [ "message" ]
    tag_on_failure => [ "failure_grok_fortigate" ]
  }

  kv { }

  if [msg] {
    mutate {
      replace => [ "message", "%{msg}" ]
    }
  }

  mutate {
    add_field    => ["logTimestamp", "%{date} %{time}"]
    add_field    => ["loglevel", "%{level}"]
    replace      => [ "fortigate_type", "%{type}"]
    replace      => [ "fortigate_subtype", "%{subtype}"]
    remove_field => [ "msg", "type", "level", "date", "time" ]
  }

  date {
    locale => "en"
    # lowercase "yyyy" is the Joda calendar year; uppercase "YYYY" is
    # the week-based year and can be off around New Year
    match        => ["logTimestamp", "yyyy-MM-dd HH:mm:ss"]
    remove_field => ["logTimestamp", "year", "month", "day", "time", "date"]
    add_field    => ["type", "syslog"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-syslog"
  }
  stdout { codec => rubydebug }
}
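Note that the FortiGate lines above are not RFC 5424 syslog, so `%{SYSLOG5424LINE}` will never match them and the grok stage always fails with `failure_grok_fortigate`. A sketch of a filter that drops grok entirely and relies on kv plus a date filter (field names follow the config above; guarding on `[date]` and `[time]` avoids `_dateparsefailure` when a pair is missing):

```
filter {
  kv { source => "message" }

  if [date] and [time] {
    mutate {
      add_field => { "logTimestamp" => "%{date} %{time}" }
    }
    date {
      locale       => "en"
      match        => ["logTimestamp", "yyyy-MM-dd HH:mm:ss"]
      remove_field => ["logTimestamp", "date", "time"]
    }
  }
}
```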

Eh anyone?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.