Grok parse failure syslog

Hi there, I've set up syslog forwarding on my firewall, and I'm getting this error:

fortigate_subtype:%{subtype} host:192.168.100.100 logTimestamp:%{date} %{time} @version:1 syslog5424_pri:188 fortigate_type:syslog message:%COPY-W-TRAP: The mirror-config file is illegal due to failure of previous copy operation/s to mirror-config. tags:_dateparsefailure syslog_index:<188> @timestamp:September 26th 2018, 03:36:21.257 loglevel:%{level} _id:AySFE2Y

I tried to work out the filter as much as possible with tools like the Grok Debugger on herokuapp and such, but I got stuck and now it's driving me crazy :slight_smile:

Of course, the config :slight_smile:

input {
        udp {
                port => 5514
                type => syslog
        }

        tcp {
                port => 5514
                type => syslog
        }
}

filter {

            grok {
                    patterns_dir => ["/etc/logstash/patterns"]
                    match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
                    overwrite => [ "message" ]
                    tag_on_failure => [ "failure_grok_fortigate" ]
            }

            kv { }

            if [msg] {
                    mutate {
                            replace => [ "message", "%{msg}" ]
                    }
            }

            mutate {
                    add_field => ["logTimestamp", "%{date} %{time}"]
                    add_field => ["loglevel", "%{level}"]
                    replace => [ "fortigate_type", "%{type}"]
                    replace => [ "fortigate_subtype", "%{subtype}"]
                    remove_field => [ "msg", "type", "level", "date", "time" ]
            }
            date {
                    locale => "en"
                    match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss"]
                    remove_field => ["logTimestamp", "year", "month", "day", "time", "date"]
                    add_field => ["type", "syslog"]
            }

}

output {
elasticsearch { hosts => ["localhost:9200"] index => "logstash-syslog" }
stdout { codec => rubydebug }
}

And the debug output:

[2018-09-26T00:00:00,204][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>#<LogStash::Event:0x466511c5>}

[2018-09-26T00:00:00,204][DEBUG][logstash.filters.grok ] Event now: {:event=>#<LogStash::Event:0x466511c5>}
[2018-09-26T00:00:00,205][DEBUG][logstash.filters.grok ] Running grok filter {:event=>#<LogStash::Event:0x3b12fa0a>}
[2018-09-26T00:00:00,205][DEBUG][logstash.filters.grok ] Event now: {:event=>#<LogStash::Event:0x3b12fa0a>}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"logTimestamp", "value"=>["%{date} %{time}"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"loglevel", "value"=>["%{level}"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"msg"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"type"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"level"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"date"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"time"}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"logTimestamp", "value"=>["%{date} %{time}"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"loglevel", "value"=>["%{level}"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"msg"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"type"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"level"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"date"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"time"}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::date: adding value to field {"field"=>"type", "value"=>["syslog"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"logTimestamp"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"year"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"month"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"day"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"time"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"date"}
[2018-09-26T00:00:00,206][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::date: adding value to field {"field"=>"type", "value"=>["syslog"]}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"logTimestamp"}
[2018-09-26T00:00:00,206][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"year"}
[2018-09-26T00:00:00,207][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"month"}
[2018-09-26T00:00:00,207][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"day"}
[2018-09-26T00:00:00,207][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"time"}
[2018-09-26T00:00:00,207][DEBUG][logstash.filters.date ] filters/LogStash::Filters::date: removing field {:field=>"date"}

I presume "/etc/logstash/patterns" holds custom patterns; if so, what are they?

The dir is empty...

Then the pattern you are using, SYSLOG5424PRI, will only pull out the syslog5424_pri field.

None of the fields like date, time, level, or subtype are grokked out.
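To illustrate (a sketch against your sample event, assuming the standard pattern set — this is not output from a real run):

```
# SYSLOG5424PRI matches only the leading <PRI> token:
#   <188>%COPY-W-TRAP: The mirror-config file is illegal ...
#
# so the grok stage produces:
#   syslog_index   => "<188>"
#   syslog5424_pri => "188"
#   message        => "%COPY-W-TRAP: The mirror-config file is illegal ..."
#
# Everything else (%{date}, %{time}, %{level}, ...) stays a literal
# sprintf reference, because those fields were never created.
```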

Yes, hmmm, so how do I get these then? By adding patterns? What should the syntax be?

Try using SYSLOG5424LINE as in:
match => ["message", "%{SYSLOG5424LINE}"]

It is defined as:
SYSLOG5424LINE %{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}

You can see all the patterns via the vendor folder in your LS installation... vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/linux-syslog

Thanks, I modified this line:

match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]

Into:

match => ["message", "%{SYSLOG5424LINE}"]

Next I copied the "linux-syslog" and "firewalls" pattern files from vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/ into the /etc/logstash/conf.d/logstash.syslog.conf file and restarted the logstash service.

Now I'm getting this error message in logstash-plain.log:

[2018-09-26T21:47:54,525][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"cluster_block_exception", "reason"=>"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];"})

and in Kibana:

fortigate_subtype: %{subtype}
host: 192.168.1.1
logTimestamp: %{date} %{time}
@version: 1
syslog5424_pri: 188
fortigate_type: syslog
message: %COPY-W-TRAP: The mirror-config file is illegal due to failure of previous copy operation/s to mirror-config.
tags: _dateparsefailure
syslog_index: <188>
@timestamp: September 26th 2018, 03:36:21.257
loglevel: %{level}
_id: AySFE2Y
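As an aside: that 403 cluster_block_exception usually means Elasticsearch crossed its flood-stage disk watermark and flipped the index to read-only. Once disk space is freed, the block has to be cleared manually, e.g. (using the index name from your output section):

```
PUT logstash-syslog/_settings
{
  "index.blocks.read_only_allow_delete": null
}
```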

Fixed the Logstash problem (it was storage), and got the msg! :

    poluuid:b474b1b8-fbcd-51e7-eda8-2ea24b836fcd dstintf:internal3 tranip:192.168.100.150 tranport:8000 sentpkt:6 fortigate_type:traffic srcport:64130 logTimestamp:%{date} 13:18:08 loglevel:notice devid:FGT400C12345678
service:TCP_8000 appcat:unscanned <189>date:2018-09-26 srcip:62.163.43.53 trandisp:dnat subtype:forward sentbyte:1179 policyid:5 @timestamp:Sep

Many thanks!
OK, this is almost readable now; how can I tidy up the date and timestamp?

You don't need to copy the patterns - the grok filter knows how to find them as they are the standard pattern set.

That grok pattern, SYSLOG5424LINE, expects a TIMESTAMP_ISO8601 timestamp format, which is defined as:
TIMESTAMP_ISO8601 %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?

This should appear as the field syslog5424_ts.
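For example, a fully RFC 5424-shaped line that SYSLOG5424LINE would match looks roughly like this (an illustrative line, not from your device):

```
<188>1 2018-09-26T03:36:21.257+02:00 fw01 fortigate - - - date=2018-09-26 time=03:36:21 level=warning
```

Note the version digit (1) and the ISO8601 timestamp directly after the <PRI>. If your firewall isn't sending that shape, SYSLOG5424LINE won't match at all.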
Please post a scrubbed example of your incoming data.

Hi there, sorry for the delay, this should be the raw data coming in:

1: date=2018-10-17 time=03:19:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=52120 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22161176 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

2: date=2018-10-17 time=03:14:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=49390 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160929 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

3: date=2018-10-17 time=03:13:28 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.160 srcport=35729 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160852 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

4: date=2018-10-17 time=03:09:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=33107 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160715 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

5: date=2018-10-17 time=03:04:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=47123 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160462 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

6: date=2018-10-17 time=02:59:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=40293 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160240 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

7: date=2018-10-17 time=02:58:00 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.10 srcport=47648 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22160098 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

8: date=2018-10-17 time=02:54:48 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=51858 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159984 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

9: date=2018-10-17 time=02:49:47 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.150 srcport=49577 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159743 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low

10: date=2018-10-17 time=02:45:58 logid=0000000011 type=traffic subtype=forward level=warning vd=root srcip=10.10.1.160 srcport=60164 srcintf="internal3" dstip=8.8.8.8 dstport=53 dstintf="wan1" poluuid=45245042-ccf9-51e6-28f5-f8fa6131fcdf sessionid=22159509 proto=17 action=dns policyid=1 appcat="unscanned" crscore=5 craction=262144 crlevel=low
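Given that this is plain key=value data with separate date and time fields, a minimal filter sketch for it could look like this (untested; field names taken from your samples, and the <PRI> handling kept from your original config):

```
filter {
        # Strip the leading <PRI> token; keep the rest as the message
        grok {
                match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
                overwrite => ["message"]
        }

        # Extract date=..., time=..., level=... and friends
        kv { }

        # Only build and parse the timestamp when kv actually produced the fields
        if [date] and [time] {
                mutate {
                        add_field => { "logTimestamp" => "%{date} %{time}" }
                }
                date {
                        match => ["logTimestamp", "yyyy-MM-dd HH:mm:ss"]
                        remove_field => ["logTimestamp", "date", "time"]
                }
        }
}
```

The conditional avoids the literal %{date} %{time} values seen earlier, which appear when sprintf references point at fields that were never created.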

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.