Dataparse failures

Hi, I'm getting _dateparsefailure errors because of an invalid timestamp format:

fortigate_subtype:%{subtype} host:192.168.1.1 logTimestamp:%{date} %{time} @version:1 syslog5424_pri:188 fortigate_type:syslog message:%COPY-W-TRAP: The mirror-config file is illegal due to failure of previous copy operation/s to mirror-config. tags:_dateparsefailure syslog_index:<188> @timestamp:July 8th 2018, 11:39:14.487 loglevel:%{level} _id:_P1CeWQB1oi6Il_pcDdF _type:doc _index:logstash-syslog _score: -

This is my configuration:

input {
  udp {
    port => 5514
    type => syslog
  }

  tcp {
    port => 5514
    type => syslog
  }
}

filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
    overwrite => ["message"]
    tag_on_failure => ["failure_grok_fortigate"]
  }

  kv { }

  if [msg] {
    mutate {
      replace => ["message", "%{msg}"]
    }
  }

  mutate {
    add_field => ["logTimestamp", "%{date} %{time}"]
    add_field => ["loglevel", "%{level}"]
    replace => ["fortigate_type", "%{type}"]
    replace => ["fortigate_subtype", "%{subtype}"]
    remove_field => ["msg", "type", "level", "date", "time"]
  }

  date {
    locale => "en"
    match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss"]
    remove_field => ["logTimestamp", "year", "month", "day", "time", "date"]
    add_field => ["type", "syslog"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-syslog"
  }
  stdout { codec => rubydebug }
}

The logTimestamp field contains "%{date} %{time}" which obviously can't be parsed. This indicates that there were no date and time fields when you created the logTimestamp field. When were those two fields supposed to be created?

Thanks for your answer. You're right, %{date} and %{time} were based on the assumption that those fields exist. Basically, I'm looking for the right syntax to get the date and time parsed.

There's no reason to make any assumptions when it's so easy to observe reality. Use a stdout { codec => rubydebug } output to make Logstash dump the raw events it processes. If you do that I'm pretty sure you'll note that you don't have any of the fields you think you have. The only thing your grok filter does is strip the syslog priority prefix string from the syslog payload. You're not extracting the timestamp or anything else into fields. The Logstash documentation has a page with a couple of complete configuration examples, one of them of syslog parsing.
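For a plain syslog line like the one in your event, a minimal sketch would be to grok the timestamp out of the message and feed it to the date filter. This assumes a standard RFC 3164 timestamp at the start of the payload; adjust the pattern to whatever rubydebug shows you actually receive:

```
filter {
  grok {
    # Strip the priority, capture the syslog timestamp, keep the rest as the message.
    match => ["message", "%{SYSLOG5424PRI:syslog_index}%{SYSLOGTIMESTAMP:syslog_timestamp} %{GREEDYDATA:message}"]
    overwrite => ["message"]
  }

  date {
    # RFC 3164 timestamps come in both single- and double-digit day forms.
    match => ["syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
    remove_field => ["syslog_timestamp"]
  }
}
```

The key point is that a field has to be captured by grok (or kv) before a later filter can reference it with %{...}.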

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.