Suricata timestamp problem

Hey all,

I am sending Suricata JSON to Logstash, and Logstash is having problems with the timestamp format:

{:timestamp=>"2016-08-02T18:28:37.516000+0200", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"2016-08-02T18:28:36.001531+0200", :exception=>"Invalid format: "2016-08-02T18:28:36.001531+0200" is malformed at "-08-02T18:28:36.001531+0200"", :config_parsers=>"YYYY MMM dd HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}
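Note that the value in the warning carries six fractional digits. A quick sanity check (in Python here, since Joda patterns are awkward to test outside Logstash) confirms the field is microsecond-precision:

```python
from datetime import datetime

# The timestamp from the warning above: six fractional digits plus a
# numeric UTC offset, i.e. microsecond precision, not millisecond.
ts = "2016-08-02T18:28:36.001531+0200"

# %f consumes up to six fractional digits, %z the +0200 offset.
parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")

print(parsed.microsecond)  # 1531 -> the fraction is 001531, six digits
```

So a match pattern that only covers three fractional digits (`SSS`) cannot consume the whole value.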

My Logstash config for Suricata is as follows (I just added the timezone option as a last-ditch effort, which didn't work):

input {
  file {
    path => ["/var/log/suricata/eve.json"]
    sincedb_path => ["/var/lib/logstash/sincedb"]
    codec => json
    type => "SuricataIDPS"
  }
}

filter {
  if [type] == "SuricataIDPS" {
    date {
      match => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
      timezone => "Europe/Berlin"
    }
    ruby {
      code => "if event['event_type'] == 'fileinfo'; event['fileinfo']['type']=event['fileinfo']['magic'].to_s.split(',')[0]; end;"
    }
  }

  if [src_ip] {
    geoip {
      source => "src_ip"
      target => "geoip"
      #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
    if ![geoip][ip] {
      if [dest_ip] {
        geoip {
          source => "dest_ip"
          target => "geoip"
          #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    #protocol => http
  }
}

Thanks for the help,
Chuck

That pattern does not match; you may want to add more S digits to cover the fractional seconds.

Sorry,

I forgot to mention that I also tried the following patterns:

yyyy-MM-dd'T'HH:mm:ss.SSSSSS
yyyy-MM-dd'T'HH:mm:ss.SSSZ
yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ

All to no avail. But what I am mostly wondering about is the

config_parsers=>"YYYY MMM dd HH:mm:ss"

entry in the Logstash log. This entry looks NOTHING like the pattern I am trying to match with, and I am not sure why it is even listed.

So,

After starting Logstash with only the Suricata config, accepting input from stdin, and piping the contents of the JSON log into it, I can verify that the Suricata config is correct using the match

  match => [ "timestamp", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ" ]

The problem appears when starting with the normal config set and including the Suricata config: Logstash tries to match against a pattern from one of the other config files instead of the pattern in the Suricata config, even though the type is set to SuricataIDPS when reading in the JSON logs. (Logstash concatenates every file in the config directory into a single pipeline, so any date filter that is not guarded by a type conditional runs against the Suricata events as well, which is where the mysterious "YYYY MMM dd HH:mm:ss" parser comes from.)
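As a toy illustration of why that happens (a Python simulation, not Logstash itself; the file names and the syslog-style pattern are made up for the example), every unguarded date filter in the concatenated pipeline takes a shot at every event:

```python
from datetime import datetime

# Logstash merges every file in conf.d into ONE pipeline, so an unguarded
# date filter from another file runs against Suricata events too.
# Hypothetical setup: a syslog config with no `if [type]` conditional,
# plus the Suricata config (shown unguarded to mirror the merged pipeline).
filters = [
    ("syslog.conf",   "%b %d %H:%M:%S"),            # ~ "YYYY MMM dd HH:mm:ss"
    ("suricata.conf", "%Y-%m-%dT%H:%M:%S.%f%z"),    # ~ "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"
]

event_ts = "2016-08-02T18:28:36.001531+0200"
results = {}
for name, pattern in filters:
    try:
        results[name] = datetime.strptime(event_ts, pattern)
    except ValueError:
        # This is the filter that logs "Failed parsing date from field".
        results[name] = None

# The syslog-style pattern fails (producing the puzzling warning),
# while the Suricata pattern parses the timestamp cleanly.
```

Guarding every date filter with an `if [type] == "..."` conditional keeps each one scoped to its own events.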
