Logstash - Timestamp Normalization

Hi Everyone,
Versions first, to get them out of the way (I am not sure how best to write this question up):
Wazuh Manager: 3.4.0
Filebeat and Logstash: 6.3.2
Elasticsearch: 6.3.2

I have loaded the Wazuh templates as per the install guide and have stumbled on a strange issue. I think I am having an issue with the Logstash config in regard to dates.

Basically, in normal operation, the OSSEC agent is sending items with this as a timestamp:
"predecoder":{"program_name":"WinEvtLog","timestamp":"2018 Aug 10 11:08:45"}
Everything was working correctly...

I then added our Cisco ASA to the system and things started to break. This is what the ASA is sending:
"predecoder":{"timestamp":"2018-08-10T11:57:07-04:00"}

This is the error that I am getting from Logstash. (I know that Elasticsearch is rejecting the message.)

[2018-08-15T10:00:43,919][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"wazuh-alerts-3.x-2018.08.15", :_type=>"wazuh", :_routing=>nil}, #<LogStash::Event:0x3760052b>], :response=>{"index"=>{"_index"=>"wazuh-alerts-3.x-2018.08.15", "_type"=>"wazuh", "_id"=>"P1njPWUBsE0Dsfl4fsp3", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [predecoder.timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018 Aug 15 09:00:42\" is malformed at \" Aug 15 09:00:42\""}}}}}

What I am seeing:
Indexes are being created daily using the Wazuh template, and it seems that if the ASA is the first one to send a log item, the index gets created using the ASA timestamp format of "2018-08-10T11:57:07-04:00". Then the system can't accept dates that have the name of the month in them ("Aug"). But if the WinEvtLog item comes in first, the dates are accepted.

So my question is this.
In the normal Logstash config there is this item:
date {
  match => ["timestamp", "ISO8601"]
  target => "@timestamp"
}

This date filter only matches the ASA format, but I don't understand how the WinEvtLog date is working.

Can you provide me some guidance on how I can deal with these two date formats?
thanks
Corey

Either

  • use a date filter to parse the [predecoder][timestamp] field into @timestamp and delete [predecoder][timestamp] afterwards (a sketch follows below), or
  • use a date filter to parse the [predecoder][timestamp] field and store the result back to [predecoder][timestamp].
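
A minimal, untested sketch of the first option, assuming the two timestamp formats shown above ("yyyy MMM dd HH:mm:ss" is the Joda-time pattern for the WinEvtLog timestamp and assumes English month abbreviations):

filter {
  date {
    # Try the ASA's ISO8601 timestamp first, then the WinEvtLog format.
    match => ["[predecoder][timestamp]", "ISO8601", "yyyy MMM dd HH:mm:ss"]
    target => "@timestamp"
  }
  mutate {
    # Drop the raw field so Elasticsearch never tries to map it as a date.
    remove_field => ["[predecoder][timestamp]"]
  }
}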

Thanks for the reply, and please forgive my ignorance :")
I think I should have included my entire filter section:

filter {
  geoip {
    source => "@src_ip"
    target => "GeoLocation"
    fields => ["city_name", "country_name", "region_name", "location"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host"]
  }
}

In the remove_field, I tried to remove [predecoder][timestamp] by changing it to
remove_field => [ "predecoder.timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host"]
But I am clearly missing something.

How should this be worded?

Thanks again for the support!

remove_field => [ "predecoder.timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host"]

Change to:

remove_field => [ "[predecoder][timestamp]", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host"]

The syntax for referencing nested fields is described here: Accessing event data and fields | Logstash Reference [8.11] | Elastic
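
If you want the second option mentioned earlier instead (keeping the normalized value in place rather than removing it), the same nested-field syntax applies. A rough, untested sketch using the same assumed formats as above:

date {
  # Parse both formats and write the normalized timestamp back into the field.
  match => ["[predecoder][timestamp]", "ISO8601", "yyyy MMM dd HH:mm:ss"]
  target => "[predecoder][timestamp]"
}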

Thank you for the response, as that was a BIG part of what I was missing.
I did get it to remove predecoder.timestamp with the information you provided. I am now going to try your second suggestion and see if I can normalize the date on my own...

Corey
