Hi,
We are collecting a large amount of logs throughout the day and then sending them at 4:30am via Filebeat in a bulk send. The problem is that when they end up in Elasticsearch/Kibana, all the @timestamp fields are 4:30am even though the logs were collected at different times throughout the day. How do I set the @timestamp field to what's actually in the message field?
So I want to use "message":"03-MAR-21 02:19:42|" as the @timestamp instead of 4:30am.
In Filebeat you could do this with the timestamp processor (still in beta); in Logstash you would use a date filter.
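For the Filebeat route, here is a minimal sketch, assuming the timestamp is the first pipe-delimited token of the default message field; event_time and rest are field names made up for this example:

processors:
  # Pull the leading timestamp out of the pipe-delimited message.
  # By default the dissect processor writes under the "dissect" prefix.
  - dissect:
      tokenizer: "%{event_time}|%{rest}"
      field: "message"
  # Parse it and overwrite @timestamp. Layouts use Go's reference time,
  # so "02-Jan-06 15:04:05" matches "03-MAR-21 02:19:42" (month-name
  # matching is case-insensitive, so MAR is fine).
  - timestamp:
      field: "dissect.event_time"
      layouts:
        - "02-Jan-06 15:04:05"
      test:
        - "03-MAR-21 02:19:42"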
I think I am close, but when I add the below I get a date parse failure.
filter {
  grok {
    match => { "message" => [ "%{GREEDYDATA:timestamp}\|%{IPV4:src.ip}\|%{WORD:event.id}\|%{GREEDYDATA:dn}\|%{NOTSPACE:session_id}\|%{GREEDYDATA:application_name}" ] }
  }
  date {
    match => [ "timestamp", "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
    target => "@timestamp"
  }
}
So the Filebeat event always comes in with "@timestamp":"2021-03-04T17:17:56.133Z", but I want to use what's in the message field, "message":"03-MAR-21 03:46:10", as the @timestamp.
But I get "tags":["_dateparsefailure"].
ISO8601 uses a CasualISO8601Parser, which defers to Joda for what ISO8601 means, and Joda does not agree that "03-MAR-21 03:46:10" is ISO8601 compliant.
Just change your date filter to use the actual format of your timestamp:
date { match => [ "timestamp", "dd-MMM-yy HH:mm:ss" ] }
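Putting it together with your grok, a sketch of the corrected pipeline; the locale and timezone settings are assumptions worth making explicit, since MMM has to match English month abbreviations like MAR, and the parsed times are treated as UTC here:

filter {
  grok {
    match => { "message" => [ "%{GREEDYDATA:timestamp}\|%{IPV4:src.ip}\|%{WORD:event.id}\|%{GREEDYDATA:dn}\|%{NOTSPACE:session_id}\|%{GREEDYDATA:application_name}" ] }
  }
  date {
    # "03-MAR-21 02:19:42" is two-digit day, abbreviated month, two-digit year
    match => [ "timestamp", "dd-MMM-yy HH:mm:ss" ]
    target => "@timestamp"
    locale => "en"        # assumption: English month abbreviations
    timezone => "UTC"     # assumption: source times are UTC; adjust if not
  }
}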