Hello
I have a field named "message" that contains the string:
"GET1: +992545989898 GET2: 365981022808151-821eaec1d4b27cc974c97e11a2343f705c0c4612-1492310893103"
The field contains a phone number, an ID, hex data, and an epoch timestamp (UNIX_MS) at the end.
Somehow the m_date field is added, but it is recognized as a string instead of a date.
I also want to replace @timestamp with this m_date field.
Thanks
This is my filter:
filter {
grok {
match => {
"message" => "GET1:\s\++(?<m_num>\d+)\sGET2:\s(?<m_id>\d+)-[a-fA-F\d]+-(?<m_date>\d+)"
}
}
date {
match => ["m_date","UNIX_MS","UNIX"]
timezone => "UTC"
}
}
Do you really need to keep m_date if you store the timestamp in @timestamp?
No, but how do I convert it? How do I store 'm_date' as '@timestamp'?
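For reference, the date filter's target option defaults to @timestamp, so with the grok pattern above a sketch like the following should overwrite @timestamp; remove_field is optional cleanup and only fires if the parse succeeds:

```
date {
  match        => ["m_date", "UNIX_MS"]
  target       => "@timestamp"   # this is already the default
  timezone     => "UTC"
  remove_field => ["m_date"]     # dropped only when the date parse succeeds
}
```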
If the date filter can't parse a string it'll tell you why in the Logstash log.
That's the weird part: I see it in Kibana as a string, but there is no error in /var/log/logstash/logstash-plain.log.
It looks like it is ignoring it, or overriding it...
Can I define the match on the 'message' field inside the 'date' filter without using grok?
Or maybe there is another way (I couldn't do it from the console with Painless).
Maybe you can point me to other options I should check.
Thank you very much
PS
Can I convert a string from epoch time to a date in Kibana?
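On the "without grok" question: the date filter itself can only parse a whole field, not extract a substring, but the dissect filter can pull the epoch out without any regex. A sketch, assuming the message format is fixed (the field names m_num, m_id, m_date are just illustrative):

```
filter {
  # dissect splits on the literal delimiters in the mapping;
  # %{?hex} captures and discards the hex section
  dissect {
    mapping => {
      "message" => "GET1: +%{m_num} GET2: %{m_id}-%{?hex}-%{m_date}"
    }
  }
  date {
    match => ["m_date", "UNIX_MS"]
  }
}
```

Dissect tends to be faster than grok when the layout is this predictable, since it does no pattern matching.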