I have some working timestamps detected as dates in Elasticsearch.
But I can't make this one work. Grok is matching well, and Logstash is parsing the line well: 2017-05-29 10:27:41,756 | soft-1 | user1 | | 10.x.x.x | INFO | intranet | Order 9979a662-b42a-414d-9d5c-5cxxxxxxxx creation TESTING request is finished
For some reason, [log][timestamp] appears as a string, even after deleting everything or forcing a mutate to date.
How can I troubleshoot that? Using Logstash debug doesn't show me the resulting data type (I tried in debug/verbose mode): output { stdout { codec => rubydebug } }
From this output, is there a way to see the format of the field log.timestamp? The only test I can do is to send it to Elasticsearch and check the field type over there:
It seems like I am having a problem (detected as a string) specifically with this timestamp, "2017-05-29 10:27:41,756", and date { match => [ "[log][timestamp]", "yyyy-MM-dd HH:mm:ss,SSS" ] }
"2017-05-29 10:27:41Z" is detected fine as a date with date { match => [ "[log][timestamp]", "ISO8601" ] }
You didn't include the mapping of the timestamp field when you redacted the mappings.
The date filter worked just fine and was able to update the @timestamp field. If you expect it to update [log][timestamp] you need to configure it accordingly.
Why keep the fields under log? They appear to be duplicates of the corresponding root-level fields.
Sorry, I updated/edited the mapping on top.
I didn't write the [log][timestamp] mapping in advance; it was automatic.
I will remove this [log][timestamp]; you are right, there is no need for duplication.
Why do you need mytimestamp to be a date? Your date filter (as configured) stores the result in @timestamp, a field you can't delete. Why not just use @timestamp and delete mytimestamp after the date filter?
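The following pipeline is an added example, not config from the original thread, showing the two points made in the replies above: the date filter writes to @timestamp unless a target is set, and the parsed type can be made visible in rubydebug output without indexing into Elasticsearch. The grok field names (app, user, extra, client_ip, level, site, msg), the UTC timezone, and the use of stdin as input are assumptions for illustration; the sample line is the one quoted in the question.

```conf
input {
  stdin { }
}

filter {
  # Assumed field names; the pipe-separated layout is the line quoted in the question.
  # TIMESTAMP_ISO8601 accepts "41,756" (comma or dot before the fraction).
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:[log][timestamp]} \| %{DATA:app} \| %{DATA:user} \| %{DATA:extra} \| %{DATA:client_ip} \| %{LOGLEVEL:level} \| %{DATA:site} \| %{GREEDYDATA:msg}"
    }
  }

  date {
    match          => [ "[log][timestamp]", "yyyy-MM-dd HH:mm:ss,SSS" ]
    # Without target the parsed value lands in @timestamp and [log][timestamp]
    # stays the grok-captured string, which Elasticsearch then maps as text/keyword.
    target         => "[log][timestamp]"
    # The pattern carries no zone, so state the zone the source wrote in
    # (assumption: UTC); otherwise the Logstash host's local zone is used.
    timezone       => "UTC"
    tag_on_failure => [ "_dateparsefailure" ]
  }

  # Record the Ruby class of the field so the type shows up in the rubydebug output:
  # "LogStash::Timestamp" after a successful parse, "String" if the date filter did not run.
  ruby {
    code => 'event.set("[@metadata][ts_class]", event.get("[log][timestamp]").class.to_s)'
  }
}

output {
  # metadata => true prints the [@metadata] fields, which are otherwise hidden.
  stdout { codec => rubydebug { metadata => true } }
}
```

Run it with `echo '<the log line>' | bin/logstash -f pipeline.conf` and look for `[@metadata][ts_class]` and a `_dateparsefailure` tag in the printed event. Note that rubydebug renders a LogStash::Timestamp and a plain String identically (both in quotes), which is why the class check is needed. Also, an index that already mapped [log][timestamp] as a string keeps that mapping; the date type only takes effect in a new index or one with an explicit date mapping.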
Apache, Apache Lucene, Apache Hadoop, Hadoop, HDFS and the yellow elephant logo are trademarks of the Apache Software Foundation in the United States and/or other countries.