I am messing around with my CSV, which has a date field formatted as "DD.MM.YYYY".
But whatever I do, the Logstash filter ignores it --> the field is still mapped as "String".
Even the date_formatted plugin does not work as I expected.
What I want is this: instead of only @timestamp being a Date, I want the CSV date field mapped as "Date" --> but no luck, and I have no ideas left!
Why does the ELK Stack have so many problems managing the most obvious information like date and time?
Why is the date {} filter in Logstash so hard to get right?
This should be a simple task, but it isn't!
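For context, a filter block that does what is described above would look roughly like the sketch below. The separator, the column names and the "Date" field name are only assumptions about the CSV, and note that the date filter uses Joda-style letters, so "DD.MM.YYYY" has to be written as "dd.MM.yyyy":

    filter {
      csv {
        separator => ";"                    # assumption: semicolon-separated file
        columns   => ["Date", "Value"]      # assumption: column names in the CSV
      }
      date {
        match  => ["Date", "dd.MM.yyyy"]    # parse the CSV date with a Joda pattern
        target => "Date"                    # write the parsed value back into the field
      }
    }

Without target, the date filter only writes the parsed value into @timestamp and the original "Date" field stays a plain string, which is exactly the symptom described above.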
Note that once a mapping is established in Elasticsearch, such as Date being a string, it cannot be changed without creating a new index. So when you are debugging things like this, you need to keep deleting the index as you try each iteration in Logstash.
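For example, assuming the index is called csv-import (the name is just a placeholder), the standard Elasticsearch endpoints for this, in Kibana Dev Tools / console syntax, are:

    DELETE csv-import
    GET csv-import/_mapping

Deleting the index and re-sending the data lets Elasticsearch build the mapping again, and checking _mapping after each run shows whether "Date" came through as date or as text/keyword. If you prefer not to rely on dynamic mapping at all, a legacy index template can pin the type up front. The index pattern and field name below are placeholders, and this sketch assumes Elasticsearch 7.x (typeless mappings, pre-7.8 _template API):

    PUT _template/csv-import
    {
      "index_patterns": ["csv-import-*"],
      "mappings": {
        "properties": {
          "Date": { "type": "date" }
        }
      }
    }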
That's what I did before trying your suggestion!
I cleaned out everything in ES and Filebeat.
The Date field is still mapped as a String.
Believe me, I'm getting frustrated and I don't know why the hell this is not working.
I have tried everything I found here on this Discuss forum, even the date_formatted plugin, which gives access to change the pattern of a field.
Nothing works.
My wish: an optional "Auto-Type" that maps fields as what they appear to be, no matter what format they come in:
"Date is Date", "Time is Time", "Number is Number", and so on.
Finally made it!
I had some syntax errors in my added fields.
Your advice worked well.
However, I hope there will be an auto-type for fields in future updates.
That would make things easier when working with a bunch of different files.
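In case anyone else hits the same kind of syntax error: add_field takes a hash, so the braces and the => between key and value are both required. A minimal example (the field name and value here are made up):

    mutate {
      add_field => { "import_source" => "csv" }
    }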