Hi,
I'm new to ELK, so it may be a layer 8 problem, but I'm not able to fix it, so I hope somebody here can help me.
Currently the date filter in my Logstash config is not doing what I expect. I import CSV files with some date fields in them. These have no timezone, so I added the date filter like this:
date {
  locale => "de"
  match => ["Start", "dd.MM.yyyy HH:mm:ss"]
  timezone => "Europe/Berlin"
}
This is how the field looks in the CSV file:
"02.05.2017 07:46:49"
After running Logstash, this is what I get in Elasticsearch:
"Start": [
1493711209000
],
which in "human language" is "Tue, 02 May 2017 07:46:49 GMT". But after running through my date filter, it should be "Tue, 02 May 2017 07:46:49 GMT+2:00" (i.e. the stored epoch value should be 1493704009000, two hours earlier in UTC), or am I wrong?
That's probably because the Start field has been mapped a particular way, and suddenly the data you're sending doesn't match. Can you just delete the index and start over?
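For example, assuming your index is named logstash-2017.05.02 (a placeholder; use whatever index your output plugin actually writes to):

curl -XDELETE 'localhost:9200/logstash-2017.05.02'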
Index patterns in Kibana? No, you can leave them. When you get this error, what do the mappings of the index look like? Specifically the Start field. Use ES's get mapping API.
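Something like this, again with a placeholder index name:

curl 'localhost:9200/logstash-2017.05.02/_mapping?pretty'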
Okay, so something resulted in that mapping of the Start field. Logstash doesn't do this out of the box, and if your date filter works, that's not what the Start field of your events will look like. Check again that you really have deleted the index so you're starting fresh, and that the date filter always works.
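To verify the index really is gone, you can list all indices (assuming a default localhost setup):

curl 'localhost:9200/_cat/indices?v'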
What if you create the index by hand? What do the mappings look like for the newly created empty index? There shouldn't be any Start mapping. What if you insert a document with a Start field by hand?
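A minimal sketch, using a hypothetical index name testindex:

curl -XPUT 'localhost:9200/testindex'
curl 'localhost:9200/testindex/_mapping?pretty'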
No, don't insert "02.05.2017 07:46:49" in the Start field. That'll cause the automapper to pick the wrong format. Delete the index again. Recreate it. Check the mappings. Insert a document with an ISO8601 date (like "2017-05-01T06:34:52.000Z") in the Start field. Does that work?
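Sticking with the hypothetical testindex and a made-up type name of logs:

curl -XPOST 'localhost:9200/testindex/logs' -H 'Content-Type: application/json' -d '{"Start": "2017-05-01T06:34:52.000Z"}'
curl 'localhost:9200/testindex/_mapping?pretty'

The second command should now show Start mapped as a date field.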
Okay, good. Then the problem must be that the first document you insert with Elasticsearch still has the non-ISO8601 date format, forcing ES's automapper to map the field differently. When a subsequent document with an ISO8601 date arrives it doesn't match the mapping of the field.
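You can reproduce that, too. Delete the hypothetical testindex once more and insert the non-ISO8601 string first:

curl -XPOST 'localhost:9200/testindex/logs' -H 'Content-Type: application/json' -d '{"Start": "02.05.2017 07:46:49"}'
curl 'localhost:9200/testindex/_mapping?pretty'

Since "02.05.2017 07:46:49" doesn't match ES's default date detection formats, Start will most likely come back mapped as a string/text field instead of a date.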
All date fields in the documents I have in the CSV files have the same format ("02.05.2017 07:46:49").
Could the problem be caused by the CSV header line?
Edit: Seems not. I deleted the header line and got the same result. Here's an example line from the CSV file I'm using; the second field is the Start field:
"12345";"30.04.2017 17:28:09";"01.05.2017 03:25:36";"111.111.111.110";"de";"NSPlayer/12.0.7601.17514";"35847"
But if I understand correctly, the problem should occur even if I don't use the date filter, right? That's not the case.