I have a CSV file I am ingesting. The 'timestamp' field shows the time like this: 2024-03-14 09:30:58.000.
In Logstash my date filter is shown below. When the ingest runs, the Elasticsearch index maps the timestamp field as text and not as a proper date field. Can someone tell me what is wrong here?
date {
  match => ["Timestamp", "YYYY-MM-dd HH:mm:ss"]
}
I have tried yyyy in both upper and lower case; neither works.
This is related to the mapping of the field. You need to adjust the mapping of the Timestamp field in your index template so that it is a date type.
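For example, something along these lines with a composable index template (the template name and index pattern here are placeholders for your own):

PUT _index_template/csv-ingest-template
{
  "index_patterns": ["csv-ingest-*"],
  "template": {
    "mappings": {
      "properties": {
        "Timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss.SSS"
        }
      }
    }
  }
}

Note that a template only applies to indices created after it exists, so you would need to create a new index (or reindex) for it to take effect.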
This filter parses your Timestamp field using the specified format and stores the result in the @timestamp field, which is the default target. The issue here is that the format is wrong: it does not match your date string.

Your date string is 2024-03-14 09:30:58.000, but your pattern will not match it because it is missing the milliseconds. It should be yyyy-MM-dd HH:mm:ss.SSS (lowercase yyyy is the calendar year; uppercase YYYY is the week-based year and can give wrong results around year boundaries).
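So the date filter would look something like this, keeping @timestamp as the default target:

date {
  match => ["Timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
}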
Are you suggesting this config? I had this mutate convert filter earlier, but Logstash errored out. It said the timestamp convert needed to be string, boolean, or keyword.
mutate {
  convert => { "timestamp" => "date" }
}
date {
  match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
}
No, you don't need that mutate filter; "date" is not a valid convert type, which is why Logstash errored out. The mapping is done on the Elasticsearch side, not in Logstash. If you didn't specify a mapping for your index when creating it, or via an index template, Elasticsearch will try to infer the type of the field, but sometimes it does not infer the correct type, which seems to be your case.

You need to change the type in Elasticsearch by recreating the index with the correct mapping, or by using an index template and then creating a new index.
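If you don't want a template, a minimal sketch of recreating the index with an explicit mapping (the index name is a placeholder, and deleting drops the existing data, so reindex first if you need to keep it):

DELETE csv-ingest-index

PUT csv-ingest-index
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss.SSS"
      }
    }
  }
}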