Hi,
For some reason Logstash can't parse a field called "timestamp" in my JSON events in some environments (in other environments it works fine).
This is the error I see in logstash logs:
The first thing I thought of was different mappings, but the mappings are the same in all environments.
Any idea what can cause this sporadic behavior? And what is the meaning of this error?
I see. Checking again, I find that I really don't have any mapping for a field called "timestamp". In another environment where I see no exceptions, I do see this in my mapping:
"timestamp":{"type":"string","norms":{"enabled":false}
Could this be the issue? My index templates in both environments are the same, so how could this happen?
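For reference, this is roughly how I've been comparing the mappings in the two environments (the index name is just an example, not my real one):

# example only: substitute your actual index name
curl -XGET 'http://localhost:9200/logstash-2015.06.01/_mapping?pretty'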
Unless you list timestamp in your index template (and you haven't disabled dynamic mapping), ES will automatically try to map the field. It's weird that it complains about this field if it doesn't have a mapping.
Do you really want to keep the timestamp field? It looks like a temporary field where you've stored the timestamp from the log so that you can parse it with the date filter. If so, I'd just delete the field. If you want to keep it, how about adding an explicit mapping in your index template? You can make it a date field and configure it to support your date format.
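As a rough sketch, something along these lines in the template (the template name, index pattern, and date format here are just examples; use whatever matches your actual events):

# example only: adjust the template name, index pattern, and date format to your data
curl -XPUT 'http://localhost:9200/_template/timestamp-as-date' -d '
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "dd/MMM/yyyy:HH:mm:ss Z"
        }
      }
    }
  }
}'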
I think I'll add a line to match "timestamp" as well, and remove the field afterwards.
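Something like this is what I have in mind (the date pattern here is just an example; I'll adjust it to my actual log format):

filter {
  date {
    # "timestamp" holds the raw time string from the log; the pattern is an example
    match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
    # drop the temporary field once @timestamp has been set
    remove_field => ["timestamp"]
  }
}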
It's just weird to me that in one environment I don't get these exceptions and in another one I do.