JSON Index Error: same field names, different datatypes

Hi all,
I'm indexing some application logs that contain JSON in the message body.
In the Logstash config file, I extract the JSON and use the json plugin to convert the string into an object. Everything works fine, but some fields can have the same name with different datatypes (some are strings, others are JSON objects). In that case the logs are not indexed and the error is "mapper_parsing_exception".
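The relevant part of the filter is roughly this (a simplified sketch; the field names are just placeholders for my real ones):

```
filter {
  json {
    # "message" holds the extracted JSON string; the parsed object
    # is written under a separate field to keep the original intact.
    source => "message"
    target => "payload"
  }
}
```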

Is there a way to ignore this error and index logs anyway?

Thanks.
KR.

Every indexed field must be mapped to a single data type in Elasticsearch, so the data you are describing cannot be indexed as-is. I suspect declaring the field as not indexed might allow the documents to be stored, but you would need to know which fields are affected upfront and you would not be able to search or aggregate on them.
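For an object field the closest setting is `enabled: false`, which makes Elasticsearch skip parsing the field entirely, so it accepts whatever shape arrives but is not searchable. An untested sketch of what that could look like in an index template (here `logs-*`, `log` and `payload` are only example names):

```
PUT _template/logs
{
  "template": "logs-*",
  "mappings": {
    "log": {
      "properties": {
        "payload": {
          "type": "object",
          "enabled": false
        }
      }
    }
  }
}
```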

Hi Christian, thank you for your help. As I thought, there is no way to index it, but can I catch the event after the failure, so that I do not lose the log? At the moment I index the Logstash logs, which also contain the lost events, but I hope there is a more elegant way to handle this.

Thank you.

At the moment there is, as far as I know, no way of catching events that fail like this, but the Logstash team is working on introducing a dead letter queue that would be able to capture them. If there are specific fields causing the problem you could check them as part of the pipeline, but I cannot think of a generic solution.
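As an illustration only, if you knew that `payload` was one of the conflicting fields (the name is just an example), a ruby filter could normalise it to a string before it reaches Elasticsearch, so every event sends the same datatype:

```
filter {
  ruby {
    init => "require 'json'"
    code => "
      value = event.get('payload')
      # If the field arrived as a parsed object, serialise it back to a
      # JSON string so the mapping stays consistent across events.
      event.set('payload', value.to_json) if value.is_a?(Hash)
    "
  }
}
```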

Ok, thank you so much for your time :slight_smile:
Hope to see the DLQ available soon.
