_bulk occasionally failing, not giving me enough info

I am importing data from a CSV -> Logstash -> Elasticsearch, and about 1% of my documents end up vanishing.
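For context, the pipeline is roughly the following (a minimal sketch, not my actual config; the file path and column names are assumptions, the index name is from the error below):

```
input {
  file {
    path => "/path/to/data.csv"        # assumed path
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    # assumed column layout; the real file has more fields
    columns => ["id", "lat", "lon"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "property"                # index name from the error below
    document_id => "%{id}"
    action => "update"                 # the failing bulk items are updates
    doc_as_upsert => true
  }
}
```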

Nothing relevant shows up in the Logstash logs.

However, Elasticsearch shows this error for each missing document.

[2017-08-09T13:31:17,720][DEBUG][o.e.a.b.TransportShardBulkAction] [z61oeaJ] [property][1] failed to execute bulk item (update) BulkShardRequest [[property][1]] containing [27] requests
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
    at org.elasticsearch...
    ...
    ...
Caused by: java.lang.IllegalArgumentException: Malformed content, found extra data after parsing: FIELD_NAME

I later added some code to remove fields with null values (sketched below), and the last line of the error changed to:

Caused by: java.lang.IllegalArgumentException: Malformed content, found extra data after parsing: END_OBJECT
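The null-stripping was along these lines (a sketch of the approach, assuming a Logstash ruby filter; the exact code may have differed):

```
filter {
  ruby {
    # Remove any top-level field whose value is nil before the event is indexed
    code => "event.to_hash.each { |k, v| event.remove(k) if v.nil? }"
  }
}
```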

I tried upping my logging levels... but I can't seem to get any more information than this. Is there any way to know which field is the problem?
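For anyone trying the same, logging can be raised at runtime via the standard cluster settings API (the logger name here matches the class in the stack trace above; adjust as needed):

```
PUT _cluster/settings
{
  "transient": {
    "logger.org.elasticsearch.action.bulk": "TRACE"
  }
}
```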

Have you looked at the Logstash logs?

Yes, they didn't contain any useful information.

After digging through the data, I found that the dropped documents all had geo_point fields with null lat/lon values. Every other parsing error I encountered was far more descriptive.
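For anyone who hits the same thing, here is a minimal way to reproduce it (the field name `location` and the type `listing` are assumptions; the index name comes from the error above):

```
PUT property
{
  "mappings": {
    "listing": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}

POST _bulk
{ "update": { "_index": "property", "_type": "listing", "_id": "1" } }
{ "doc": { "location": { "lat": null, "lon": null } }, "doc_as_upsert": true }
```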

Please do create an issue on GitHub so we can make the error more human-friendly :slight_smile:
