ES 6.4.2 - MapperParsingException when performing Bulk requests

Hi! I have been experiencing a MapperParsingException when performing Bulk requests against an Elasticsearch 6.4.2 instance. The issue happens in all environments (development, QA, production).

The weird thing is that it only happens when the Bulk request is performed through Java. I have tried manually performing the Bulk request using cURL with the exact same data, but the issue is not triggered.

This is the exact output I am getting:

[o.e.a.b.TransportShardBulkAction] [conferenceindex][1] failed to execute bulk item (index) BulkShardRequest [[conferenceindex][1]] containing [index {[conferenceindex][doc][uIgtIoMBqtH6okL1o17R], source[n/a, actual length: [2.7kb], max length: 2kb]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
	at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException(DocumentParser.java:171) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:72) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:263) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:725) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:702) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.index.shard.IndexShard.applyIndexOperationOnPrimary(IndexShard.java:682) ~[elasticsearch-6.4.2.jar:6.4.2]
	at org.elasticsearch.action.bulk.TransportShardBulkAction.lambda$executeIndexRequestOnPrimary$2(TransportShardBulkAction.java:560) ~[elasticsearch-6.4.2.jar:6.4.2]

It says there is a MapperParsingException, but it does not say what exactly is failing to parse. Plus, in some cases the amount of data is fewer than 30 records, which does not make sense.

Could you please help me with some thoughts on this?

To add more detail: the issue does not happen locally, only on the server. I also have a very small JSON file that I use to perform manual bulk requests:

{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"1"}}
{"id":"1","name":"Conference 1" }
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"2"}}
{"id":"2","name":"Conference 2" }
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"3"}}
{"id":"3","name":"Conference 3", "tenant": 1}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"4"}}
{"id":"4","name":"Conference 4"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"5"}}
{"id":"5","name":"Conference 5"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"6"}}
{"id":"6","name":"Conference 6"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"7"}}
{"id":"7","name":"Conference 7"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"8"}}
{"id":"8","name":"Conference 8"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"9"}}
{"id":"9","name":"Conference 9"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"10"}}
{"id":"10","name":"Conference 10"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"11"}}
{"id":"11","name":"Conference 11"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"12"}}
{"id":"12","name":"Conference 12"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"13"}}
{"id":"13","name":"Conference 13"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"14"}}
{"id":"14","name":"Conference 14"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"15"}}
{"id":"15","name":"Conference 15"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"16"}}
{"id":"16","name":"Conference 16"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"17"}}
{"id":"17","name":"Conference 17"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"18"}}
{"id":"18","name":"Conference 18"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"19"}}
{"id":"19","name":"Conference 19"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"20"}}
{"id":"20","name":"Conference 20"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"21"}}
{"id":"21","name":"Conference 21"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"22"}}
{"id":"22","name":"Conference 22"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"23"}}
{"id":"23","name":"Conference 23"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"24"}}
{"id":"24","name":"Conference 24"}
{"index":{"_index": "conferenceindex", "_type": "doc", "_id":"25"}}
{"id":"25","name":"Conference 25"}

The command I run to perform the manual bulk request is:

curl -H 'Content-Type: application/x-ndjson' -XPOST '10.5.2.155:9200/conferenceindex/_bulk?pretty' --data-binary @/Users/Documents/conferenceindex_test.json

The data contained in the JSON file is exactly the same as the data in the database that throws the exception when running the Bulk request through Java code.
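For context, the Java side builds the same kind of x-ndjson payload as the file above. This is just a minimal, hypothetical sketch (not my actual service code) of assembling that bulk body in plain Java; the key constraint of the bulk format is that every line, including the last, must end with a newline:

```java
// Hypothetical helper: builds an application/x-ndjson bulk body like the
// file above, with one action line and one source line per document.
public final class BulkBodyBuilder {

    /** Every line, including the last, must be terminated by '\n',
     *  or Elasticsearch rejects the bulk request. */
    static String build(String index, String type, String[][] idNamePairs) {
        StringBuilder body = new StringBuilder();
        for (String[] doc : idNamePairs) {
            // Action line: which index/type/id this document goes to.
            body.append("{\"index\":{\"_index\":\"").append(index)
                .append("\",\"_type\":\"").append(type)
                .append("\",\"_id\":\"").append(doc[0]).append("\"}}\n");
            // Source line: the document itself.
            body.append("{\"id\":\"").append(doc[0])
                .append("\",\"name\":\"").append(doc[1]).append("\"}\n");
        }
        return body.toString();
    }

    public static void main(String[] args) {
        String body = build("conferenceindex", "doc",
                new String[][] { {"1", "Conference 1"}, {"2", "Conference 2"} });
        System.out.print(body);
    }
}
```

A truncated or missing trailing newline is one of the classic differences between a hand-built payload and what `curl --data-binary` sends from a file, so it can be worth diffing the two bodies byte for byte.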

Welcome to our community! :smiley:

Elasticsearch 6.4 is very much past EOL and is no longer supported. You need to upgrade as a matter of urgency.

What does the response to your client that is making the index request report?

Hi! Thanks for the reply. Yes, we are making sure to upgrade the ES version in the short to mid term. In the meantime I have ruled out a few more things, and I noticed that only 30-40% of the data to index is coming through from the main microservice.

Basically, the 30-40% that is coming through is getting properly indexed (the MapperParsingException: failed to parse is still triggered, but the data is still indexed). I am troubleshooting right now to understand why 100% of the data sent to be indexed is not coming through.

As for the response from the client when indexing the data, I am not getting any issues or exceptions. The only exception I am getting is on the ES server side (MapperParsingException: failed to parse).
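One thing worth noting here (a sketch against the High Level REST Client API, not your exact code): `client.bulk(...)` only throws for transport-level problems. Per-document mapping failures come back inside the response, one status per item, so the client sees no exception unless the response is inspected explicitly. Assuming a `BulkResponse` named `response` from an earlier `client.bulk(bulkRequest, RequestOptions.DEFAULT)` call:

```java
import org.elasticsearch.action.bulk.BulkItemResponse;

// The bulk call itself does not throw for per-document mapping failures;
// each item in the response carries its own success/failure status.
if (response.hasFailures()) {
    for (BulkItemResponse item : response.getItems()) {
        if (item.isFailed()) {
            // getFailureMessage() typically includes the cause the server
            // logged, e.g. the MapperParsingException detail.
            System.err.println("id=" + item.getId()
                    + " failed: " + item.getFailureMessage());
        }
    }
}
```

Logging those per-item failure messages on the Java side should surface the same MapperParsingException you are seeing in the server log, plus the offending document id.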

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.