Issue Upgrading from 2.3.1 to 6.1.0


My application uses the Elasticsearch Java API to communicate with the cluster. Previously, when creating an index and assigning a mapping, it was done as follows:


Assign mapping:

```java
PutMappingRequest pmr = Requests.putMappingRequest(indexName)
    .type(userDocumentTypeName)
    .source(userDocumentTypeMapping);
client.admin().indices().putMapping(pmr).actionGet();
```

However, when updating to v6.1.0, this code does not work. It errors out with the message "Caused by: java.lang.IllegalArgumentException: mapping source must be pairs of fieldnames and properties definition."

My variable userDocumentTypeMapping is a string representation of a JSON object. It seems that this is not the format accepted any longer when creating a mapping. I am wondering what the proper format is and how I should go about converting my String to that format.
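For reference, the 6.x `source(Object...)` overload appears to treat its arguments as alternating field names and property definitions, which is why passing a single raw JSON string trips that error. The mapping string itself should still be a JSON object along these lines (a hypothetical sketch; the field names are made up, not taken from the original post):

```json
{
  "properties": {
    "username": { "type": "keyword" },
    "email": { "type": "text" }
  }
}
```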

I think you can use `setSource(userDocumentTypeMapping, XContentType.JSON)` or something like this.


This did the trick (`.source` rather than `.setSource`):

```java
PutMappingRequest pmr = Requests.putMappingRequest(indexName)
    .type(userDocumentTypeName)
    .source(userDocumentTypeMapping, XContentType.JSON);
```

That got me past the error I mentioned. Now I've got a different error regarding the mapping itself, but at least it's not the same one. I may be back if I get stuck again, haha. Thank you for the help, David!

@dadoonet, if you could please be of any help... I've run into another issue.

I'm trying to add a tokenizer and three analyzers to my index at creation time. Here's the code:
```java
.field("type", "nGram")
.field("min_gram", 1)
.field("max_gram", 80)

.field("type", "custom")
.field("tokenizer", "keyword")
.field("filter", new String[]{"lowercase"})

.field("type", "custom")
.field("tokenizer", "uax_url_email")
.field("filter", new String[]{"standard", "lowercase", "stop"})

.field("type", "custom")
.field("tokenizer", "username_tokenizer")
.field("filter", new String[]{"standard", "lowercase"})

.endObject().string(), XContentType.JSON))
```

I'm getting the following error:

```
Caused by: com.fasterxml.jackson.core.JsonParseException: Duplicate field 'analyzer'
at [Source: {"analysis":{"tokenizer":{"username_tokenizer":{"type":"nGram","min_gram":1,"max_gram":80}},"analyzer":{"string_lowercase":{"type":"custom","tokenizer":"keyword","filter":["lowercase"]}},"analyzer":{"email":{"type":"custom","tokenizer":"uax_url_email","filter":["standard","lowercase","stop"]}},"analyzer":{"username":{"type":"custom","tokenizer":"username_tokenizer","filter":["standard","lowercase"]}}}}; line: 1, column: 198]
at com.fasterxml.jackson.core.json.JsonReadContext._checkDup( ~[jackson-core-2.6.2.jar:2.6.2]
at com.fasterxml.jackson.core.json.JsonReadContext.setCurrentName( ~[jackson-core-2.6.2.jar:2.6.2]
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken( ~[jackson-core-2.6.2.jar:2.6.2]
at org.elasticsearch.common.xcontent.json.JsonXContentParser.nextToken( ~[elasticsearch-6.1.0.jar:6.1.0]
at org.elasticsearch.common.settings.Settings.fromXContent( ~[elasticsearch-6.1.0.jar:6.1.0]
at org.elasticsearch.common.settings.Settings.fromXContent( ~[elasticsearch-6.1.0.jar:6.1.0]
at org.elasticsearch.common.settings.Settings.fromXContent( ~[elasticsearch-6.1.0.jar:6.1.0]
at org.elasticsearch.common.settings.Settings.access$500( ~[elasticsearch-6.1.0.jar:6.1.0]
at org.elasticsearch.common.settings.Settings$Builder.loadFromSource( ~[elasticsearch-6.1.0.jar:6.1.0]
```

I suspect it may have something to do with creating multiple analyzers in the same JSON object. Is there a way to do this, or should I send each analyzer individually after creating the index? If so, what is the syntax to do this?
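For what it's worth, the JSON in the exception shows three separate `analyzer` keys, one per `startObject("analyzer")` call, and a JSON object cannot repeat a key. If the analyzer object is opened once and all three analyzers are nested inside it, the settings would presumably come out like this (a sketch reconstructed from the error output above, not tested against a live cluster):

```json
{
  "analysis": {
    "tokenizer": {
      "username_tokenizer": { "type": "nGram", "min_gram": 1, "max_gram": 80 }
    },
    "analyzer": {
      "string_lowercase": { "type": "custom", "tokenizer": "keyword", "filter": ["lowercase"] },
      "email": { "type": "custom", "tokenizer": "uax_url_email", "filter": ["standard", "lowercase", "stop"] },
      "username": { "type": "custom", "tokenizer": "username_tokenizer", "filter": ["standard", "lowercase"] }
    }
  }
}
```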

Please open a new thread.

And format your code using the </> icon, as explained in this guide, and not the citation button. It will make your post more readable.

Or use markdown style, like wrapping your code in triple backticks (```).
Apologies, I'm new to the forums here. I will open a new topic on this.



I have the same task at hand. Can you tell me how you are planning on migrating data from ES 2.3 to ES 6?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.