Varying schema problem

I have tons of complicated XML files that I want to transform to JSON and then push to Elasticsearch. When pushing to Elasticsearch, it throws exceptions such as this NumberFormatException:

Caused by: java.lang.NumberFormatException: For input string: "664;3392"
...
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [tag1.subtag1]

However, the XML files vary drastically. The first JSON document (converted from XML) that gets indexed is very different from the documents that follow, so the mapping Elasticsearch infers from it does not fit the rest. Is there any way to update the mapping dynamically?
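For context, a minimal sketch of the conversion step (the XML structure here is an assumption based on the error message): turning an XML document into a JSON-ready dict while keeping every leaf value as a string, so that dynamic mapping never guesses a numeric type from the first document.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    """Recursively turn an Element into a dict; leaf text stays a string.

    Note: a repeated child tag would overwrite the previous one here; a
    full converter would collect repeats into a list.
    """
    children = list(elem)
    if not children:
        return (elem.text or "").strip()
    return {child.tag: xml_to_dict(child) for child in children}

# Example input mirroring the failing field from the stack trace.
doc = ET.fromstring("<tag1><subtag1>664;3392</subtag1></tag1>")
print(json.dumps({"tag1": xml_to_dict(doc)}))
# → {"tag1": {"subtag1": "664;3392"}}
```

Because the value stays a string, it only indexes cleanly if the field is mapped as text rather than a numeric type.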

I appreciate any suggestions.

You need to force the mapping before the first document gets indexed; once Elasticsearch has guessed a numeric type for a field, a later value like "664;3392" will fail to parse.

Use the text type for text fields.
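One way to do that, sketched below (the index name and field path are assumptions; `tag1.subtag1` is taken from the error message): define an explicit mapping for the known field, plus a dynamic template that maps every newly discovered string field as text instead of letting Elasticsearch guess a numeric type.

```python
import json

# Hypothetical index creation body: explicit mapping for tag1.subtag1,
# and a dynamic template so any future string field is mapped as text.
index_body = {
    "mappings": {
        "dynamic_templates": [
            {
                "strings_as_text": {
                    "match_mapping_type": "string",
                    "mapping": {"type": "text"},
                }
            }
        ],
        "properties": {
            "tag1": {
                "properties": {
                    "subtag1": {"type": "text"}
                }
            }
        },
    }
}

# Send this as the body of `PUT /my-index` (hypothetical index name)
# before indexing the first document.
print(json.dumps(index_body, indent=2))
```

With this in place, the first indexed document can no longer lock the field into a numeric type.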

Forcing the mapping to a string type solved my problem. Thanks for your help.