I have the following Python code that uploads JSON files to Elasticsearch. It loops over every file in a directory and uploads each one, but unfortunately the default type, keyword, is used for all columns.
import codecs
import json
import os

import elasticsearch

# Get the host address to connect to Elasticsearch
es = elasticsearch.Elasticsearch([{'host': 'localhost', 'port': 9200}])
in_file = r'{}'.format(str(jsonPath))
for filename in os.listdir(in_file):
    if filename.endswith(".json"):
        # Open JSON file
        jsonFile = codecs.open(in_file + "/" + filename, "r", "utf-8")
        json_file = jsonFile.read()
        jsonFile.close()
        # Upload JSON file to Elasticsearch [body is the content, index is the name given in the UI]
        res = es.bulk(body=json_file, index=jsonPathName, doc_type="doc", request_timeout=30)
        print(json.dumps(res, indent=3, separators=(',', ': ')))
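For reference, each file passed to es.bulk() has to already be in the bulk NDJSON format: an action line followed by a source line for every document, with a trailing newline. A minimal sketch of building such a payload (the field names mirror my columns; the sample values are made up):

```python
import json

# Made-up sample documents using the same field names as my mapping
docs = [
    {"location": "HQ", "name_1": "A", "name_2": "B",
     "door_activity": "open", "date": "2019-01-01 08:30:00:000"},
    {"location": "HQ", "name_1": "C", "name_2": "D",
     "door_activity": "close", "date": "2019-01-01 08:31:00:000"},
]

lines = []
for doc in docs:
    lines.append(json.dumps({"index": {}}))  # action line (index into the default index/type)
    lines.append(json.dumps(doc))            # source line
bulk_body = "\n".join(lines) + "\n"          # the bulk API requires a trailing newline

print(bulk_body)
```

This string is the shape of what json_file contains for each file in the loop above.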
So I added the following code before the upload call so that the specified types are reflected in the mapping:
es.indices.put_mapping(
    index=jsonPathName,
    doc_type="doc",
    body={
        "properties": {
            "location": {"type": "keyword"},
            "name_1": {"type": "keyword"},
            "name_2": {"type": "keyword"},
            "door_activity": {"type": "keyword"},
            "date": {"type": "date", "format": "yyyy-MM-dd HH:mm:ss:SSS"}
        }
    }
)
Unfortunately, the code above resulted in the following error message:
*elasticsearch.exceptions.RequestError: RequestError(400, 'illegal_argument_exception', 'Rejecting mapping update to [door_activity_test_new_02] as the final mapping would have more than 1 type: [_doc, doc]')*
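My reading of the error is that the bulk upload already auto-created the index with the type name _doc, so the explicit doc in put_mapping would introduce a second type, which Elasticsearch 6.x forbids. A minimal sketch of what I think might be needed (assuming Elasticsearch 6.x and the jsonPathName variable from above): define the mapping at index-creation time, before any upload, using a single matching type name.

```python
# Sketch only: the mapping body keyed under a single type name ("_doc",
# matching what the cluster auto-created for the bulk upload).
index_body = {
    "mappings": {
        "_doc": {
            "properties": {
                "location": {"type": "keyword"},
                "name_1": {"type": "keyword"},
                "name_2": {"type": "keyword"},
                "door_activity": {"type": "keyword"},
                "date": {"type": "date", "format": "yyyy-MM-dd HH:mm:ss:SSS"},
            }
        }
    }
}

# With a live cluster this would then be (not executed here):
# es.indices.create(index=jsonPathName, body=index_body)
# res = es.bulk(body=json_file, index=jsonPathName, doc_type="_doc", request_timeout=30)

print(sorted(index_body["mappings"]["_doc"]["properties"]))
```

I am not certain this is the intended approach, which is why I am asking.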
Therefore, can anyone please advise me on how to rectify this issue?