Create Mapping Before Uploading JSON Data

I have the following Python code that uploads JSON files to Elasticsearch. It loops over every file in a directory and uploads it, but unfortunately it uses the default type, keyword, for all columns.

    # Connect to Elasticsearch
    es = elasticsearch.Elasticsearch([{'host': 'localhost', 'port': 9200}])
    in_file = r'{}'.format(str(jsonPath))

    for filename in os.listdir(in_file):
        if filename.endswith(".json"):
            # Open the JSON file
            jsonFile = codecs.open(in_file + "/" + filename, "r", "utf-8")
            json_file = jsonFile.read()

            # Upload the JSON file to Elasticsearch [body is the content, index is the name given in the UI]
            res = es.bulk(body=json_file, index=jsonPathName, doc_type="doc", request_timeout=30)
            print(json.dumps(res, indent=3, separators=(',', ': ')))

So I added the following mapping code before the upload call so that the specified types would be reflected in the mapping:

            "properties": {
                "location": {"type": "keyword"},
                "name_1": {"type": "keyword"},
                "name_2": {"type": "keyword"},
                "door_activity": {"type": "keyword"},
                "date": {"type": "date", "format": "yyyy-MM-dd HH:mm:ss:SSS"}

Unfortunately, the code above resulted in the following error message:
*elasticsearch.exceptions.RequestError: RequestError(400, 'illegal_argument_exception', 'Rejecting mapping update to [door_activity_test_new_02] as the final mapping would have more than 1 type: [_doc, doc]')*

Can anyone please advise me on how to rectify this issue?

The answer is in the error message: you already created an index with a type called "_doc", but you set the mapping for `doc_type="doc"`, so the index would end up with two types.

Try replacing `doc` with `_doc`.
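A sketch of what the corrected flow might look like, assuming the Elasticsearch 6.x Python client and the field names from the question: create the index with the mapping nested under `_doc`, then use that same type name in the bulk call so the index only ever sees one type.

```python
DOC_TYPE = "_doc"  # must match the type the index was created with

# Mapping body with the types nested under the single "_doc" type
# (field names taken from the question).
mapping = {
    "mappings": {
        DOC_TYPE: {
            "properties": {
                "location": {"type": "keyword"},
                "name_1": {"type": "keyword"},
                "name_2": {"type": "keyword"},
                "door_activity": {"type": "keyword"},
                "date": {"type": "date", "format": "yyyy-MM-dd HH:mm:ss:SSS"},
            }
        }
    }
}

def upload(es, index_name, json_file):
    # Create the index with the mapping first (ignore 400 if it already
    # exists), then bulk-upload using the same type name.
    es.indices.create(index=index_name, body=mapping, ignore=400)
    return es.bulk(body=json_file, index=index_name, doc_type=DOC_TYPE,
                   request_timeout=30)
```

The key point is that the type name appears in two places, the mapping body and the `doc_type` argument, and the two must agree.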


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.