Add dictionary to index

Very simple and straightforward question:
Is there a way to push/insert a whole dictionary into my index pattern,
without iterating over the dictionary and pushing each object into my index pattern?

Thx a lot :thumbsup:

We accept JSON as the format for documents, which is obviously capable of representing arrays of complex objects.
However, by default the underlying index used for search is a flattened representation, and if you need to match more than one property of the same nested object in your searches you may need to mark those objects as "nested" in your index schema definition.
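As a rough illustration of marking a field as "nested" in the schema definition (a minimal sketch; the index, type, and field names `my_index`, `doc`, and `comments` are made up for illustration, not taken from this thread):

```python
import json

# A minimal sketch of a schema definition that marks a field of
# complex objects as "nested", so searches can match more than one
# property of the same inner object.
mapping = {
    "mappings": {
        "doc": {
            "properties": {
                "comments": {"type": "nested"}
            }
        }
    }
}

# With the Python client this would be applied at index-creation time:
#   es.indices.create(index="my_index", body=mapping)
print(json.dumps(mapping, indent=2))
```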

Tnx @Mark_Harwood
The thing is that I delete and rebuild my index pattern every day automatically, so I'm not sure adding a schema definition will help.
When I try to insert a dict into my index pattern, Elastic pops an error, "unsupported index mapping". Only iterating over the dict helps.

Any idea?

I think I need to see examples of the commands you're issuing and the errors you get back to understand more.

An "index pattern" is what Kibana calls the string used to configure access to one or more concrete indices, e.g. Kibana might query the index pattern logs* but under the covers Elasticsearch would access the indices logs20170101 and logs20170102. When you index content you refer to a single physical index, not a pattern; otherwise we wouldn't know which of the potentially multiple indices you want to store content in.

Tnx for the explanation :v:
I'm using a Python script as an intermediary to fetch data from my API into ES
The code is:

    if res.status_code == 200:
        lbapi_Response_Data = lbapi_results['Results']
        del item['_id']
        elasticsearch.index(index="rally_look_back", doc_type="Defect", body=lbapi_Response_Data)

lbapi_Response_Data is a dict, and usually I iterate over it object by object and push each one into my index.
When running the code above, I get the error:
raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
RequestError: TransportError(400, u'mapper_parsing_exception', u'failed to parse')

If you have Kibana installed you can use the "dev tools" to test raw Elasticsearch interactions, which can then be easily shared on this forum. It also makes debugging easier if we can remove the client layers (in your case the Python client) from the equation. There are two parts to this: what sort of thing you claim you will store, and what sort of thing you actually then try to store. The former is the mapping, which you can get directly from the REST API (using Kibana dev tools or CURL) as follows:

GET rally_look_back/_mapping

To get the JSON that is being sent to be indexed, I'd need to see the serialized form of lbapi_Response_Data, if you can print that out.
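A minimal sketch of inspecting the serialized form before sending it (the sample data here is a made-up stand-in, since the real content of lbapi_Response_Data isn't shown in the thread):

```python
import json

# Hypothetical stand-in for lbapi_Response_Data; the real content is
# whatever the API returned. Printing the type and the serialized
# form shows exactly what the client is about to send on the wire.
lbapi_Response_Data = {"Name": "Defect A", "Severity": "High"}

print(type(lbapi_Response_Data))        # dict, list, string...?
print(json.dumps(lbapi_Response_Data))  # the JSON body as sent
```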

I'm using Kibana, but why remove Python?
See mapping here

To get the JSON that is being sent to be indexed I'd need to see the serialized form of lbapi_Response_Data if you can print that out
What do you mean?
(How do you quote a previous comment in your reply?)

Python talks to the REST layer. Posting raw REST requests here means anyone can copy/paste the commands into Kibana or CURL to recreate your issue.

In your python code:

print lbapi_Response_Data

It sounds like a security issue. How is that relevant to this topic?

It's a bit of a problem due to internal data. Any chance I can describe the data structure instead?

You need to supply an example people can reproduce: one command to create an index mapping, another to insert some data into that index. Often going through the process of putting together a simplified example like this will reveal the issue.

Got it!
My Python script is the tool that communicates between the API and ES.
What's the alternative?

The Kibana dev tools are one way. CURL is another.

Regardless, let's try another tack - can you share the stack trace from the server when you get the error?

Sure @Mark_Harwood

That looks like the client stack trace.

My Python script fetches data from the API and indexes it into my ES.
Those are all the components I'm using.
This error occurs when trying:
es.index(index="rally_look_back", doc_type="Defect", body=lbapi_Response_Data)

The server stack trace can be found in the "logs" directory of the elasticsearch server.

Sorry, I can't attach the file due to the file suffix

[2017-07-12T09:43:01,665][INFO ][o.e.c.m.MetaDataDeleteIndexService] [o0WbhJX] [rally_look_back/OcyGznwZQzeqADI5-uf0UQ] deleting index
[2017-07-12T10:21:50,466][INFO ][o.e.c.m.MetaDataCreateIndexService] [o0WbhJX] [rally_look_back] creating index, cause [auto(bulk api)], templates [], shards [5]/[1], mappings []
[2017-07-12T10:21:50,547][DEBUG][o.e.a.b.TransportShardBulkAction] [o0WbhJX] [rally_look_back][1] failed to execute bulk item (index) BulkShardRequest [[rally_look_back][1]] containing [index {[rally_look_back][Defect][AV01rOjSwejzqYBhaG6D], source[n/a, actual length: [154.5kb], max length: 2kb]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.mapper.DocumentMapper.parse( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.shard.IndexShard.prepareIndex( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.shard.IndexShard.prepareIndexOnPrimary( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.prepareIndexOperationOnPrimary( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.executeIndexRequestOnPrimary( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.executeBulkItemRequest( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary( [elasticsearch-5.4.2.jar:5.4.2]
at$PrimaryShardReference.perform( [elasticsearch-5.4.2.jar:5.4.2]
at$PrimaryShardReference.perform( [elasticsearch-5.4.2.jar:5.4.2]
at [elasticsearch-5.4.2.jar:5.4.2]
at$AsyncPrimaryAction.onResponse( [elasticsearch-5.4.2.jar:5.4.2]
at$AsyncPrimaryAction.onResponse( [elasticsearch-5.4.2.jar:5.4.2]
at$1.onResponse( [elasticsearch-5.4.2.jar:5.4.2]
at$1.onResponse( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.shard.IndexShardOperationsLock.acquire( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.shard.IndexShard.acquirePrimaryOperationLock( [elasticsearch-5.4.2.jar:5.4.2]
at [elasticsearch-5.4.2.jar:5.4.2]
at$400( [elasticsearch-5.4.2.jar:5.4.2]
at$AsyncPrimaryAction.doRun( [elasticsearch-5.4.2.jar:5.4.2]
at [elasticsearch-5.4.2.jar:5.4.2]
at$PrimaryOperationTransportHandler.messageReceived( [elasticsearch-5.4.2.jar:5.4.2]
at$PrimaryOperationTransportHandler.messageReceived( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.transport.TransportService$7.doRun( [elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun( [elasticsearch-5.4.2.jar:5.4.2]
at [elasticsearch-5.4.2.jar:5.4.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$ Source) [?:1.8.0_131]
at Source) [?:1.8.0_131]
Caused by: org.elasticsearch.common.compress.NotXContentException: Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes
at org.elasticsearch.common.compress.CompressorFactory.compressor( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.common.xcontent.XContentHelper.createParser( ~[elasticsearch-5.4.2.jar:5.4.2]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument( ~[elasticsearch-5.4.2.jar:5.4.2]
... 30 more
[2017-07-12T12:13:53,570][INFO ][o.e.c.m.MetaDataDeleteIndexService] [o0WbhJX] [rally_look_back/dwtzT4qvTZCppcv3p6i2DA] deleting index

Looks like invalid JSON is being sent. I need to see the contents of lbapi_Response_Data.

If the JSON is invalid, how come I can add objects to my index when I iterate over lbapi_Response_Data?

Because it's an array, not a dict?
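For the record, a hedged sketch of indexing a list of documents without a manual per-document loop, using the Python client's bulk helper (the `_index` and `_type` names follow the thread; the documents are made up, and the actual `bulk` call is left as a comment since it needs a live cluster):

```python
# A sketch of indexing a list of documents in one request with the
# bulk helper, rather than one index() call per item.
docs = [
    {"Name": "Defect A", "Severity": "High"},
    {"Name": "Defect B", "Severity": "Low"},
]

# Each bulk "action" wraps one source document with its index metadata.
actions = [
    {"_index": "rally_look_back", "_type": "Defect", "_source": doc}
    for doc in docs
]

# Against a live cluster this would be sent as a single request:
#   from elasticsearch import Elasticsearch
#   from elasticsearch.helpers import bulk
#   bulk(Elasticsearch(), actions)
print(len(actions))  # one action per source document
```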