How to automate the "index,type,id" line before the actual JSON data in Elasticsearch

Hello,
I have a question about the "index,type,id" line in Elasticsearch. I converted my data to JSON and saw, as in the code below, an "index,type,id" line before each and every document. How can I actually generate it? I've checked some tutorials but could not understand them.

Ex:
{ "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "1" } }
{ "ndex": 1, "name": "Bulbasaur" }
{ "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "2" } }
{ "ndex": 2, "name": "Ivysaur" }
{ "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "3" } }
{ "ndex": 3, "name": "Venusaur"}

How can this format be automated, e.g. { "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "1" } }, where "_id" changes every time along with the data?
Thanks in advance.

It depends on what language you use. Most of our clients have helpers that can generate bulk requests for you.
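For example, with the Python client (elasticsearch-py), the bulk helper builds the metadata line for you from plain "action" dicts. A minimal sketch, assuming the "pokedex"/"pokemon" names from the question and reusing the `ndex` field as the document ID:

```python
# Sketch of the action format that elasticsearch-py's helpers.bulk expects
# (index/type names and the ID field are assumptions; adapt to your data).
def make_actions(docs, index="pokedex", doc_type="pokemon"):
    """Yield one bulk "action" dict per document; the bulk helper turns each
    into the `{ "index": {...} }` metadata line plus the document itself."""
    for doc in docs:
        yield {
            "_index": index,
            "_type": doc_type,   # _type only applies to older (pre-7.x) clusters
            "_id": doc["ndex"],  # reuse the ndex column as the document ID
            "_source": doc,
        }

docs = [{"ndex": 1, "name": "Bulbasaur"}, {"ndex": 2, "name": "Ivysaur"}]
actions = list(make_actions(docs))

# With the client installed, indexing is then just:
#   from elasticsearch import Elasticsearch
#   from elasticsearch.helpers import bulk
#   bulk(Elasticsearch(), make_actions(docs))
```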


Is it possible to share more info about it, Mr. warkolm?
I have the same query: I have JSON data and want to automatically add the index/type/id line before the data, just like in the previous query. Thanks in advance.

You can use Logstash to import this type of data. Where does the data come from? Is it available in a file?

Thanks for the reply, Mr. Christian_Dahlqvist. I have data in CSV format and need to convert it into JSON along with the automated "index,type,id" line. Is there a way to do this as I mentioned in the query above, please?

You can use Logstash for this. Feed the data into Logstash through either a file input plugin or even the stdin input plugin. Then use a csv filter to parse the CSV lines into events, which can then be sent to Elasticsearch using the Elasticsearch output plugin.
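A minimal pipeline sketch of those three steps, assuming a CSV file with `ndex` and `name` columns (the file path, host, and index/type names are placeholders):

```
input {
  file {
    path => "/path/to/pokedex.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ndex", "name"]
    convert => { "ndex" => "integer" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pokedex"
    document_type => "pokemon"
    document_id => "%{ndex}"
  }
}
```

The `document_id => "%{ndex}"` setting is what makes the `_id` change per row automatically, so you never write the metadata lines yourself.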

Thanks for the reply, Mr. Christian_Dahlqvist. But I cannot do any search or analytics on the data as it is, so I decided to convert it to JSON; the only remaining problem is the automated "index,type,id" part. I hope you understand the problem.

The format you showed is the bulk indexing format. Logstash will take care of that for you, as will the language clients. If you want to create a file with a bulk request and send this through curl, you will need to create it according to the described format, which includes specifying document IDs in the correct places unless you want IDs to be automatically generated for you by Elasticsearch.

Yes, you are right. Since I have a large amount of data to analyse, how can we convert it so that each row gets an automated { "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "1" } } line, with the next row being the actual data, like { "ndex": 1, "name": "Bulbasaur" }?

The output should be like this, right?

Ex:
{ "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "1" } }
{ "ndex": 1, "name": "Bulbasaur" }
{ "index" : { "_index" : "pokedex", "_type" : "pokemon", "_id" : "2" } }
{ "ndex": 2, "name": "Ivysaur" }

Use Logstash or write your own script to perform the transformation.

Thanks for your reply, Mr. Christian_Dahlqvist.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.