Elasticsearch loses part of the data when importing JSON with python3 helpers.bulk and an explicit _id

Elasticsearch version: 2.3.3
This is my JSON file (one document per line):
{"_source": {"bd_id": 12345, "date_type": "BEFORE_SEVEN_DAY"}, "_score": 7.5887613, "_id": "153380527BEFORE_SEVEN_DAY", "_index": "ex_data_1", "_type": "ex_shop"}
{"_source": {"bd_id": 1888, "date_type": "BEFORE_SEVEN_DAY"}, "_score": 7.5887613, "_id": "151008189BEFORE_SEVEN_DAY", "_index": "ex_data_1", "_type": "ex_shop"}
....
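Every line carries its own _id, and indexing the same _id twice simply overwrites the first document, so duplicated ids in the file would show up as "lost" documents. Here is a quick sketch to check how many distinct _id values the file really contains (the file name below is just a placeholder for the file my code opens):

import json

# sketch: compare the number of lines in the file with the number of
# distinct _id values; the file name is a placeholder for self.re_index + '_data.json'
ids = set()
lines = 0
with open('ex_data_1_data.json', 'r', encoding='utf-8') as f:
    for line in f:
        ids.add(json.loads(line)['_id'])
        lines += 1
print('lines:', lines, 'unique _id values:', len(ids))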

And here is my code:

# imports used by this snippet
import json
import time
from collections import deque

import elasticsearch
import elasticsearch.helpers

# 'es' is the Elasticsearch client and 'self.re_index' is the index name,
# both defined elsewhere in my class

# open the json file and build one bulk action per line
print('import data begin...')
ip_begin = time.time()
with open(self.re_index + '_data' + '.json', 'r', encoding='utf-8') as e:
    actions = deque()
    for i in e:
        doc = json.loads(i)  # parse each line only once
        action = {
            '_op_type': 'index',
            '_index': doc['_index'],
            '_type': doc['_type'],
            '_id': doc['_id'],        # _id is taken from the file
            '_source': doc['_source']
        }
        actions.append(action)
    print(len(actions))

# send the actions with helpers.parallel_bulk and report any failed docs
for success, info in elasticsearch.helpers.parallel_bulk(es, actions, thread_count=50):
    if not success:
        print('Doc failed', info)
print('import data end...\n\t total consuming time:' + str(time.time() - ip_begin) + 's')
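For comparison, here is a sketch using the plain helpers.bulk call with stats_only=True, which returns the number of successfully indexed actions and the number of errors, so I can compare it against the 60000 actions generated above:

# sketch: same actions, indexed with helpers.bulk and stats_only=True,
# which returns (number of successful actions, number of errors)
ok_count, err_count = elasticsearch.helpers.bulk(es, actions, stats_only=True)
print('indexed:', ok_count, 'errors:', err_count)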

Here are the results of the run:
import data begin...
60000
import data end...
total consuming time:19.1480000019073486s
[screenshot attachment: ES__bulk]
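To compare against the 60000 actions, I can also ask Elasticsearch how many documents actually ended up in the index (a sketch; the index and type names are taken from the file sample above):

# sketch: count the documents that made it into the target index/type
resp = es.count(index='ex_data_1', doc_type='ex_shop')
print('documents in ex_data_1/ex_shop:', resp['count'])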

What is wrong with my code or with helpers.bulk?
I need help!
@dadoonet

Is anyone able to take a look?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.