Version Conflict while using delete_by_query

I'm using Logstash to insert a huge amount of data into my Elasticsearch cluster, but sometimes the grok plugin fails and indexes a message with the tag _grokparsefailure.

Now I'm trying to remove all documents containing this tag with the request below, but it reports a version conflict.

POST logstash-163/mail163/_delete_by_query?timeout=5m
{
  "query": {
    "match": {
      "tags": "_grokparsefailure"
    }
  }
}

And the result is:

{
  "took": 676,
  "timed_out": false,
  "total": 285008161,
  "deleted": 0,
  "batches": 1,
  "version_conflicts": 1000,
  "noops": 0,
  "retries": {
    "bulk": 0,
    "search": 0
  },
  "throttled_millis": 0,
  "requests_per_second": -1,
  "throttled_until_millis": 0,
  "failures": [
    {
      "index": "logstash-163",
      "type": "mail163",
      "id": "AV89E_COisCbJs1cSr60",
      "cause": {
        "type": "version_conflict_engine_exception",
        "reason": "[mail163][AV89E_COisCbJs1cSr60]: version conflict, current version [2] is different than the one provided [1]",
        "index_uuid": "GBUx80OtTrWFSlYlZiTiCA",
        "shard": "2",
        "index": "logstash-163"
      },
      "status": 409
    },
    {
      "index": "logstash-163",
      "type": "mail163",
      "id": "AV89E_COisCbJs1cSsBF",
      "cause": {
        "type": "version_conflict_engine_exception",
        "reason": "[mail163][AV89E_COisCbJs1cSsBF]: version conflict, current version [2] is different than the one provided [1]",
        "index_uuid": "GBUx80OtTrWFSlYlZiTiCA",
        "shard": "2",
        "index": "logstash-163"
      },
      "status": 409
    },
    {
      "index": "logstash-163",
      "type": "mail163",
      "id": "AV89E_COisCbJs1cSsAk",
      "cause": {
        "type": "version_conflict_engine_exception",
        "reason": "[mail163][AV89E_COisCbJs1cSsAk]: version conflict, current version [2] is different than the one provided [1]",
        "index_uuid": "GBUx80OtTrWFSlYlZiTiCA",
        "shard": "2",
        "index": "logstash-163"
      },
      "status": 409
    },
    ... and more failures like the above

Is there a solution for this? Thanks. :sob:

This means that those documents were written to while the delete-by-query operation ran, i.e. some external process updated those documents concurrently.
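For context: _delete_by_query takes a snapshot of the index via a search and then issues a versioned delete for each hit; if a concurrent write has bumped a document's _version since the snapshot, that delete fails with a 409. The equivalent single-document operation looks roughly like this (id and version taken from the first failure above):

DELETE logstash-163/mail163/AV89E_COisCbJs1cSr60?version=1

Because the current version is now 2, Elasticsearch rejects the delete with a version_conflict_engine_exception rather than silently removing the newer document.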

You could just run the same command again and make sure those get deleted.

Deleting 285 million documents is quite a long-running operation, so it is likely that there was another indexing operation in between.

--Alex


Thanks for your reply, but the same problem occurred again after I restarted everything and posted the request again.

I'm quite sure that NOTHING is trying to update or insert data into my Elasticsearch cluster.

Which ES version is this?

The newest version. 5.6.3 maybe.

And there is another problem in Logstash: the newest version has a bug where it cannot insert data into Elasticsearch properly. Downgrading to 5.6.2 solved it.

Also, if my system hangs while running Logstash, after a forced reboot you have to remove Logstash completely and install it again, or you will never be able to use it.

Oh, the problem in this thread was solved by adding the conflicts=proceed parameter to the request.

It takes a while to delete all the data.
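With conflicts=proceed, version conflicts are counted in the response but no longer abort the operation, so the remaining matching documents are still deleted; any documents skipped due to a conflict can be picked up by a follow-up run. The request from the start of the thread then becomes:

POST logstash-163/mail163/_delete_by_query?timeout=5m&conflicts=proceed
{
  "query": {
    "match": {
      "tags": "_grokparsefailure"
    }
  }
}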


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.