Same query, different results

Hi,
I just ran into a problem while executing a very basic query. I am trying to search for documents based on user_id.
The total result count keeps on fluctuating between 0 and 200.
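The query itself is just a term query on user_id, something along these lines (the user id value here is a placeholder):

# search for all documents belonging to one user
curl -XGET 'localhost:9200/prod_glass_v3_sep2016/_search?pretty' -d '
{
  "query": { "term": { "user_id": "12345" } }
}'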

This is the output that I get:

Thanks


These things are expected if you have multiple replicas, since they do not refresh at the same time. Are you indexing concurrently with your search requests?
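If you want stable counts across requests in the meantime, you can pin searches to the same shard copies by passing an arbitrary but constant preference string, e.g. (index name and query are placeholders for your own values):

# identical preference values route to the same shard copies
curl -XGET 'localhost:9200/your_index/_search?preference=my_session_id&pretty' -d '
{
  "query": { "term": { "user_id": "12345" } }
}'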

No,
this occurred when I tried to delete the documents for a specific user. 200 documents were not deleted, and any subsequent query on the index produced this problem.

Thanks, this makes sense now. I believe you are using Elasticsearch 1.7? Deleting by query can indeed leave indices in an inconsistent state, which is why it has been moved to a plugin that works in a completely different way, and I would generally advise against using delete-by-query to avoid these issues. As a workaround, though, you could try deleting all documents from this user again so that the shards that still hold the data remove it as well, and hopefully your index will become consistent again.
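That retry would look something along these lines (the index name and user id are placeholders for your own values):

# re-issue the delete for that user's documents
curl -XDELETE 'localhost:9200/your_index/_query' -d '
{
  "query": { "term": { "user_id": "12345" } }
}'

# refresh so subsequent searches see the deletions
curl -XPOST 'localhost:9200/your_index/_refresh'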

I am already using the delete-by-query plugin, and the version is 2.4.

Hmm, this is more worrying. The delete-by-query plugin should not cause such issues. Was this index created in 2.x, or is it an index that you imported from a 1.x install?

The index was created in 2.4

Can you check whether the issue is caused by replicas getting out of sync? I.e., does the problem persist if you set a preference at search time? Can you also provide the output of the cat/shards API?
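For example (the index name is a placeholder), comparing hits.total between these two requests, and then looking at the docs column per shard, should tell us:

# count via primary shards only
curl -XGET 'localhost:9200/your_index/_search?preference=_primary&size=0&pretty' -d '
{ "query": { "match_all": {} } }'

# count via replica shards only
curl -XGET 'localhost:9200/your_index/_search?preference=_replica&size=0&pretty' -d '
{ "query": { "match_all": {} } }'

# per-shard document counts, with column headers
curl -XGET 'localhost:9200/_cat/shards/your_index?v'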

@jpountz
I set the search preference to _primary and the count was 60207; with _replica the count was 60107.
The replica and the primary are not in sync.

The output of the cat/shards API is:

index                 shard prirep state   docs    store   ip            node
prod_glass_v3_sep2016 6 p STARTED 1378541 186mb 10.141.34.236 NODE-01
prod_glass_v3_sep2016 6 r STARTED 1378541 160.4mb 10.37.215.218 NODE-02
prod_glass_v3_sep2016 3 p STARTED 1379140 195.3mb 10.141.34.236 NODE-01
prod_glass_v3_sep2016 3 r STARTED 1379140 170.9mb 10.16.233.92 NODE-03
prod_glass_v3_sep2016 2 r STARTED 1370682 160.9mb 10.141.34.236 NODE-01
prod_glass_v3_sep2016 2 p STARTED 1370682 182.8mb 10.16.233.92 NODE-03
prod_glass_v3_sep2016 7 p STARTED 1379207 202.1mb 10.37.215.218 NODE-02
prod_glass_v3_sep2016 7 r STARTED 1379207 177.8mb 10.16.233.92 NODE-03
prod_glass_v3_sep2016 4 p STARTED 1389651 169.7mb 10.37.215.218 NODE-02
prod_glass_v3_sep2016 4 r STARTED 1389651 161.6mb 10.16.233.92 NODE-03
prod_glass_v3_sep2016 5 r STARTED 1377090 214.3mb 10.37.215.218 NODE-02
prod_glass_v3_sep2016 5 p STARTED 1377090 189.9mb 10.16.233.92 NODE-03
prod_glass_v3_sep2016 1 r STARTED 1378293 204.2mb 10.141.34.236 NODE-01
prod_glass_v3_sep2016 1 p STARTED 1379434 170.2mb 10.37.215.218 NODE-02
prod_glass_v3_sep2016 0 p STARTED 1381910 161.2mb 10.141.34.236 NODE-01
prod_glass_v3_sep2016 0 r STARTED 1381910 174.2mb 10.16.233.92 NODE-03