Large amount of data and pagination

Hello ES community,

I am trying to delete a large amount of index data, but the script I am using deletes only 10 documents from each index I've configured.

Each index contains 10,000+ documents.
Is there a way to purge all the data in one go?

I don't want to delete and recreate the indices, just purge all of their data.

By far the easiest way to delete all data is indeed to delete and recreate the indices. You can use the delete by query API instead, but that deletes individual documents and is much slower.
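If you go the delete by query route, something along these lines should work. This is only a minimal sketch using the plain HTTP API via `requests`; the URL and index names are placeholders for your own cluster, and security is not handled:

```python
import requests

ES_URL = "http://localhost:9200"            # assumption: local, unsecured node
indices = ["my-index-1", "my-index-2"]      # assumption: the indices to purge

for index in indices:
    # _delete_by_query with match_all removes every document but keeps the
    # index itself, its mappings, settings and aliases in place.
    resp = requests.post(
        f"{ES_URL}/{index}/_delete_by_query?conflicts=proceed&wait_for_completion=true",
        json={"query": {"match_all": {}}},
        timeout=3600,
    )
    resp.raise_for_status()
    print(index, resp.json().get("deleted"), "documents deleted")
```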

Hello @Christian_Dahlqvist,
thanks for your answer.

My indices are on a remote server and were created by the application with specific permissions, so I don't want to delete them: I'd risk ending up with missing indices somewhere, or with newly created ones that have the wrong permissions. By the way, I am using Ansible to manage my remote ES nodes.

I am thinking of paginating and looping over the shards, but I have run into some problems so far.

If you want to delete data, you have the two options I listed, unless you want to write your own application/script to do it.
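If you do end up going with delete-and-recreate and are worried about losing the application's mappings and settings, one option is to copy them before dropping the index. A rough sketch, with the URL and index name as placeholders (permissions/security are not handled here):

```python
import requests

ES_URL = "http://localhost:9200"   # placeholder
index = "my-index-1"               # placeholder

# Fetch the current mappings and settings before dropping the index.
current = requests.get(f"{ES_URL}/{index}").json()[index]
mappings = current["mappings"]
settings = current["settings"]["index"]

# Strip settings that Elasticsearch generates and will reject on creation.
for key in ("uuid", "creation_date", "provided_name", "version"):
    settings.pop(key, None)

# Delete the index, then recreate it with the preserved layout.
requests.delete(f"{ES_URL}/{index}").raise_for_status()
requests.put(
    f"{ES_URL}/{index}",
    json={"mappings": mappings, "settings": {"index": settings}},
).raise_for_status()
```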


@liban A regular paged search only returns a limited number of records at a time, but you can go beyond that with the scroll API. I am doing that for my 2,000,000 records. You can implement the same kind of functionality yourself and write a script of your own that deletes the records.
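For reference, a rough sketch of that approach with the official elasticsearch-py client, combining its scan (scroll) and bulk helpers. The URL and index name are placeholders, and you would adapt the query to match only the documents you want gone:

```python
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan, bulk

es = Elasticsearch("http://localhost:9200")   # placeholder URL
index = "my-index-1"                          # placeholder index name

def delete_actions():
    # scan() wraps the scroll API and yields every matching document.
    for hit in scan(es, index=index, query={"query": {"match_all": {}}}, size=1000):
        yield {"_op_type": "delete", "_index": index, "_id": hit["_id"]}

# bulk() sends the delete actions in batches instead of one request per document.
deleted, errors = bulk(es, delete_actions(), raise_on_error=False)
print(f"deleted {deleted} documents, {len(errors)} errors")
```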


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.