Removing data from Elasticsearch

Newbie to the ELK Stack here, and I have just deployed it to my environment after testing it out in a lab for a few weeks. It's going great and I have all my servers reporting in after pushing Filebeat out with Ansible, but I'm finding some logs with lots of data I don't want in ELK.

I'm now filtering this out at the source in the filebeat.yml files, but I also want to get rid of the data that's already in Elasticsearch. For example, I found some of my hosts shipping over 1GB of /var/log/messages per day, which I don't want.

So far, all I've found is to list the indices with

    curl 'http://10.10.10.1:9200/_cat/indices?v'

and then delete them one by one with

    curl -XDELETE 'http://10.10.10.1:9200/filebeat-2017.08.03'

This works fine, as the data at the moment isn't important so I can just flush it en masse (a rough sketch of what I mean is below the questions), but my questions are:

  1. Does this flush everything, or is there other stuff I need to clear out? I have a feeling I'm only removing the index and not the raw data.
  2. Is there a way to do this but target just one specific beats.hostname?
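(The en-masse flush I mean, as a sketch only, assuming the same host as above and that every daily index for the month can go:)

    # Delete each daily Filebeat index for August 2017 in turn.
    for day in $(seq -w 1 31); do
      curl -XDELETE "http://10.10.10.1:9200/filebeat-2017.08.$day"
    done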

Deleting an index deletes the underlying data as well; the shards and their files are removed from disk.
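If you want to confirm that for yourself, one way (a sketch, reusing the host from your post) is to compare the on-disk size reported by the cat indices API before and after the delete:

    # Shows each Filebeat index with its document count and size on disk.
    curl 'http://10.10.10.1:9200/_cat/indices/filebeat-*?v&h=index,docs.count,store.size'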

For your second question, you can use delete-by-query.
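For example, something like the following. This is a sketch that assumes the host from your post and that the hostname field in your indices is beat.hostname; the exact field name depends on your Filebeat version, so check one of your documents first. The hostname value here is made up:

    # Delete all documents from one host across the Filebeat indices.
    curl -XPOST 'http://10.10.10.1:9200/filebeat-*/_delete_by_query' \
      -H 'Content-Type: application/json' -d'
    {
      "query": {
        "match": {
          "beat.hostname": "noisy-server-01"
        }
      }
    }'

Note that delete-by-query marks documents as deleted rather than removing them from disk immediately; the space is reclaimed as segments merge, so it frees disk more gradually than dropping a whole index.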

Thanks, that's good to know it deletes the data too.
