Newbie to the ELK Stack here, and I've just deployed it to my environment after testing it out in a lab for a few weeks. It's going great and I have all my servers reporting in after pushing filebeat with Ansible, but I'm finding some logs with lots of data I don't want in ELK.
I'm now filtering this out at the source in the filebeat.yml files, but I also want to get rid of the data that's already in Elasticsearch; for example, I found some of my hosts shipping over 1GB of /var/log/messages per day, which I don't want.
So far all I've found is curl 'http://10.10.10.1:9200/_cat/indices?v' to list the indices and then curl -XDELETE 'http://10.10.10.1:9200/filebeat-2017.08.03' to remove them. This works fine, as the data at this moment isn't important so I can just delete en masse, but my questions are:
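For anyone curious, here's roughly how I'm scripting the bulk deletes. This is just a sketch of the list-then-DELETE approach above: it assumes the default filebeat-YYYY.MM.DD daily index naming and my Elasticsearch node at 10.10.10.1; the actual curl is commented out so the loop is a dry run by default.

```shell
# Dry-run sketch: delete daily filebeat indices older than KEEP_DAYS days.
# Assumes default filebeat-YYYY.MM.DD index names and ES at 10.10.10.1.
ES_HOST="http://10.10.10.1:9200"
KEEP_DAYS=7

for i in $(seq "$KEEP_DAYS" 30); do
  # 'date -d' is GNU date; on BSD/macOS use: date -v-"${i}"d +%Y.%m.%d
  day=$(date -d "-${i} days" +%Y.%m.%d)
  index="filebeat-${day}"
  echo "would delete ${index}"
  # curl -s -XDELETE "${ES_HOST}/${index}"   # uncomment to actually delete
done
```
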
- Does this delete everything, or is there other stuff I need to clear out? I have a feeling I'm just removing the index and not the raw data?
- Is there a way to do this but just target one specific beats.hostname?
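On that last point, I'm guessing something like the _delete_by_query API (which I believe exists since Elasticsearch 5.0) is what I'd need. A rough sketch of what I have in mind, assuming the field filebeat writes is actually beat.hostname and using a made-up hostname; the curl is commented out since I haven't verified this against my cluster:

```shell
# Sketch: remove only one host's documents from all filebeat indices.
# "beat.hostname" and "web01.example.com" are assumptions, not verified.
ES_HOST="http://10.10.10.1:9200"
QUERY='{"query":{"term":{"beat.hostname":"web01.example.com"}}}'

echo "$QUERY"
# curl -s -XPOST "${ES_HOST}/filebeat-*/_delete_by_query" \
#      -H 'Content-Type: application/json' -d "$QUERY"   # uncomment to run
```
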