Wiping data from ELK

I'm in rabid experimentation mode here. I'm set up like this:

Filebeat -> Elasticsearch | Logstash | nginx | Kibana.

I'd like to know how to wipe the data from an ELK stack (6.4.0) I'm running on Ubuntu Server 18.04 without tossing everything and starting over from scratch. Which files are involved, etc.? In particular, when I relaunch Kibana, I don't want to see any vestiges of what was in there previously.

Thanks for any comments, advice, etc.
Russ

Just delete all the filebeat/logstash indices and you should be good.

I'm new at this. What are these? Where do I delete them? Are they in files, or is this something I do from Kibana?

First, make sure you're not sending any new data into the cluster. Then you could do it either by issuing a call to the API:

curl -XDELETE localhost:9200/_all

Or by stopping the elasticsearch service, going into your data directory, and deleting everything there.

If you're running X-Pack Security, this will render your cluster inaccessible, since you'd be deleting the security index along with everything else.
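For reference, the filesystem route looks roughly like this on a stock deb install, assuming you haven't changed path.data from its default of /var/lib/elasticsearch:

sudo systemctl stop elasticsearch
sudo rm -rf /var/lib/elasticsearch/nodes     # wipes all index data and cluster state
sudo systemctl start elasticsearch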

Great, this sounds easily done: shut down Filebeat, then use the curl command. I'll take this opportunity to poke around Elasticsearch anyway, finding its data directory, etc., since part of this trip is to get my arms all the way around all of this. (Oh, and I'm not using Security or any other X-Pack features yet.)
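In concrete terms, on my setup (deb packages, so the systemd unit is just called filebeat) the plan boils down to roughly:

sudo systemctl stop filebeat              # stop shipping new data in
curl -XDELETE 'localhost:9200/_all'       # wipe all the indices
sudo systemctl start filebeat             # start again when I'm ready for fresh data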

Thanks for your help!

Ah, hence "indices": Elasticsearch's data lives under /var/lib/elasticsearch/nodes, and there's a subdirectory named indices. Now I understand the first answer.
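In case it helps anyone else poking around: the subdirectory names under indices are index UUIDs rather than index names, so on my single-node setup (hence nodes/0) I can match them up against the uuid column of the cat API:

sudo ls /var/lib/elasticsearch/nodes/0/indices
curl 'localhost:9200/_cat/indices?v'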

Don't delete the data from the filesystem; use the APIs as suggested.
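For example, something along these lines clears just the Beats/Logstash data while leaving everything else alone, assuming the default index names (filebeat-* / logstash-*) and Elasticsearch listening on localhost:9200:

curl 'localhost:9200/_cat/indices?v'         # check what's there first
curl -XDELETE 'localhost:9200/filebeat-*'    # remove the Filebeat indices
curl -XDELETE 'localhost:9200/logstash-*'    # remove the Logstash indices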
