Hello. I am currently running the ELK stack via batch files on a Windows machine. I noticed that when I close the batch files and later restart the services, the data remains in Elasticsearch. I figured out that to delete this data I need to go to the directory where Elasticsearch keeps its data and delete the nodes directory. This removes the data from Elasticsearch.
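For reference, this is roughly what I do between runs. It is a sketch assuming a POSIX shell; the path is an assumption, and on Windows cmd the equivalent would be `rmdir /s /q <es-home>\data\nodes`:

```shell
# Clear Elasticsearch's on-disk index data by removing the "nodes"
# directory. Elasticsearch must be stopped first, or it will keep
# file handles open and may recreate state.
# ES_DATA is an assumed path -- substitute your actual data directory.
ES_DATA="${ES_DATA:-/tmp/elasticsearch/data}"
rm -rf "$ES_DATA/nodes"
```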
What I can't seem to figure out is how to get logstash to re-read files that were already there from the previous run. When restarted, logstash only parses new files placed in the directory it reads from.
For example, suppose I have a file called alarm.prn in the directory logstash reads from. I stop logstash and delete the Elasticsearch node data, so all of the parsed information is gone. When I restart Elasticsearch there is no data present, and when I restart logstash it doesn't read data from alarm.prn. If I place a new file there, let's call it syslog.prn, logstash reads only that file. Can I have it read both files? I'm sure logstash stores a file somewhere recording which files it has already read, but I can't figure out where.
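For context, the behavior I'm describing matches the file input plugin's read-position bookkeeping (its "sincedb" file). Below is a sketch of the kind of input block I'd expect to need, assuming the stock file input plugin; the paths are assumptions, not my actual config:

```conf
input {
  file {
    path => "C:/logstash/input/*.prn"
    # Read each file from the beginning instead of tailing new lines.
    start_position => "beginning"
    # Pin the read-position bookkeeping file to a known location,
    # so it can be found and deleted between runs.
    sincedb_path => "C:/logstash/sincedb"
  }
}
```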