Running the ELK stack on Windows

Hello. I am currently running the ELK stack through the batch files on a Windows machine. I noticed that when I close the batch files and then reopen the services, the data remains in Elasticsearch. I figured out that to delete this data I needed to go to the directory where Elasticsearch keeps its data and delete the nodes directory; that removes the data from Elasticsearch.


What I can't seem to figure out is how to get Logstash to re-read files that were already there from a previous run. When restarted, Logstash will only parse new files that are placed in the directory I read the files from.

For example, say I had a file called alarm.prn in the directory Logstash reads data from. I stop Logstash and delete the Elasticsearch node information, so all of the parsed information is gone. When I restart Elasticsearch there is no data present, and when I restart Logstash it doesn't read data from alarm.prn. If I place a new file, let's call it syslog.prn, Logstash will read only that file. Can I have it read both files? I'm sure it stores a file somewhere recording which files it has already read, but I can't seem to figure out where.
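(Answering my own question partly, for anyone searching later: the Logstash file input tracks how far it has read each file in a "sincedb" file. A minimal sketch of a file input that re-reads existing files from the beginning, assuming a hypothetical watch path of C:/data/*.prn:)

```conf
input {
  file {
    path => "C:/data/*.prn"        # hypothetical directory Logstash watches
    start_position => "beginning"  # read pre-existing files from the start, not only new lines
    sincedb_path => "NUL"          # Windows equivalent of /dev/null: don't persist read offsets between runs
  }
}
```

With sincedb_path pointed at NUL, Logstash forgets which files it has read every time it restarts, so alarm.prn would be parsed again after wiping Elasticsearch.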


[quote="CDR, post:1, topic:91202"]
I figured out that to delete this data I would need to go to the directory where Elasticsearch held its data and delete the nodes directory
[/quote]

Always use the APIs to remove data; don't do it on the filesystem.

Look at File input plugin | Logstash Reference [7.15] | Elastic

Is there a way to use the APIs without downloading any new software? On a Windows machine I am unsure how to issue the delete commands. I know that on a Linux machine I could use curl, but Windows doesn't support that without downloading other software, at least to the best of my knowledge.
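(Edit: I found that Windows PowerShell, which is installed by default, can make HTTP calls without any extra software via Invoke-RestMethod. A sketch, assuming Elasticsearch is listening on localhost:9200 and using a hypothetical index name:)

```powershell
# List existing indices first to see what is there
Invoke-RestMethod -Uri "http://localhost:9200/_cat/indices?v"

# Delete a single index (equivalent to: curl -XDELETE localhost:9200/logstash-2017.10.30)
Invoke-RestMethod -Method Delete -Uri "http://localhost:9200/logstash-2017.10.30"
```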

If you have Kibana, then use Console, under Dev Tools.
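In Console, the request to delete an index looks like the following (the index name here is a hypothetical example):

```
# Delete one index
DELETE /logstash-2017.10.30

# Delete every index matching a pattern
DELETE /logstash-*
```

Note that wildcard deletes can be disabled by the action.destructive_requires_name cluster setting, in which case you must name each index explicitly.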

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.