I want to send real-time logs from a proxy to Elasticsearch (through Logstash) and display information about those logs in real time.
I have already managed to import a CSV file into Elasticsearch (through Logstash) and have created a Kibana dashboard for it.
But that's a one-time import of a file, and I don't understand how it works for real-time logs.
Should Logstash reload every second to execute the .conf files?
You need a mechanism that continuously sends data to Logstash. You could either adapt your existing mechanism to send data all the time, or use something like Filebeat to keep shipping the contents of a file to Logstash.
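For example, a minimal filebeat.yml sketch (the log path and port are assumptions, adjust them to your setup; Logstash then needs a matching beats input listening on the same port):

```yaml
filebeat.inputs:
  - type: log                    # tail plain-text files line by line
    paths:
      - /var/log/proxy/*.csv     # hypothetical folder where the proxy writes its CSVs

output.logstash:
  hosts: ["localhost:5044"]      # Logstash must run a beats input on this port
```

Filebeat keeps track of how far it has read each file, so it only ships new lines as they appear.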
The current mechanism is: a CSV file is sent to Logstash. In Logstash I created a .conf file that loads the CSV file, parses it, and sends it to Elasticsearch. After creating the file, I manually run my .conf file. Then I made a dashboard from it.
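For reference, a pipeline along those lines might look like this (the file path and column names are placeholders for whatever your CSV actually contains):

```
input {
  file {
    path => "/data/proxy/access.csv"  # hypothetical location of the imported CSV
    start_position => "beginning"     # read the whole file on the first run
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "client_ip", "url", "status"]  # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "proxy-logs"
  }
}
```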
When my server constantly sends new CSV files to Logstash, I will get multiple CSV files in one folder. To get a real-time dashboard from them, I need the data import to refresh.
I saw an option somewhere in a config file to reload files or modules. Would that be a solution for handling real-time data: constantly reloading my .conf file so it imports new CSVs into Elasticsearch?
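For what it's worth, the reload option you probably saw (config.reload.automatic in logstash.yml) only re-reads the pipeline definition when the .conf file itself changes; it does not re-run data imports. What keeps picking up new data is the input plugin itself: Logstash's file input accepts a glob pattern and watches the folder for new files on its own. A minimal sketch (the path is a placeholder):

```
input {
  file {
    path => "/data/proxy/*.csv"    # glob: new CSV files appearing here are picked up automatically
    mode => "tail"                 # the default: keep watching for new files and new lines
    start_position => "beginning"  # also read files that already exist at startup
  }
}
```

The input records how far it has read in a sincedb file, so already-imported CSVs are not re-imported after a restart.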