How to work with realtime data

I want to send real-time logs from a proxy to Elasticsearch (through Logstash) and display information about those logs in real time.

I have already managed to import a CSV file into Elasticsearch (through Logstash) and created a Kibana dashboard for it.
But that is a one-time import of a file, and I don't understand how this works for real-time logs.
Should Logstash reload every second and re-run the .conf files?

I hope someone can point me in the right direction.

You need a mechanism that continuously sends data to Logstash. You could either adapt your existing mechanism to send data all the time, or use something like Filebeat to keep shipping the contents of a file to Logstash.

See https://www.elastic.co/guide/en/beats/filebeat/6.6/filebeat-getting-started.html
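For reference, a minimal Filebeat configuration along those lines might look like the sketch below. The log path and the Logstash host/port are assumptions, not values from this thread — adjust them to your setup:

```yaml
# filebeat.yml — minimal sketch (paths and ports are assumptions)
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/proxy/*.csv     # assumed location of the incoming CSV files
output.logstash:
  hosts: ["localhost:5044"]      # assumed Logstash host and beats port
```

Filebeat keeps track of how far it has read each file, so it only ships new lines as they appear.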

Thanks for the reply.

The mechanism right now is: a CSV file is sent to Logstash. In Logstash I created a .conf file that loads the CSV file, parses it and sends it to Elasticsearch. After creating the file, I run my .conf file manually. Then I built a dashboard on top of it.

When my server constantly sends new CSV files to Logstash, I end up with multiple CSV files in one folder. To get a real-time dashboard, I need to refresh the data import.
I saw an option in a config file somewhere to reload files or modules. Would that be a solution for handling real-time data: constantly reloading my .conf file so it imports the new CSVs into Elasticsearch?

Hope to hear from you soon.
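(Editor's note: reloading the .conf file should not be necessary for this. Logstash's file input keeps watching the paths it is given, and a glob pattern is re-evaluated as the pipeline runs, so newly created files matching it are picked up automatically. A sketch of such a pipeline — the directory, column names, and index name below are assumptions for illustration:)

```conf
# Logstash pipeline sketch — assumes CSV files land in /data/csv/
input {
  file {
    path => "/data/csv/*.csv"              # glob is re-evaluated; new files are picked up
    start_position => "beginning"          # read newly discovered files from the start
    sincedb_path => "/var/lib/logstash/sincedb_csv"  # remembers how far each file was read
  }
}
filter {
  csv {
    columns => ["timestamp", "client", "url", "status"]  # assumed column names
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "proxy-logs-%{+YYYY.MM.dd}"   # assumed index naming scheme
  }
}
```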

Maybe there is no need to send whole CSV files; you could instead send only the new CSV lines over the network to a Logstash tcp/http input, for example.
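As an illustration of that suggestion, here is a small Python sketch that serializes one log record as a CSV line and ships it to a Logstash tcp input. The host, port, and field names are assumptions; the matching Logstash side would be something like `input { tcp { port => 5000 } }` feeding the same csv filter you already use in your .conf:

```python
import csv
import io
import socket

def to_csv_line(fields):
    """Serialize one log record (a list of field values) as a single CSV line."""
    buf = io.StringIO()
    csv.writer(buf).writerow(fields)
    return buf.getvalue()  # csv.writer terminates the line with \r\n

def send_line(line, host="localhost", port=5000):
    """Ship one CSV line to a Logstash tcp input (host/port are assumptions)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode("utf-8"))

# Build a line from a hypothetical proxy log record:
line = to_csv_line(["2019-02-21T10:00:00", "10.0.0.1", "GET /index.html", "200"])
# send_line(line)  # uncomment once the Logstash tcp input is listening
```

This way each new log event goes over the wire as it happens, instead of accumulating CSV files in a folder.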

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.