Reading data from a "changing" log file

Hello out there!

I am pretty new on the elastic and hope for your help :slightly_smiling_face:

My plan is to use Python to read sensor data and log it on another machine. I want to use Logstash to parse my file and send it to Elasticsearch, and Kibana to view my data and show results. The whole thing should work in "real time", so it should update every time new data arrives.
My question is: how can I read a changing file (e.g. new sensor data appended to the log file every 100 ms) with Logstash? I heard that for this type of problem you use Beats. Why would I need that, and what are the pros and cons? What is the basic idea?
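For context, here is a minimal sketch of the logging side I have in mind (file name, field names, and the random reading are just placeholders): each sensor reading is appended as one JSON line and flushed immediately, which is the shape of file a tailing reader can follow.

```python
import json
import random
import time

def log_reading(path, reading):
    # Append one JSON object per line ("NDJSON") so a tailing reader
    # (e.g. Filebeat or a Logstash file input) picks it up line by line.
    # Opening in append mode and closing right away flushes each event.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(reading) + "\n")

# Illustrative reading; a real script would poll the sensor every 100 ms.
log_reading("sensor.log", {"ts": time.time(), "value": random.random()})
```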

Thank you for your help!

Both Logstash and Beats typically tail log files, so new data would need to be appended to the file in order for it to be captured correctly. If you are updating rows in an existing file, I do not think Filebeat or Logstash are a natural fit. Maybe it would be easier to stream the data/changes to Logstash via e.g. TCP?
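To illustrate the TCP idea, here is a rough sketch of the sending side, assuming a Logstash `tcp` input with a JSON codec is listening; host, port, and the event fields are placeholders, not a definitive setup:

```python
import json
import socket

def sensor_line(reading):
    # Serialize one reading as a newline-delimited JSON event,
    # the format a Logstash tcp input with the json_lines codec expects.
    return json.dumps(reading) + "\n"

def stream_readings(host, port, readings):
    # Open a TCP connection to Logstash and push each event as it arrives.
    # A real sender would add reconnect/retry handling around this.
    with socket.create_connection((host, port)) as sock:
        for reading in readings:
            sock.sendall(sensor_line(reading).encode("utf-8"))

# Example event; stream_readings("logstash.example", 5000, [event]) would
# send it, assuming Logstash listens on that (hypothetical) host and port.
event = {"sensor": "temp-1", "value": 21.5}
```

This sidesteps file tailing entirely: Logstash receives each reading the moment it is sent, instead of polling a file for appended lines.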

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.