How to automate data ingestion with Logstash pipelines?

Hi there,

I'm using the ELK stack to ingest, store, and visualize data, nothing fancy.

Everything works fine, but each time I have new data to ingest I have to manually run the command /opt/logstash/bin/logstash -f mypipeline.conf

I was wondering how to automate this last step, so that the data is ingested into Elasticsearch each time new data arrives in the input folder defined in my pipeline config file.

Thanks!

What type of input plugin are you using with Logstash?

If I'm reading this correctly, you are probably using the file input plugin:

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html

It should be able to handle new incoming data automatically, so there is no need to restart Logstash; just leave it running and it will check for new data on its own, IIRC.
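For example, a minimal always-running pipeline watching a folder for new CSV files could look like this (the path here is a placeholder; adjust it to your setup):

input {
  file {
    # placeholder glob; the file input re-evaluates it periodically,
    # so files dropped into the folder later are picked up automatically
    path => "/path/to/input/*.csv"
  }
}

Start Logstash once with this pipeline and leave the process running; there is no need to re-run the command for each new file.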

As an aside: you might have better luck getting an answer if you ask in the Logstash forum.

Hello Isabel,

Thanks for the reply! I have moved the post to the Logstash forum.

I'm indeed using the file input plugin:

file {
  path => "/path/to/myfiles*.csv"
  start_position => "beginning"
  sincedb_path => "/dev/null"
}

I guess I'm missing an important option that would let Logstash check whether new files are present.

Is it "discover_interval" or "stat_interval"? Or the sincedb_path?

Logstash should automatically discover new files within a few seconds (discover_interval).
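If you want to tune how quickly changes are noticed, the file input exposes two related options (defaults given from memory, so double-check the plugin docs): stat_interval, how often watched files are checked for new content, and discover_interval, how many stat_interval cycles pass between re-evaluations of the path glob. A sketch based on your config:

file {
  path => "/path/to/myfiles*.csv"
  start_position => "beginning"
  stat_interval => 1        # check known files for new data every second (assumed default)
  discover_interval => 15   # re-expand the glob every 15 stat_interval cycles (assumed default)
  # note: sincedb_path => "/dev/null" disables persistence of read positions,
  # so on a Logstash restart every matching file is re-read from the beginning
  sincedb_path => "/dev/null"
}

If you do not want files re-ingested after a restart, point sincedb_path at a real writable file instead of /dev/null.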