Hello,
I am working on a setup in which Logstash monitors a specific local directory for JSON files to parse and forward to Elasticsearch. A new, uniquely named JSON file is generated and placed in that directory every day.
My input is as follows:
input {
  file {
    path => "/home/path_to_json/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
My question is: how do I configure Logstash to ingest only the latest/newest file, rather than re-reading everything in the directory every time a new file is dropped there, so that data is not duplicated in Elasticsearch? Is this the default behavior of the file input plugin, or do I need to add something to my input configuration?
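From what I've read, the file input normally tracks each file's read position in a sincedb, and my `sincedb_path => "/dev/null"` discards that tracking, so every file would be re-read from the beginning whenever Logstash restarts. Would persisting the sincedb to a real file, roughly like the sketch below (the sincedb path here is just an example I made up), be the right way to avoid re-ingesting old files?

```
input {
  file {
    path => "/home/path_to_json/*.json"
    start_position => "beginning"
    # Persist read offsets so files that were already ingested are
    # skipped after a restart (example path, not a requirement):
    sincedb_path => "/var/lib/logstash/sincedb_json"
  }
}
```

Or is there a different recommended setting for this "new file once a day" pattern?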
Thanks in advance!