I managed to force Logstash to reload the whole file by pointing sincedb_path to NUL (Windows environment) and setting start_position to "beginning". Here is my file input configuration:
input {
  file {
    path => "myfile.csv"
    start_position => "beginning"
    ignore_older => 0
    type => "my_document_type"
    sincedb_path => "NUL"
    stat_interval => 1
  }
}
The file is indeed reloaded every time I restart Logstash and every time it is modified, but I want it to be reloaded every second, as suggested by stat_interval. I also need it to be reloaded even when there is no modification and without restarting Logstash, because I add a date-based field in the filters and I need the same data indexed every day with an updated date_field:
filter {
  csv {
    columns => ["MyFirstColumn", "MySecondColumn"]
    separator => ";"
    add_field => {
      "date_field" => "%{+ddMMyyyy}"
    }
  }
}
Here is an example of the expected behavior:
File content:
Column A;Column B
Value X;Value Y
Data sent to the Elasticsearch index:
Column A: Value X, Column B: Value Y, date_field: 05122016
The day after, even without modifying the file, I want the following data to be added to the same index in Elasticsearch:
Column A: Value X, Column B: Value Y, date_field: 06122016
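For what it's worth, here is a minimal sketch of a workaround I have been considering, not a verified solution: replace the file input with an exec input that re-reads the CSV on a fixed interval, then split the output into one event per line before the csv filter. The command, the 86400-second interval, and the header handling are assumptions on my part:

input {
  exec {
    # Print the whole CSV once a day (86400 s), regardless of modification time.
    # "cmd /c type" is the Windows equivalent of "cat"; adjust the path as needed.
    command => "cmd /c type myfile.csv"
    interval => 86400
    type => "my_document_type"
  }
}

filter {
  # The exec input puts the whole command output into a single "message" field,
  # so split it into one event per line before parsing each line as CSV.
  split {
    field => "message"
  }
  csv {
    columns => ["MyFirstColumn", "MySecondColumn"]
    separator => ";"
    add_field => {
      "date_field" => "%{+ddMMyyyy}"
    }
  }
}

With this sketch the header line would still be parsed as a data row, so it would probably need a drop filter or a conditional; I have not tested this end to end.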