Scheduled CSV File Update


I have a conf file that loads a CSV into Logstash.

input {
  file {
    path => "/path/to/data/data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["A", "B", "date"]
  }
  mutate { convert => ["A", "integer"] }
  mutate { convert => ["B", "integer"] }
}

output {
  elasticsearch {
    hosts => ["xxxx"]
    index => ""
    user => **
    password => ***
    ssl => true
    ssl_certificate_verification => false
    cacert => '/export1/opt/ELK_Bin/elasticsearch-5.4.0/config/x-pack/ca/ca.crt'
  }
  stdout {}
}

I want to update this CSV every day at a certain time. Is there a way I can schedule this job?
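(The file input has no built-in scheduler, so one common approach is a cron job that runs Logstash as a one-shot pipeline at the desired time. The paths below are assumptions; with sincedb_path => "/dev/null" and start_position => "beginning", each run re-reads the whole file, so this is only a sketch, not a definitive setup.)

  # hypothetical crontab entry: re-ingest the CSV every day at 02:00
  0 2 * * * /usr/share/logstash/bin/logstash -f /path/to/csv.conf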

I'd also like the "date" field to become the timestamp field in ELK, so that @timestamp corresponds to "date" in the CSV file. Regardless of what time the file was uploaded, each event should reflect the value in the "date" column of my CSV.
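(For reference, I believe the date filter does this kind of mapping; a minimal sketch, assuming the column is named "date" — the "yyyy-MM-dd" pattern is a guess and would need to match the actual format in the CSV:)

  filter {
    date {
      # parse the CSV's "date" column; pattern is an assumption
      match => ["date", "yyyy-MM-dd"]
      # write the parsed value into the event timestamp
      target => "@timestamp"
    }
  }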

Can anyone help me with this? Thank you.
