Logstash not ingesting data after updates to the CSV input

I ingested a CSV file into Elasticsearch using Logstash, and I left the ingestion pipeline running in the background for continuous ingestion.

It ingests new rows appended to the CSV, but if any existing row's values are updated, the change is not reflected in Elasticsearch; it seems the Logstash pipeline cannot tell that its input CSV was updated.

For example, my CSV content:

10/10/2017 10:22,Chicago,23412
10/10/2017 10:30,New York,67890

This is already ingested into Elasticsearch.

Now I update the CSV to:

10/10/2017 10:22,Chicago,56789
10/10/2017 10:30,New York,67890

So the value in the third column of the first row has changed, but after this update the Logstash pipeline running continuously in the background does not update the values in Elasticsearch.

Are there any parameters I have to add to my Logstash conf?

My Logstash conf is below:

input {
  file {
    path => "population.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}
filter {
  csv {
    separator => ","
    columns => ["timestamp","city","population"]
  }

  date {
    match => [ "timestamp", "MM/dd/yyyy HH:mm" ]
    target => "timestamp_ingestion"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "data-index"
  }
  stdout {}
}

Can you please suggest any config parameters I need to add?
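
A likely explanation, based on how the file input works: it tails each file by tracking a byte offset, so appended rows are read as new bytes, but an edit at an offset that has already been read, especially one that keeps the file the same size (56789 is exactly as long as 23412), never moves the offset and is not re-read. Note that sincedb_path => "/dev/null" only discards the saved offset across restarts; while the pipeline is running, the in-memory offset still advances. One workaround sometimes used is to give each row a stable document ID, so that whenever the file is re-read from the beginning (for example after a pipeline restart), existing documents are overwritten instead of duplicated. A hedged sketch, assuming the timestamp and city fields together uniquely identify a row:

filter {
  csv {
    separator => ","
    columns => ["timestamp","city","population"]
  }
  # Derive a stable ID from the key fields (assumption: timestamp + city is unique)
  fingerprint {
    source => ["timestamp", "city"]
    concatenate_sources => true
    method => "SHA256"
    target => "[@metadata][fingerprint]"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "data-index"
    # Same row => same ID, so a re-read updates the document instead of adding a duplicate
    document_id => "%{[@metadata][fingerprint]}"
  }
}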

Exactly how is the CSV file being updated? Is it updated in place (overwritten)? Is a new file being created?
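
The answer matters because of the offset tracking described above: if the file is overwritten so that it shrinks, the file input generally detects the truncation and re-reads from the beginning, while a same-size in-place rewrite is invisible to it. If a new file is created for each update, a glob in path lets the running pipeline discover it automatically. A minimal sketch of that variant (the directory and naming pattern are placeholders, not from this thread):

input {
  file {
    # Any file matching the glob is discovered and read as it appears
    path => "/data/population-*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}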
