Logstash CSV ingestion when a single row is updated frequently

I ingested a CSV into Elasticsearch using Logstash, and I keep the Logstash pipeline running in the background for continuous ingestion.

It ingests fine when a new row is appended to the CSV. But my CSV has only one row, which is rewritten frequently as new data comes in, and those changes never show up in Elasticsearch — Logstash doesn't seem to notice that the input CSV has been updated.

For example, my CSV content:

10/10/2017 10:22,Chicago,23412

This row is already ingested into Elasticsearch.

Now the CSV is updated to:

10/10/2017 10:32,Chicago,55675

A separate script runs every 10 minutes and rewrites the CSV accordingly.

So the values in the first row change on every update, but the Logstash pipeline running continuously in the background never pushes the new values to Elasticsearch.

Are there any parameters I have to add to my Logstash conf?

My Logstash conf is below:

        input {
          file {
            # note: the file input expects an absolute path here
            path => "population.csv"
            start_position => "beginning"
            sincedb_path => "/dev/null"
            ignore_older => 0
          }
        }
        filter {
          csv {
            separator => ","
            columns => ["timestamp","city","population"]
          }
          date {
            # yyyy to match the 4-digit year in the sample data
            match => [ "timestamp", "MM/dd/yyyy HH:mm" ]
            target => "timestamp_ingestion"
          }
        }
        output {
          elasticsearch {
            hosts => "http://localhost:9200"
            index => "data-index"
          }
          stdout {}
        }
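From what I understand, the file input's tail mode tracks each file by inode and read offset and only emits newly appended bytes, so rewriting a single row in place may never register as new data. One workaround I'm considering (a sketch, not a confirmed fix — the path and interval below are placeholders) is polling the file with the exec input instead, so the whole file is re-read on a fixed schedule:

        input {
          exec {
            # re-read the whole file every 10 minutes, matching the update script
            command => "cat /path/to/population.csv"
            interval => 600
          }
        }

As far as I know, the exec input puts the command's entire output into the event's message field, so for a one-row CSV the existing csv filter should still apply; a multi-row file would presumably need a split filter first.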

Can you please suggest any config parameters I need to add?
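Another thing I may try on the updater-script side (a hedged sketch — the filename matches my example, and it assumes the file input treats a path whose inode changed as a brand-new file): write the updated row to a temp file and mv it over the original instead of rewriting in place, so the path gets a fresh inode each cycle:

```shell
# Write the updated row to a temp file first...
printf '10/10/2017 10:32,Chicago,55675\n' > population.csv.tmp

# ...then atomically replace the original. The rename gives population.csv
# a new inode, which (I believe) the file input discovers as a new file
# and re-reads from the beginning, given start_position => "beginning".
mv population.csv.tmp population.csv
```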

Duplicate of Logstash ingest data after any updates in CSV input.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.