Hi team,
I have a CSV file that initially contained 3 records, and Logstash parsed them successfully. (I kept Logstash running the whole time.)
Later I opened the file and added a fourth record, expecting Logstash to detect the new record automatically. But Logstash seems to have missed it: Kibana still shows only 3 records.
Here is the CSV file (the last record is the newly added one):
GIT_ORG,GIT_REPOS,COMMIT_SHA1,COMMIT_AUTHOR,COMMIT_DATE
bizx,au-recruiting,6739c82bcf830b05d4d36e9fd715bc5715e0c380,Kaderjan Ilghar,2018-01-24
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-23
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-22
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-21
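To rule out the editor as a variable, this is roughly how I append the row programmatically with an explicit trailing newline (illustrative relative path, not my real `C:/elkstack/...` path). A tailing reader such as Logstash's file input only emits a line once it sees the terminating newline, and some Windows editors save the last line without one:

```python
# Append the fourth record with an explicit trailing "\n".
# Path is illustrative; the real file lives under C:/elkstack/.../logs/.
row = "bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-21"
with open("test1.csv", "a", newline="") as f:
    f.write(row + "\n")
```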
Here is my logstash.conf:
input {
  file {
    path => "C:/elkstack/elasticsearch-6.5.1/logs/test1.csv"
    start_position => "beginning"
    sincedb_path => "C:/elkstack/elasticsearch-6.5.1/sincedb/sincedb.txt"
  }
}

filter {
  # csv must run first so COMMIT_DATE exists before the date filter reads it
  csv {
    columns => ["GIT_ORG", "GIT_REPOS", "COMMIT_SHA1", "COMMIT_AUTHOR", "COMMIT_DATE"]
    separator => ","
    skip_header => "true"
  }

  date {
    match => [ "COMMIT_DATE", "yyyy-MM-dd" ]
    target => "@timestamp"
  }

  if "_dateparsefailure" in [tags] { drop {} }
}

output {
  elasticsearch {
    action => "index"
    hosts  => "localhost:9200"
    index  => "test1"
    manage_template => true
    template => "C:/elkstack/elasticsearch-6.5.1/indextemp/dtemplate.json"
    template_name => "dtemplate"
    template_overwrite => true
  }
  stdout { codec => rubydebug { metadata => true } }
}
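My understanding of the sincedb mechanism (not Logstash's actual implementation, just a minimal Python sketch under the assumption that it behaves like a remembered byte offset): on each poll, only bytes appended after the stored offset are read. If an editor rewrites the whole file instead of appending, the "new" content can end up before the remembered offset and be skipped.

```python
import os

def tail_since(path, offset):
    """Read anything appended after byte `offset`; return (new_lines, new_offset).

    Hypothetical helper mimicking a sincedb-style tail: a real tailer
    (like Logstash's file input) also tracks file identity and rotation.
    """
    size = os.path.getsize(path)
    if size < offset:   # file shrank: it was rewritten, not appended to
        offset = 0      # restart from the beginning in that case
    with open(path, "rb") as f:
        f.seek(offset)
        data = f.read()
    return data.decode().splitlines(), offset + len(data)

# usage:
#   lines, pos = tail_since("test1.csv", 0)    # first pass reads everything
#   lines, pos = tail_since("test1.csv", pos)  # later passes read only appends
```

Is that roughly what happens here, i.e. could my editor be rewriting the file (or omitting the final newline) so the tail never sees the fourth row?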
What's the problem here?
