Logstash neither detects the newly added data in a CSV file nor pushes it into Elasticsearch

Hi team,

I have a CSV file that initially contained 3 records, and Logstash parsed them successfully. (I did not close Logstash at this point.)

Later, I opened the file and added a fourth record, expecting Logstash to detect the new record automatically. But Logstash seems to have failed to detect it, and Kibana still displays only 3 records.

Here is the CSV file (the last record is the newly added one):

GIT_ORG,GIT_REPOS,COMMIT_SHA1,COMMIT_AUTHOR,COMMIT_DATE
bizx,au-recruiting,6739c82bcf830b05d4d36e9fd715bc5715e0c380,Kaderjan Ilghar,2018-01-24
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-23
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-22
bizx,idl-analytics-api,1be44f52f25f6b540f284eb17e8cee5838826cb9,ssheriff,2018-01-21

Here is the logstash.conf

input {
  file {
    path => "C:/elkstack/elasticsearch-6.5.1/logs/test1.csv"
    start_position => "beginning"
    sincedb_path => "C:/elkstack/elasticsearch-6.5.1/sincedb/sincedb.txt"
  }
}

filter {

  date {
    match  => [ "COMMIT_DATE", "yyyy-MM-dd" ]
    target => "@timestamp"
  }

  if "_dateparsefailure" in [tags] { drop {} }

  csv {
    columns => [ "GIT_ORG",
                 "GIT_REPOS",
                 "COMMIT_SHA1",
                 "COMMIT_AUTHOR",
                 "COMMIT_DATE" ]
    separator   => ","
    skip_header => "true"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts  => "localhost:9200"
    index  => "test1"
    manage_template    => true
    template           => "C:/elkstack/elasticsearch-6.5.1/indextemp/dtemplate.json"
    template_name      => "dtemplate"
    template_overwrite => true
  }

  stdout { codec => rubydebug { metadata => true } }
}

What's the problem here?

One observation:

  1. If I open the CSV with Excel to add the new record and save, Logstash doesn't detect the change.
  2. If I open the CSV file with Sublime to add the new record and save, Logstash does detect the change.

For case 1, I used Notepad++ to open the file and configured it to display end-of-line characters. Below is the screenshot. Is it because I didn't start a new line?

I cannot speak to why Logstash does not notice when Excel updates the file, but the COMMIT_DATE field is created by the csv filter. It does not yet exist when the date filter executes, so that filter is a no-op. Move the csv filter so that it runs before the date filter.
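
For reference, a minimal sketch of the reordered filter section, reusing the same options from the config above (nothing else changed):

filter {

  # Parse the CSV line first so COMMIT_DATE exists as a field
  csv {
    columns => [ "GIT_ORG",
                 "GIT_REPOS",
                 "COMMIT_SHA1",
                 "COMMIT_AUTHOR",
                 "COMMIT_DATE" ]
    separator   => ","
    skip_header => "true"
  }

  # Now the date filter can find COMMIT_DATE and set @timestamp from it
  date {
    match  => [ "COMMIT_DATE", "yyyy-MM-dd" ]
    target => "@timestamp"
  }

  # Drop events whose date could not be parsed, as in the original config
  if "_dateparsefailure" in [tags] { drop {} }
}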
