Logstash csv plugin not sending data to index

Hello All,

I have CSV files in a folder on a Windows system and need to send the data to an Elasticsearch index using Logstash. I'm not sure why the data is not showing up in the index, even though the index is getting created.
I need the data indexed as key/value JSON documents (see the example after the config below).


Another requirement:
The same CSV file should not be processed again; I'm not sure how to achieve this.

config:

input {
  file {
    path => "D:\\app\\cis\\logstash_execl_data\\*.csv"   (works)
    **path => "D:\app\mis\logstash_execl_data\*.csv"   (doesn't work)**
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["brandName","hostName","instanceName","tcVersion","clientType","clientVersion","usecaseName","usecaseStepName","startTime","endTime","duration"]
  }
}

output {
  elasticsearch {
    hosts => "https://abc:445"
    ilm_pattern => "{now/d}-000001"
    ilm_rollover_alias => "mis-logstash-excel-data"
    ilm_policy => "mis-monitoring-common-policy"
    api_key => "abc:lmn"
    ssl_enabled => true
    ssl_certificate_authorities => "my_path"
    http_compression => true
    data_stream => false
  }
}
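
For clarity, this is the kind of document I expect in the index. Given a CSV line like this (values made up for illustration):

abc,host1,inst1,1.0,web,2.3,login,step1,2024-01-01T10:00:00,2024-01-01T10:00:05,5

the indexed document should look like:

{
  "brandName": "abc",
  "hostName": "host1",
  "instanceName": "inst1",
  "tcVersion": "1.0",
  "clientType": "web",
  "clientVersion": "2.3",
  "usecaseName": "login",
  "usecaseStepName": "step1",
  "startTime": "2024-01-01T10:00:00",
  "endTime": "2024-01-01T10:00:05",
  "duration": "5"
}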

Please suggest why the data is not reaching the index (there are no errors in the log), and once the data is coming in, how can I make sure the same file is not processed again?

Thanks

Change to:

path => "D:/app/cis/logstash_execl_data/*.csv"

The file input expects forward slashes in path, even on Windows.

@Rios

Many thanks, this worked.
Now I want to make sure that once a CSV file has been processed, it is not processed again. How can this be managed?

You can:

  1. Use a sincedb to track which files have already been read, which is the most common approach:
    sincedb_path => "c:/path/file.db"
  2. Delete each file after reading with file_completed_action (see the sketch below)
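
A minimal input sketch combining both options, assuming it is fine to consume each file once in read mode (the path and sincedb_path values are placeholders for your own locations):

input {
  file {
    path => "D:/app/cis/logstash_execl_data/*.csv"
    mode => "read"                            # read each file once from start to finish
    file_completed_action => "delete"         # remove the file once it has been fully read
    sincedb_path => "D:/app/cis/sincedb.db"   # remembers which files were already processed
  }
}

If you need to keep the files on disk, use file_completed_action => "log" together with file_completed_log_path instead of deleting them.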
