Hello All,
I have CSV files in one folder on a Windows system and need to send the data to an Elasticsearch index using Logstash. I'm not sure why no data is showing up in the index, even though the index itself gets created.
I need each row indexed as key/value JSON data.
Another requirement:
The same CSV should not be processed again; I'm not sure how to achieve this.
config:
input {
  file {
    path => "D:\\app\\cis\\logstash_execl_data\\*.csv"
    # the line above (escaped backslashes) works;
    # path => "D:\app\mis\logstash_execl_data\*.csv" (unescaped backslashes) does not work
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["brandName","hostName","instanceName","tcVersion","clientType","clientVersion","usecaseName","usecaseStepName","startTime","endTime","duration"]
  }
}
output {
  elasticsearch {
    hosts => "https://abc:445"
    ilm_pattern => "{now/d}-000001"
    ilm_rollover_alias => "mis-logstash-excel-data"
    ilm_policy => "mis-monitoring-common-policy"
    api_key => "abc:lmn"
    ssl_enabled => true
    ssl_certificate_authorities => "my_path"
    http_compression => true
    data_stream => false
  }
}
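For the second requirement, would something like this work? From the file input plugin docs I understand that `mode => "read"` treats each file as finite input and the sincedb records which files have already been read, so they are not reprocessed after a restart. This is an untested sketch; the `file_completed_log_path` and `sincedb_path` values are just example paths, not my real ones:

```
input {
  file {
    path => "D:\\app\\cis\\logstash_execl_data\\*.csv"
    mode => "read"                                          # read each file once, start to finish
    file_completed_action => "log"                          # or "delete" to remove processed files
    file_completed_log_path => "D:\\app\\cis\\completed.log"  # example path
    sincedb_path => "D:\\app\\cis\\sincedb_csv"               # example path; persists read state
    start_position => "beginning"
  }
}
```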
Please suggest why no data is arriving in the index (the logs show no errors), and how to make sure the same file is not processed again once its data has been indexed.
Thanks