Hi Team,
I have a CSV file that is updated regularly (rows are added, rows are deleted, and values in existing fields change). The columns stay constant; only the data changes. I want to write a Logstash configuration that captures these changes without having to re-run Logstash again and again. Basically, such changes should be tracked live.
I am very new to Logstash (an amateur) and have written the following configuration.
input {
  file {
    path => "C:/Aniket/route_maps.csv"
    # do not persist read positions (NUL on Windows), so the file is re-read from the start each time Logstash starts
    sincedb_path => "NUL"
    type => "maps"
    start_position => "beginning"
    #mode => "tail"
    #ignore_older => 0
  }
}

filter {
  csv {
    separator => ","
    columns => ["SrNo","NameofCountry","Cityname","PathID","PathOrder","FlowAmount","Latitude","Longitude"]
    # skip the header row
    skip_header => "true"
  }
  mutate {
    convert => {
      "Latitude" => "float"
      "Longitude" => "float"
      "SrNo" => "float"
    }
  }
  # discard rows whose SrNo is 0
  if [SrNo] == 0.0 { drop {} }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "route_csv"
    action => "index"
    # use SrNo as the document id so a re-read row overwrites the same document
    document_id => "%{SrNo}"
    doc_as_upsert => true
    workers => 1
  }
  stdout { codec => rubydebug }
}
The output I am getting is not what I want, because modifications to existing rows are not captured correctly.
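I also tried a variant of the output block, based on my (possibly wrong) assumption that doc_as_upsert only takes effect when action is set to "update", so that changed rows update the existing documents keyed by SrNo:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "route_csv"
    # update the document with this id, or insert it if it does not exist yet
    action => "update"
    document_id => "%{SrNo}"
    doc_as_upsert => true
  }
}

Even if that is the right direction, I am not sure it would handle rows that are deleted from the CSV, since no event is generated for them.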
Can you please help?
Thanks,
Aniket