Modifying a CSV file - Logstash

Hi Team,

I have a CSV file which gets updated regularly (rows deleted, added, or changed). The columns stay constant; only the data changes. I want to write a Logstash pipeline that captures these changes without running Logstash again and again. Basically, the changes should be tracked live.

I am very new to Logstash (an amateur) and have written the following configuration.

input {
  file {
    path => "C:/Aniket/route_maps.csv"
    sincedb_path => "NUL"
    type => "maps"
    start_position => "beginning"
    #mode => "tail"
    #ignore_older => 0
  }
}

filter {
  csv {
    separator => ","
    columns => ["SrNo","NameofCountry","Cityname","PathID","PathOrder","FlowAmount","Latitude","Longitude"]
    skip_header => "true"
  }

  mutate {
    convert => {
      "Latitude" => "float"
      "Longitude" => "float"
      "SrNo" => "float"
    }
  }

  if [SrNo] == 0.0 { drop {} }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "route_csv"
    action => "index"
    document_id => "%{SrNo}"
    doc_as_upsert => true
    workers => 1
  }
  stdout { codec => rubydebug }
}
The output I am getting is not what I want, as modifications to the file are not captured correctly.
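One detail worth double-checking in the config above: in the elasticsearch output plugin, `doc_as_upsert` only takes effect together with `action => "update"`. With `action => "index"`, each event simply overwrites the document that has the same id. A hedged sketch of the output section in update mode (same hosts, index, and id as above):

```
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "route_csv"
    # doc_as_upsert is only honored in update mode:
    # update the existing document, or create it if it is missing.
    action => "update"
    document_id => "%{SrNo}"
    doc_as_upsert => true
  }
}
```

Note that neither mode removes documents for rows deleted from the CSV; handling deletions needs a separate mechanism.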

Can you please help?



A file input is intended to tail log files that are being appended to, not to re-read files that are getting overwritten. You could try using an exec input to repeatedly `type` the file (`type` being the Windows equivalent of `cat`).
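A minimal sketch of that approach, with an assumed 30-second interval: an exec input re-runs the command on a schedule, and since exec emits the entire command output as a single event, a split filter breaks it back into one event per CSV line before the existing csv and mutate filters run.

```
input {
  exec {
    # "type" prints the whole file; re-run it on every interval.
    command => "type C:\Aniket\route_maps.csv"
    interval => 30   # seconds between re-reads; adjust as needed
  }
}

filter {
  # exec puts the full file contents in one event's "message" field;
  # split it into one event per line so the csv filter can parse each row.
  split { field => "message" }
}
```

Keep the existing csv, mutate, and output sections after this; with `document_id => "%{SrNo}"`, every re-read overwrites changed rows in the index, though rows deleted from the CSV will still linger there.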

Even plain appends are not being picked up properly.

Hello team, can anyone help me with this? I need it on an urgent basis.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.