Logstash not reading CSV file


(fillic) #1

I don't see any error when running the pipeline, but no data shows up in either stdout or Elasticsearch. Here is my pipeline; I am using 6.5.0.

input {
  file {
    path           => "C:\TR\d\Professional\ELK\Data\test.csv"
    start_position => "beginning"
    sincedb_path   => "C:\TR\d\Professional\ELK\Data\sincedb.log"
  }
}

filter {
  csv {
    separator => ","
    columns   => ["col1", "col2", "col3", "col4"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "mynewindex"
  }
}


(Lewis Barclay) #2

Try this:

input {
  file {
    path           => "C:/TR/d/Professional/ELK/Data/test.csv"
    start_position => "beginning"
    sincedb_path   => "C:/TR/d/Professional/ELK/Data/sincedb.log"
  }
}

filter {
  csv {
    separator => ","
    columns   => ["col1", "col2", "col3", "col4"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "mynewindex"
  }
}
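
Also worth checking while you test: the sincedb file records how far Logstash has already read, so if the pipeline has run before, the file will not be re-read from the beginning on the next run. For throwaway testing on Windows you can point sincedb_path at the NUL device so no position is persisted; a minimal sketch:

input {
  file {
    path           => "C:/TR/d/Professional/ELK/Data/test.csv"
    start_position => "beginning"
    sincedb_path   => "NUL"   # Windows null device; use "/dev/null" on Linux/macOS
  }
}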


(fillic) #3

What is the difference, and what did you fix?


(Lewis Barclay) #4

The direction of the slashes in the path.
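
To make the one-line difference explicit: the file input treats path as a glob pattern, and as far as I know it only matches forward slashes, even on Windows:

# Fails silently: backslashes are not treated as path separators
path => "C:\TR\d\Professional\ELK\Data\test.csv"

# Works: the file input expects forward-slash paths, even on Windows
path => "C:/TR/d/Professional/ELK/Data/test.csv"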


(fillic) #5

That fixed the issue! It makes me think the --config.test_and_exit option is of little use if it doesn't catch something like this. Such a small thing stopped my entire pipeline while showing no error at all.


(Lewis Barclay) #6

I'm not sure. Running it as a service is nicer anyway!
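
For what it's worth, --config.test_and_exit only validates that the configuration parses; a path option that matches no files is still syntactically valid, so it passes the check. Roughly like this (pipeline.conf stands in for your config file, and the exact output varies by version):

:: Checks syntax only; reports the config as OK even if path matches nothing
bin\logstash.bat -f pipeline.conf --config.test_and_exit

:: To see which files the file input actually discovers, raise the log level
bin\logstash.bat -f pipeline.conf --log.level=debug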