Logstash not reading CSV file

I do not see any error when running the pipeline, but no data shows up on stdout or in Elasticsearch. Here is my pipeline. I am using Logstash 6.5.0.

input {
  file {
    path => "C:\TR\d\Professional\ELK\Data\test.csv"
    start_position => "beginning"
    sincedb_path => "C:\TR\d\Professional\ELK\Data\sincedb.log"
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "mynewindex"
  }
}

Try this:

input {
  file {
    path => "C:/TR/d/Professional/ELK/Data/test.csv"
    start_position => "beginning"
    sincedb_path => "C:/TR/d/Professional/ELK/Data/sincedb.log"
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "mynewindex"
  }
}

What is the difference, and what does it fix?

The direction of the slashes in the path. On Windows, the file input treats `path` as a glob pattern and only accepts forward slashes; with backslashes the glob matches nothing, so the input silently reads no files.
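
While you are at it, note that start_position => "beginning" only applies to files Logstash has not seen before; once a sincedb entry exists for the file, it will not be re-read from the start. Here is a minimal debugging sketch (my assumption: re-reading the test file on every run is acceptable while you troubleshoot) that disables the sincedb via the Windows NUL device and prints each parsed event to the console:

input {
  file {
    path => "C:/TR/d/Professional/ELK/Data/test.csv"   # forward slashes only on Windows
    start_position => "beginning"
    sincedb_path => "NUL"                              # Windows' /dev/null: no sincedb kept, file is re-read each run
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4"]
  }
}

output {
  stdout { codec => rubydebug }                        # print each event so you can verify parsing end to end
}

Once events appear on stdout, swap the output back to elasticsearch.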

That fixes the issue! This makes me think the --config.test_and_exit option is of little use if it does not catch a problem like this. Such a small thing stopped my entire parsing without showing any error at all.

I'm not sure; running Logstash as a service is nicer anyway!
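
For what it's worth, --config.test_and_exit only validates the configuration syntax and then exits; it never starts the inputs, so it cannot detect that a path glob matches no files. A check like the one below (run from the Logstash installation directory; the .conf path is just a placeholder for wherever you keep your pipeline) passes even with the backslash path:

:: syntax check only; a path glob that matches no files will still pass
bin\logstash.bat -f C:\TR\d\Professional\ELK\pipeline.conf --config.test_and_exit

To see what the file input is actually doing at runtime, raising the verbosity with --log.level debug can help surface which files it discovers.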
