Error in Logstash reading a CSV file

I am trying to read a CSV file with Logstash. Here is my config file:

    input {
      file {
        path => ["/home/abhinavkumar.gurung/Applications/csv/devian/data/autos.csv"]
        type => "cars"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }

    filter {
      csv {
        columns => ["dateCrawled","name","seller","offerType","price","abtest",
                    "vechicletype","yearOfRegistration","gearbox","powerPS","model",
                    "kilometer","monthOfRegistration","fuelType","brand",
                    "notRepairedDamage","dateCreated","nrOfPictures","postalCode","lastSeen"]
        separator => ","
        convert => {
          "price" => "integer"
          "yearOfRegistration" => "integer"
          "powerPS" => "integer"
          "kilometer" => "integer"
          "nrOfPictures" => "integer"
          "postalCode" => "integer"
        }
        remove_field => ["message","dateCrawled","monthOfRegistration","dateCreated","lastSeen"]
        skip_header => "true"
      }
    }

    output {
      elasticsearch {
        hosts => ["elasticsearch:9200"]
        user => elastic
        password => changeme
        index => "cars"
      }
      stdout {
        codec => rubydebug
      }
    }

My error:

    `Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 24, column 8 (byte 696) after filter {\n csv{\n\tcolumns => [\"dateCrawled\",\"name\",\"seller\",\n\t\"offerType\",\"price\",\"abtest\",\"vechicletype\",\"yearOfRegistration\",\"gearbox\",\"powerPS\",\"model\",\"kilometer\",\"monthOfRegistration\",\"fuelType\",\"brand\",\"notRepairedDamage\",\"dateCreated\",\"nrOfPictures\",\"postalCode\",\"lastSeen\"]\n \tseparator =>\",\"\n\tconvert => {\"price\" => \"integer\"\n\t\t\t\t\t\t\t\"yearOfRegistration\" => \"integer\"\n\t\t\t\t\t\t\t\"powerPS\" => \"integer\"\n\t\t\t\t\t\t\t\"kilometer\" => \"integer\"\n\t\t\t\t\t\t\t\"nrOfPictures\" => \"integer\"\n\t\t\t\t\t\t\t\"postalCode\" => \"integer\"\n\t\n\n\t}\n\tmutate", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}`

The log message indicates it fails when it reaches a mutate after the csv convert option, but the configuration you posted does not have a mutate, so it is not the configuration you are actually running with.
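Based on the error text (the parser stops right before `mutate`), the running config most likely lost the closing brace of the `convert` hash. A sketch of how the braces would need to line up, assuming the real config does contain a mutate block (the `rename` shown is just a placeholder for whatever the real mutate does):

```
filter {
  csv {
    convert => {
      "price" => "integer"
      "yearOfRegistration" => "integer"
    }    # <- this brace must close the convert hash
  }      # <- and this one the csv filter, before mutate starts
  mutate {
    # placeholder option; substitute the actual mutate settings
    rename => { "vechicletype" => "vehicleType" }
  }
}
```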

So I changed to a much smaller and simpler CSV file:

    input {
      file {
        path => ["/home/abhinavkumar.gurung/Applications/csv/devian/data/Admission_Predict.csv"]
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => plain { charset => "UTF-8" }
      }
    }

    filter {
      csv {
        columns => ["Serial No","GRE Score","TOEFL Score","University Rating",
                    "SOP","LOR","CGPA","Research","Chance of Admit"]
        convert => {
          "Serial No" => "integer"
          "GRE Score" => "integer"
          "TOEFL Score" => "integer"
          "University Rating" => "integer"
          "SOP" => "float"
          "LOR" => "float"
          "CGPA" => "float"
          "Research" => "integer"
          "Chance of Admit" => "float"
        }
        remove_field => ["message"]
        separator => ","
        skip_header => "true"
      }
    }

    output {
      elasticsearch {
        hosts => ["elasticsearch:9200"]
        document_type => "_doc"
        user => elastic
        password => changeme
        index => "cars"
      }
      stdout {
        codec => rubydebug
      }
    }
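A quick way to rule out file-input problems (for example, the path not being visible inside a container, given that `hosts => ["elasticsearch:9200"]` suggests a Docker setup) is to bypass the file input and feed the file over stdin. A sketch, assuming the `logstash` binary is on the PATH:

```
bin/logstash -e '
  input { stdin { } }
  filter { csv { separator => "," skip_header => "true" } }
  output { stdout { codec => rubydebug } }
' < /home/abhinavkumar.gurung/Applications/csv/devian/data/Admission_Predict.csv
```

If events appear on the console here but not with the file input, the problem is the path or file permissions, not the filter.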

However, this time I don't see any error, but nothing is stored in Elasticsearch either, and I doubt Logstash is even reading the file.
If it is, isn't it supposed to print the events to the console?
How would it output a CSV file to the console; is there anything I need to do to convert it to JSON?
But the main issue is that nothing is stored in Elasticsearch.
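To answer the JSON question: nothing extra is needed, because the csv filter itself turns each line into a structured event keyed by the column names, which is what rubydebug prints and Elasticsearch stores. A rough Python sketch of the equivalent transformation (made-up sample rows, not Logstash's actual implementation):

```python
import csv
import json
from io import StringIO

# Made-up input resembling the first columns of Admission_Predict.csv
raw = """Serial No,GRE Score,TOEFL Score
1,337,118
2,324,107
"""

columns = ["Serial No", "GRE Score", "TOEFL Score"]
converters = {"Serial No": int, "GRE Score": int, "TOEFL Score": int}

events = []
reader = csv.reader(StringIO(raw))
next(reader)  # skip_header => "true"
for row in reader:
    # columns => [...] maps positional values to field names
    event = dict(zip(columns, row))
    # convert => { ... } casts the string values to numbers
    for field, cast in converters.items():
        event[field] = cast(event[field])
    events.append(event)

print(json.dumps(events[0]))
```

Each `event` here corresponds to one document Logstash would send to the `cars` index.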

Partial logs:

    [2019-05-27T15:24:01,182][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
    [2019-05-27T15:24:01,214][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
    [2019-05-27T15:24:01,242][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
    [2019-05-27T15:24:01,242][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
    [2019-05-27T15:24:01,316][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
    [2019-05-27T15:24:01,320][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, :thread=>"#<Thread:0x1537d2eb run>"}
    [2019-05-27T15:24:01,459][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
    [2019-05-27T15:24:01,528][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>}
    [2019-05-27T15:24:02,661][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
    [2019-05-27T15:30:07,456][WARN ][logstash.runner ] SIGTERM received. Shutting down.
    [2019-05-27T15:30:07,804][INFO ][filewatch.observingtail ] QUIT - closing all files and shutting down.
    [2019-05-27T15:30:08,581][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>"main"}
    [2019-05-27T15:30:08,840][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
    [2019-05-27T15:30:09,430][INFO ][logstash.runner ] Logstash shut down.
