Error thrown during bulk uploading dataset using Logstash

Hi Everyone,

I am new to Logstash and have been trying to learn the ELK stack for the last couple of days.

I am trying to bulk upload a dataset to Elasticsearch using Logstash; however, I am getting the error below.

Here are the logs I am getting:

[2018-09-08T08:22:06,723][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-08T08:22:07,886][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-08T08:22:09,511][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 16, column 23 (byte 433) after filter {\n\tcsv {\n\tseparator => ","\n\tcolumns =>[ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur"]\n\t}\n\n\tmutate {\n\tconvert => {\n\t\t"mileage" => integer", :backtrace=>["C:/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "C:/logstash/logstash-core/lib/logstash/compiler.rb:49:incompile_graph'", "C:/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2486:inmap'", "C:/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:157:ininitialize'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:22:in initialize'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:90:ininitialize'", "C:/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in execute'", "C:/logstash/logstash-core/lib/logstash/agent.rb:309:inblock in converge_state'"]}
[2018-09-08T08:22:10,512][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Here is my config file,

Not sure what's going wrong here, but the upload doesn't start. It would be great if someone could help.

Thanks,
Rammohan B.

Hi There,

Can somebody please help here? Let me know if this is not the right forum to discuss this kind of issue.

Thanks,
Rammohan B.

The error states that your pipeline configuration is incorrect. This means you may have a syntax or other error in your pipeline. The hint is in this log line:

[2018-09-08T08:22:09,511][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 16, column 23 (byte 433) after filter {\n\tcsv {\n\tseparator => ","\n\tcolumns =>[ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur"]\n\t}\n\n\tmutate {\n\tconvert => {\n\t\t"mileage" => integer",

If you look at the end of this line, it also prints the part of the pipeline that contains the error. It looks like you have a mutate filter where you are missing a " in the "mileage" => integer" line.
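Reconstructing from the snippet quoted in the error message, the filter section should look something like this, with every string quoted on both sides (note that the conversion type "integer" must be a quoted string as well):

```
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement",
                 "engine_power", "body_type", "color_slug", "stk_year", "transmission",
                 "door_count", "seat_count", "fuel_type", "date_created",
                 "date_last_seen", "price_eur" ]
  }
  mutate {
    convert => {
      "mileage" => "integer"
    }
  }
}
```

You can also validate the config without running the pipeline by starting Logstash with the `--config.test_and_exit` flag.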


Thank you so much for the reply @Dheeraj_Gupta.

I corrected those errors; however, my command prompt stops after displaying the messages below and doesn't move forward:

[2018-09-10T14:17:06,975][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-10T14:17:08,546][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-10T14:17:19,290][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-09-10T14:17:20,591][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-09-10T14:17:20,610][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-09-10T14:17:20,981][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-09-10T14:17:21,060][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-09-10T14:17:21,068][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-09-10T14:17:21,117][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-09-10T14:17:21,157][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-09-10T14:17:21,243][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-09-10T14:17:22,282][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xc9c20a9 run>"}
[2018-09-10T14:17:22,434][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-09-10T14:17:22,459][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-09-10T14:17:23,000][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Can you please suggest what I am doing wrong here?

Note: There was another error saying that 'dev\null' doesn't exist. I created a 'dev' folder containing a file named 'null' without any extension just to make it work. I hope that is expected.

Thanks,
Rammohan B.

In the logs there is no entry like "Opening file cars.csv", so my guess is Logstash cannot find your input file. Are you sure cars.csv exists in your current directory? Relative paths are tricky, and I don't have much experience using them with Logstash. Maybe you can switch to absolute paths?

As for /dev/null, it is a special null device on Linux; you don't need to create anything to access it. Since you are on Windows, I am not sure you can specify this as sincedb_path.
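On Windows the equivalent null device is called NUL, so something along these lines may work for the file input (the path here is a hypothetical placeholder, untested sketch):

```
input {
  file {
    path => "C:/data/cars.csv"       # hypothetical absolute path, forward slashes
    start_position => "beginning"
    sincedb_path => "NUL"            # Windows equivalent of /dev/null
  }
}
```

With NUL as the sincedb_path, Logstash won't persist its read position, which is usually what you want for a one-off bulk load.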

Hi @Dheeraj_Gupta

I did think that could be the issue considering the error I got; the error message above was with the absolute path of the CSV file itself.

Below is the error I got with the relative path of the file:

[2018-09-10T15:28:14,511][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-10T15:28:15,769][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-10T15:28:23,120][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-09-10T15:28:23,732][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-09-10T15:28:23,745][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-09-10T15:28:24,087][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-09-10T15:28:24,177][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-09-10T15:28:24,177][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-09-10T15:28:24,217][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-09-10T15:28:24,247][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-09-10T15:28:24,267][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-09-10T15:28:25,088][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::File start_position=>\"beginning\", path=>[\"cars.csv\"], id=>\"882ab898345c77e2d2dfb1a70981b4329a808edaabbf190f6ee058234e7b4f09\", sincedb_path=>\"/dev/null\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_d4c9e5cb-27a9-408b-a638-949060522f4a\", enable_metric=>true, charset=>\"UTF-8\">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, delimiter=>\"\\n\", close_older=>3600.0, mode=>\"tail\", file_completed_action=>\"delete\", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_by=>\"last_modified\", file_sort_direction=>\"asc\">", :error=>"File paths must be absolute, relative path specified: cars.csv", :thread=>"#<Thread:0x1661e610 run>"}
[2018-09-10T15:28:25,217][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: File paths must be absolute, relative path specified: cars.csv>, :backtrace=>["C:/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.5/lib/logstash/inputs/file.rb:269:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "C:/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.5/lib/logstash/inputs/file.rb:267:in `register'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:241:in `register_plugin'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `register_plugins'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:395:in `start_inputs'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:293:in `start_workers'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:199:in `run'", "C:/logstash/logstash-core/lib/logstash/pipeline.rb:159:in `block in start'"], :thread=>"#<Thread:0x1661e610 run>"}
[2018-09-10T15:28:25,239][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2018-09-10T15:28:25,739][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Note: I have the config file and CSV data within the same folder, whereas Logstash is installed on another drive.

Hey @Dheeraj_Gupta, I think I figured out the issue. I had to use forward slashes instead of backslashes in the file path to make it work. Thank you so much for your time.
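For anyone else hitting this: inside double-quoted strings in a Logstash config, a backslash starts an escape sequence, so Windows-style paths break. The contrast looks roughly like this (drive and folder names are placeholders):

```
# Fails: backslashes are treated as escape characters in double-quoted strings
path => "D:\cars-data\cars.csv"

# Works: Windows accepts forward slashes, and the config parses them cleanly
path => "D:/cars-data/cars.csv"
```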

Thanks,
Rammohan B.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.