Logstash with multiple config files failed to execute action

Hi,
I'm running ELK in a production environment on Ubuntu. Everything worked well while I was indexing a single CSV file with a single Logstash config file in the directory /etc/logstash/conf.d.

Problems started when I added a second Logstash config file to index a second CSV file. First, I noticed that Logstash mixed the outputs when using more than one config file: as far as I understand, Logstash concatenates all the files in conf.d into a single pipeline, so every event goes through every filter and output unless they are guarded by conditionals. Some research on Google led me to a solution on Stack Overflow, but after I applied it, Logstash failed to execute the pipeline action.

Here is one of my config files:

input {
    file {
        path => "/home/bola/Documents/Duniya/csv/VarietesCommune.csv"
        start_position => beginning
        sincedb_path => "/dev/null"
        type => "doc1"
        codec => plain {
            charset => "ISO-8859-1"
        }
    }
}

if [type] == "doc1" {
    filter {
        csv {
            columns =>  ["Saison","Departements","Communes "," Bema14_J07","Bema14_B09","Bema14_J08","  Bema_B10","TzeComposite3DtBenin","Bag97_TzeComposite3x4_Benin","Bema14_J15","  Bema00_J20","DtSrWBenin","IwDsynWBenin"," TzpbSrWBenin ","Bema10_B05","FaabaQpm","EvDt97_StrWBenin","DmrEsrWBenin","DmrEsrQpmWBenin","Ak94_DmrEsrYBenin","2000_SynEeWBenin","TzeeSrWBenin","Bema14_B05","VarietesPossibles","MeilleureVariete"]
            separator => ";"
        }
    }

    output {
            elasticsearch {
            hosts => ["localhost:9200"]
            action => "index"
            index => "varietesmaiscommune"  
        }
        stdout { codec => rubydebug } 
    }
}

I get this error in my Logstash log file:

[2018-10-09T12:35:39,897][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-10-09T12:36:15,719][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-10-09T12:36:16,755][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 13, column 1 (byte 246) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:157:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}

Could anyone help me fix this, please? I can't index my files one at a time because there are many of them. Is there another way to do it? Thanks!

Hello, this link may help you:

https://discuss.elastic.co/t/load-csv-file-in-logstash/151319/2

Hi @balumurari1. Thanks for your answer. From what I read a while ago, sincedb_path => "NUL" is the Windows equivalent of "/dev/null": both discard the sincedb, so the file is re-read from the beginning on every run. Anyway, I tried it, but it didn't fix the problem.

filter {
    if [type] == "doc1" {
        csv {
            .......
        }
    }
}

This will work for you. In a Logstash config file, only input, filter, and output sections are allowed at the top level (which is exactly what the error message says), so the conditional has to go inside those blocks, not around them. Do the same thing in your output block.
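For completeness, here is a sketch of the whole corrected file, with the conditional moved inside both the filter and the output blocks (everything else is taken from your original):

input {
    file {
        path => "/home/bola/Documents/Duniya/csv/VarietesCommune.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        type => "doc1"
        codec => plain {
            charset => "ISO-8859-1"
        }
    }
}

filter {
    # conditionals are only valid inside the input/filter/output sections
    if [type] == "doc1" {
        csv {
            columns => ["Saison","Departements","Communes "," Bema14_J07","Bema14_B09","Bema14_J08","  Bema_B10","TzeComposite3DtBenin","Bag97_TzeComposite3x4_Benin","Bema14_J15","  Bema00_J20","DtSrWBenin","IwDsynWBenin"," TzpbSrWBenin ","Bema10_B05","FaabaQpm","EvDt97_StrWBenin","DmrEsrWBenin","DmrEsrQpmWBenin","Ak94_DmrEsrYBenin","2000_SynEeWBenin","TzeeSrWBenin","Bema14_B05","VarietesPossibles","MeilleureVariete"]
            separator => ";"
        }
    }
}

output {
    # same pattern here, so only doc1 events reach this index
    if [type] == "doc1" {
        elasticsearch {
            hosts => ["localhost:9200"]
            action => "index"
            index => "varietesmaiscommune"
        }
        stdout { codec => rubydebug }
    }
}

With the type guard in both blocks, your two config files can live side by side in conf.d without mixing their events.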


Thank you @Sripal, that fixed the problem. I did the same thing for the output block.
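For anyone who finds this thread later: another option I came across afterwards is to give each config file its own pipeline via pipelines.yml (available since Logstash 6.0), so the two CSVs never share a pipeline and no conditionals are needed. A minimal sketch, where the pipeline ids and file names are placeholders for your own:

# /etc/logstash/pipelines.yml
- pipeline.id: varietes_commune
  path.config: "/etc/logstash/conf.d/varietes_commune.conf"
- pipeline.id: second_csv
  path.config: "/etc/logstash/conf.d/second_csv.conf"

Note that pipelines.yml is ignored when Logstash is started with -f or -e on the command line.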

Congrats!
