LogStash::ConfigurationError message

Hello everyone,

I am loading a CSV file with the following configuration:

###############################################################################################################

input
{
  file
  {
    path => "C:/Users/BEKRISO/KIBANA7.0.1/INPUT/9r_piste_audit.csv"
    start_position => "beginning"
    sincedb_path => "C:/Users/BEKRISO/KIBANA7.0.1/sincedb"
  }
}

############################################################################################################################

filter
{
csv
{
separator => ","

	columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
	
	mutate { convert => { "Date et heure" => "keyword" }}
	mutate { convert => { "Utilisateur" => "keyword" }}
	mutate { convert => { "Code" => "keyword" }}
	mutate { convert => { "Libellé évènement" => "keyword" }}
	mutate { convert => { "Code retour" => "keyword" }}
	mutate { convert => { "Application" => "keyword" }}
	mutate { convert => { "Code site" => "keyword" }}
	mutate { convert => { "Objet Start" => "keyword" }}
	mutate { convert => { "Usage cache" => "keyword" }}						
	mutate { convert => { "Valeur avant modif" => "keyword" }}
	mutate { convert => { "Valeur après modif" => "keyword"	}}
					
}

}

##############################################################################################################################

output
{
  elasticsearch
  {
    hosts => "cas0000658713:9200"
    index => "monbeaunode_1"
  }

  stdout {}
}
############################################################################################################################

When I run this configuration, I receive the following error:

[2019-05-29T16:50:13,482][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 23, column 10 (byte 673) after filter\r\n{\r\n\tcsv\r\n\t{\r\n\t\tseparator => ","\r\n\t\t\r\n\t\tcolumns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]\r\n\t\t\r\n\t\tmutate ", :backtrace=>["C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2577:in map'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in initialize'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/agent.rb:325:in block in converge_state'"]}

Can anyone please tell me what the problem is?

You are confusing the convert option to the csv filter with a mutate filter. So you can use either

filter {
    csv {
        separator => ","
        columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
        convert => { "Date et heure" => "string" }
       [...]
    }
}

or

filter {
    csv {
        separator => ","
        columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
    }
    mutate {
        convert => { "Date et heure" => "string" }
       [...]
    }
}

Note also that "keyword" is not a valid conversion. The fields will most likely be string by default so the convert is probably not needed. The convert option expects a hash, so if I were doing multiple conversions I would use

filter {
    csv {
        columns => [ ...]
        convert => {
            "someNumber" => "integer"
            "column2" => "boolean"
        }
        [...]
    }
}

If you specify an option to a filter more than once then most of the time it will do what you want, but sometimes it will do something else and confuse the hell out of you.

Thank you for your reply.
I have made some changes, but I am still receiving errors.

####CODE#####

input
{
  file
  {
    path => "C:/Users/BEKRISO/KIBANA7.0.1/INPUT/9r_piste_audit.csv"
    start_position => "beginning"
    sincedb_path => "C:/Users/BEKRISO/KIBANA7.0.1/sincedb"
  }
}

############################################################################################################################

filter
{
csv
{
separator => ","

	columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
	
	convert => { "Date et heure" => "date" }
	convert => { "Utilisateur" => "string" }
	convert => { "Code" => "integer" }
	convert => { "Libellé évènement" => "string" }
	convert => { "Code retour" => "integer" }
	convert => { "Application" => "string" }
	convert => { "Code site" => "integer" }
	convert => { "Objet Start" => "string" }
	convert => { "Usage cache" => "boolean" }						
	convert => { "Valeur avant modif" => "string" }
	convert => { "Valeur après modif" => "string"	}
					
}

}

##############################################################################################################################

output
{
  elasticsearch
  {
    hosts => "cas0000658713:9200"
    index => "monbeaunode_1"
  }

  stdout {}
}

#####ERRORS#####

[2019-05-31T14:27:23,191][ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Invalid conversion types: string>, :backtrace=>["C:/Users/BEKRISO/KIBANA7.0.1/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-csv-3.0.10/lib/logstash/filters/csv.rb:111:in register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in register'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:191:in block in register_plugins'", "org/jruby/RubyArray.java:1792:in each'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:190:in register_plugins'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:446:in maybe_setup_out_plugins'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:203:in start_workers'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:145:in run'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:104:in `block in start'"], :thread=>"#<Thread:0x15f8363 run>"}
[2019-05-31T14:27:23,238][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

Can you post an example CSV line from your file? I will try to parse it.

Date et heure,Utilisateur,Code,Libellé évènement,Code retour,Application,Code site,Objet Start,Usage cache,Valeur avant modif ,Valeur après modif
27/05/2019 10:54,,1,Appel à la passerelle par une application cliente,0,9R,990,TA-ESHMA-0,NON,,
27/05/2019 10:12,,1,Appel à la passerelle par une application cliente,0,9R,990,TA-ESHMA-0,NON,,

string is not a supported conversion type for a csv filter. It is supported for a mutate filter.
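For reference, a minimal sketch of the two filters side by side (the column list is shortened here for illustration): the csv filter's convert only accepts integer, float, date, date_time and boolean, while "string" belongs in a mutate filter.

filter {
    csv {
        separator => ","
        columns => ["Code","Usage cache"]
        # csv convert: integer/float/date/date_time/boolean only
        convert => { "Code" => "integer" }
    }
    mutate {
        # mutate convert: "string" is valid here
        convert => { "Usage cache" => "string" }
    }
}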

I tried with the following configuration:

input
{
  file
  {
    path => "C:/Users/BEKRISO/KIBANA7.0.1/INPUT/9r_piste_audit.csv"
    start_position => "beginning"
    sincedb_path => "C:/Users/BEKRISO/KIBANA7.0.1/sincedb"
  }
}

############################################################################################################################

filter
{
csv
{
separator => ","

	columns => ["Date et heure","Utilisateur","Code","Libellé évènement","Code retour","Application","Code site","Objet Start","Usage cache","Valeur avant modif","Valeur après modif"]
}

	
mutate{

	convert => { 
		
		"Date et heure" => "float" 
		"Utilisateur" => "string" 
		"Code" => "integer" 
		"Libellé évènement" => "string" 
		"Code retour" => "integer" 
		"Application" => "string" 
		"Code site" => "integer" 
		"Objet Start" => "string" 
		"Usage cache" => "sring" 						
		"Valeur avant modif" => "string" 
		"Valeur après modif" => "string"	
	
	}
					
}

}

##############################################################################################################################

output
{
  elasticsearch
  {
    hosts => "cas0000658713:9200"
    index => "monbeaunode_1"
  }

  stdout {}
}

I still receive the following error:

[2019-05-31T17:11:25,312][ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/BEKRISO/KIBANA7.0.1/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.4.0/lib/logstash/filters/mutate.rb:219:in block in register'", "org/jruby/RubyHash.java:1419:in each'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.4.0/lib/logstash/filters/mutate.rb:217:in register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in register'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:191:in block in register_plugins'", "org/jruby/RubyArray.java:1792:in each'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:190:in register_plugins'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:446:in maybe_setup_out_plugins'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:203:in start_workers'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:145:in run'", "C:/Users/BEKRISO/KIBANA7.0.1/logstash/logstash-core/lib/logstash/java_pipeline.rb:104:in `block in start'"], :thread=>"#<Thread:0x15dc400 run>"}
[2019-05-31T17:11:25,327][INFO ][logstash.outputs.elasticsearch] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
[2019-05-31T17:11:25,327][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-05-31T17:11:25,359][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

That should be string, not sring.
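In other words, a corrected version of that mutate block would look like this (unchanged from the config above, apart from the typo; the other conversions are elided):

mutate {
    convert => {
        "Usage cache" => "string"   # was "sring"
        [...]
    }
}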

The problem is solved, thank you so much!
