Configure a list of integers with Logstash

Hi,
I want to load a dataset (CSV file) into Elasticsearch via Logstash. This is a snippet of my dataset:

transaction_date,customer_name,age,job,fidelity_card_id,product_id_sold,quantity

2/24/2017;14:15:43,Roberto Michel,25,Analyst Programmer,4.17500359418653E+015,"431, 1032, 197, 684, 201, 206, 1022, 922, 556, 677","4, 9, 8, 5, 2, 3, 10, 67, 1"
5/15/2016;10:51:44,Reiko Branchet,22,Engineer III,374283539519835,"680, 495, 584, 161, 972, 1045, 615, 15, 788, 579","67, 1, 4, 2, 8, 9, 5, 3, 10"

The last two fields are lists of integers. I want Logstash to index them as lists, but I don't know how. I tried to use:

mutate {
  convert => { "[quantity]" => "[integer]" }
}

This is the error:
Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x1e22751a @id="1840e7e66654467e4855f4a7e3f7cab6e8ecc2e8-5", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x78f7d852 @metric=#<LogStash::Instrument::Metric:0x6978def3 @collector=#<LogStash::Instrument::Collector:0x8f987f9 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x57eb959 @store=#<Concurrent::map:0x00000000061fc4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x430dd4ec, @fast_lookup=#<Concurrent::map:0x00000000061fc8 entries=63 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"1840e7e66654467e4855f4a7e3f7cab6e8ecc2e8-5", :events]>, @logger=#<LogStash::Logging::Logger:0x28af810b @logger=#Java::OrgApacheLoggingLog4jCore::Logger:0xdcd1c90>, @filter=<LogStash::Filters::Mutate convert=>{"age"=>"integer", "[quantity]"=>"[integer]"}, id=>"1840e7e66654467e4855f4a7e3f7cab6e8ecc2e8-5", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register"}
[2017-08-07T12:49:10,469][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/opt/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.5/lib/logstash/filters/mutate.rb:189:in register'", "org/jruby/RubyHash.java:1342:ineach'", "/opt/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.5/lib/logstash/filters/mutate.rb:183:in register'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:281:inregister_plugin'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in register_plugins'", "org/jruby/RubyArray.java:1613:ineach'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in register_plugins'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:302:instart_workers'", "/opt/logstash-5.5.0/logstash-core/

This log entry is the interesting one:

[2017-08-07T12:49:10,469][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/opt/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.5/lib/logstash/filters/mutate.rb:189:in register'", "org/jruby/RubyHash.java:1342:ineach'", "/opt/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.5/lib/logstash/filters/mutate.rb:183:in register'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:281:inregister_plugin'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in register_plugins'", "org/jruby/RubyArray.java:1613:ineach'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in register_plugins'", "/opt/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:302:instart_workers'", "/opt/logstash-5.5.0/logstash-core/

Since you didn't format the posted log as preformatted text it has been mangled, but look in the original log and you'll find this:

Invalid conversion type '[integer]', expected one of 'string, integer, float, boolean'

To turn the string "67, 1, 4, 2, 8, 9, 5, 3, 10" into an array of those integers, use a mutate filter and its split option.
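For example, something like this (a minimal sketch, assuming the field is named quantity as in your header; note that convert expects a plain type name such as "integer", not "[integer]", which is what made the plugin fail to register):

filter {
  mutate {
    # turns the string "67, 1, 4, 2, ..." into the array ["67", " 1", " 4", " 2", ...]
    split => { "quantity" => "," }
  }
}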

Thank you for your reply. I used mutate in the config file as below:

filter {
  csv {
    separator => ","
    # transaction_date,customer_name,age,job,fidelity_card_id,product_id_sold,quantity
    columns => ["transaction_date","customer_name","age","job","fidelity_card_id","product_id_sold","quantity"]
  }

  date {
    match => [ "transaction_date", "M/d/yyyy;HH:mm:ss" ]
    target => "transaction_date"
  }

  mutate {
    convert => { "age" => "integer" }
    split => { "quantity" => "," }
    split => { "product_id_sold" => "," }
  }
}

The fields are still strings (have a look at the mapping in Kibana), so they didn't turn into arrays of integers. I also got this warning when loading the data with Logstash:

[WARN ][logstash.filters.csv ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}
2017-08-08T08:10:19.855Z vds002.insightscale.tn transaction_date,customer_name,age,job,fidelity_card_id,product_id_sold,quantity

Maybe the problem lies in the format of the CSV file!

This is the JSON document:
"product_id_sold": [
"723",
" 334",
" 297",
" 666",
" 182",
" 530",
" 978",
" 133",
" 229",
" 281"
],

So now I only need to convert the string values into integers. I tried it with convert but it did not work.

Please show your current configuration where you try to convert the elements of product_id_sold to integers.

filter {
  csv {
    separator => ","
    # transaction_date,customer_name,age,job,fidelity_card_id,product_id_sold,quantity
    columns => ["transaction_date","customer_name","age","job","fidelity_card_id","product_id_sold","quantity"]
  }

  date {
    match => [ "transaction_date", "M/d/yyyy;HH:mm:ss" ]
    target => "transaction_date"
  }

  mutate {
    convert => { "age" => "integer" }
    split => { "quantity" => "," }
    split => { "product_id_sold" => "," }
    convert => { "product_id_sold" => "integer" }
  }
}

How should I fix this?

Mutate actions aren't applied in the order listed in the configuration file. Their execution order is fixed, and it so happens that convert is applied before split. So, split your mutate filter into at least two to make sure the split takes place before you attempt the conversion.
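For example, a minimal sketch using the field names from your config, with the split in a first mutate and the conversion in a second one (these would replace your single mutate block, after the csv and date filters):

  # first mutate: split the comma-separated strings into arrays of strings
  mutate {
    split => { "quantity" => ","
               "product_id_sold" => "," }
  }

  # second mutate: convert now sees arrays, so every element gets converted to an integer
  mutate {
    convert => { "age" => "integer"
                 "quantity" => "integer"
                 "product_id_sold" => "integer" }
  }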

Thank you a lot, it worked for me!
