Pipeline aborted due to error, please help

I'm working with Logstash, currently version 7.6.
I have 44 columns; 4 of them are dates but contain nothing.
I declared them as date and it's not accepted.
I didn't declare them and it's still not accepted (because I know they're taken as string by default).
So, as a workaround, I'm trying to force them to string.
I tried this config, but an error is shown. Please help.

input {
  file {
    path => "/home/user/data/test-536.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Supervision Period Warning Played Flag", ......., "Community Id 3"]
  }
  mutate {
    convert => {
      "Supervision Period Warning Played Flag" => "integer"
      .............
      "Community Id 3" => "integer"
    }
  }
  mutate {
    convert => { "Negative Balance Barring Start Date" => "string" }
    convert => { "Last Service Fee Deduction Date" => "string" }
  }
  mutate {
    add_field => { "D" => ["2020-01-01"] }
  }
  date {
    match => [ "Supervision Period Expiry Date", "YYYY-MM-dd" ]
    target => "Supervision Period Expiry Date"
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "sub1"
    document_type => "subscriber"
  }
  stdout {}
}

And my error is:
[2020-02-25T11:55:29,007][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-02-25T11:55:29,092][ERROR][logstash.javapipeline ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/home/samar/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.5.0/lib/logstash/filters/mutate.rb:222:in block in register'", "org/jruby/RubyHash.java:1428:in each'", "/home/samar/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.5.0/lib/logstash/filters/mutate.rb:220:in register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in register'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:200:in block in register_plugins'", "org/jruby/RubyArray.java:1814:in each'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:199:in register_plugins'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:502:in maybe_setup_out_plugins'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:212:in start_workers'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:154:in run'", "/home/samar/logstash/logstash-core/lib/logstash/java_pipeline.rb:109:in `block in start'"], "pipeline.sources"=>["/home/samar/logstash/config/test.config"], :thread=>"#<Thread:0x29692259 run>"}
[2020-02-25T11:55:29,149][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2020-02-25T11:55:29,217][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}

Hi

Try commenting out the second mutate{}, where you convert the date fields to string, and all the date{} plugin instances, and see what you get from stdout{}.

You can also comment out your elasticsearch{} output for now (no need to send anything there until you get it right).

I'm thinking you won't need that instance of mutate{}, because your data is already a string coming out of the csv{} filter. Then you'll have to "protect" your date{} instances with if statements that check for empty fields. But I'd start, as suggested above, by checking what you get without filtering the data.
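For example, a stripped-down config for that first debugging step might look something like this (a sketch based on your config; keep your full 44-column list where it is shortened here):

input {
  file {
    path => "/home/user/data/test-536.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Supervision Period Warning Played Flag", "Community Id 3"]  # keep your full 44-column list here
  }
  # mutate{} conversions and date{} left out while debugging
}
output {
  stdout { codec => rubydebug }
}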

Hope this helps.

Okay, thank you.
But how can I deal with the empty fields that I have?
Do I need to set a condition on them, or what?

Hi

Exactly. The date{} plugin will "crash" if the field is empty, so you have to wrap it inside an if statement that ensures date{} is called only if the field is not empty.

You can read about conditionals here: https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#conditionals, and see some examples here: https://www.elastic.co/guide/en/logstash/current/config-examples.html#using-conditionals
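For example, a guarded version of the date{} block from your config could look like this (a sketch; adjust the field name for each of your date columns):

if [Supervision Period Expiry Date] and [Supervision Period Expiry Date] != "" {
  date {
    # only runs when the field exists and is non-empty
    match  => [ "Supervision Period Expiry Date", "YYYY-MM-dd" ]
    target => "Supervision Period Expiry Date"
  }
}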

Hope this helps.

Well, I'm trying to find a condition for empty fields but nothing is working.
I found this:

csv {
  source => "message"
}
if ! [""] {
  mutate {
    update => { "" => " " }
  }
}

I tried it, but it's not working.

mutate does not support "long" as a type it can convert to; that would explain the invalid_plugin_register error raised when the mutate filter registers.
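For instance, if one of the elided convert entries used "long", changing it to a supported target such as "integer" should let the filter register (a sketch reusing one of the column names from the config above):

mutate {
  convert => {
    # "long" is not a valid conversion target for the mutate filter; "integer" is
    "Community Id 3" => "integer"
  }
}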

I found the solution: add a ruby filter that checks whether a field is empty; if it is, it removes it, otherwise it keeps it.
input {
  file {
    path => "/home/user/data/test-536-Copie.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Service Fee Period Warning Played Flag", ........, "Community Id 3"]
  }
  ruby {
    code => "
      hash = event.to_hash
      hash.each do |k,v|
        if v == nil
          event.remove(k)
        end
      end
    "
  }
  mutate {
    convert => {
      "Supervision Period Warning Played Flag" => "integer"
      ..........
      "Community Id 3" => "integer"
    }
  }
  mutate {
    add_field => { "D" => ["2020-01-01"] }
  }
  date {
    match => [ "Supervision Period Expiry Date", "YYYY-MM-dd" ]
    target => "Supervision Period Expiry Date"
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "sub122"
    document_type => "subscriber"
  }
  stdout { codec => rubydebug }
}
