Mutate string to geo point error (logstash)


(Claudio Gerarduzzi) #1

Good morning, community,

I have the following problem in my exercise, with this config file:
input {
  file {
    path => "/EADIC/Chicago_Crime_Data_2015.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["ID","Case Number","Date","Block","IUCR","Primary Type","Description","Location Description","Arrest","Domestic","Beat","District","Ward","Community Area","FBI Code","X Coordinate","Y Coordinate","Year","Updated On","Latitude","Longitude","Location"]
  }
  mutate {
    convert => { "Location" => "geo_point" }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "perejil"
  }
  stdout { }
}
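[Editor's note] The mutate filter's convert option only accepts basic target types (integer, float, string, boolean), so "geo_point" is rejected when the plugin registers — that is the invalid_plugin_register error in the log below. A common workaround, sketched here under the assumption that the Latitude and Longitude columns contain decimal degrees and using a combined field named location (a name chosen for this sketch, not from the original config), is to build a "lat,lon" string instead of converting:

```
filter {
  # ... keep the existing csv filter, then, instead of mutate/convert:
  mutate {
    # combine the coordinates into the "lat,lon" string form that
    # Elasticsearch accepts for a field mapped as geo_point
    add_field => { "location" => "%{Latitude},%{Longitude}" }
  }
}
```

Note that Elasticsearch will still index this field as text unless the index maps it as geo_point (for example via an index template), since the geo_point type cannot be inferred dynamically.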

I get the following error message:
[2018-11-04T10:45:39,827][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-04T10:45:39,885][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0xc17a565 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="3e47c60d5fa952df4b5ba019dd148fda7b20e3559a37a1b0d48044a8275a4ca1", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x3cb144c4>, @filter=<LogStash::Filters::Mutate convert=>{"Location"=>"geo_point"}, id=>"3e47c60d5fa952df4b5ba019dd148fda7b20e3559a37a1b0d48044a8275a4ca1", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x421877d5 run>"}
[2018-11-04T10:45:39,901][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/EADIC/logstash-6.4.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.3/lib/logstash/filters/mutate.rb:219:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/EADIC/logstash-6.4.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.3/lib/logstash/filters/mutate.rb:217:in `register'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "C:/EADIC/logstash-6.4.2/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x421877d5 run>"}
[2018-11-04T10:45:39,924][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-11-04T10:45:40,304][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

C:\EADIC\logstash-6.4.2\bin>logstash -f c:\eadic\logstash-6.4.2\config\chicago_crime.conf
Sending Logstash logs to C:/EADIC/logstash-6.4.2/logs which is now configured via log4j2.properties
[2018-11-04T10:50:44,305][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-04T10:50:45,359][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-11-04T10:50:45,500][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"c:/eadic/logstash-6.4.2/config/chicago_crime.conf"}
[2018-11-04T10:50:45,554][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2018-11-04T10:50:46,270][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

I have been checking the different options and it seems to be this way.

The index was created only with the basic settings (number of shards and number of replicas).
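[Editor's note] That is expected: without an explicit mapping, Elasticsearch 6.x dynamically maps string fields as text/keyword and cannot infer geo_point. A sketch of an index template that maps a location field as geo_point — the template name, the location field name, and the doc document type are assumptions based on Logstash 6.x defaults, not taken from the original post:

```
PUT _template/perejil_template
{
  "index_patterns": ["perejil"],
  "mappings": {
    "doc": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

The template must be installed before the index is (re)created, so an existing index would need to be deleted and reindexed for the mapping to apply.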

Can you help me? Thanks,
Claudio


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.