Error=>"translation missing:

logstash config:

input {
  file {
    path => "C:/Pentaho/Workspace/ElasticSearch/logstash/in/sprzedaz.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  csv {
    separator => ";"
    columns => ["SF_type", "product_id", "product_category", "times", "product_description", "product_price", "product_quantity", "client_name", "client_id", "NSM_name", "NSM_no", "AM_name", "AM_no", "postal_code", "location"]
  }
  mutate {
    convert => {
      "SF_type" => "long"
      "product_id" => "long"
      "product_category" => "text"
      "times" => "text"
      "product_description" => "text"
      "product_price" => "float"
      "product_quantity" => "long"
      "client_name" => "text"
      "client_id" => "long"
      "NSM_name" => "text"
      "NSM_no" => "long"
      "AM_name" => "text"
      "AM_no" => "long"
      "postal_code" => "text"
      "location" => "geo_point"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "analiza_sprzedazy_lipiec"
  }
  stdout {}
}

And an error that says almost nothing:

[2018-08-11T23:55:07,241][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-08-11T23:55:07,520][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-08-11T23:55:07,594][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-08-11T23:55:07,602][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-08-11T23:55:07,653][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-08-11T23:55:07,700][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-08-11T23:55:07,777][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-08-11T23:55:07,907][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x4fe41ab9 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="8c08a6f24123f807ee33509c655a704af2c1e3bd357486f6c7081eb84ffd4ca6", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x5f1b9694 @metric=#<LogStash::Instrument::Metric:0x7a19a2b2 @collector=#<LogStash::Instrument::Collector:0x5ad42433 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x358adc02 @store=#<Concurrent::map:0x00000000000fb0 entries=2 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x7a9332e9, @fast_lookup=#<Concurrent::map:0x00000000000fb4 entries=65 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"8c08a6f24123f807ee33509c655a704af2c1e3bd357486f6c7081eb84ffd4ca6", :events]>, @filter=<LogStash::Filters::Mutate convert=>{"SF_type"=>"long", "product_id"=>"long", "product_category"=>"text", "times"=>"text", "product_description"=>"text", "product_price"=>"float", "product_quantity"=>"long", "client_name"=>"text", "client_id"=>"long", "NSM_name"=>"text", "NSM_no"=>"long", "AM_name"=>"text", "AM_no"=>"long", "postal_code"=>"text", "location"=>"geo_point"}, id=>"8c08a6f24123f807ee33509c655a704af2c1e3bd357486f6c7081eb84ffd4ca6", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x1d206d9c run>"}
[2018-08-11T23:55:07,958][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.2/lib/logstash/filters/mutate.rb:219:in block in register'", "org/jruby/RubyHash.java:1343:ineach'", "C:/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.2/lib/logstash/filters/mutate.rb:217:in register'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:340:inregister_plugin'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:351:in block in register_plugins'", "org/jruby/RubyArray.java:1734:ineach'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:351:in register_plugins'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:729:inmaybe_setup_out_plugins'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:361:in start_workers'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:288:inrun'", "C:/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x1d206d9c run>"}
[2018-08-11T23:55:08,027][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-08-11T23:55:08,624][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

"Error registering plugin" Which plugin?

"error=>"translation missing:" Which translation?

Looks like config is in 200% correct.

You cannot mutate+convert to a geo_point. You need an index template that defines the field as a geo_point. This thread might help you.

Nope, that is not the problem. I removed that line and the error is still there.
This log is useless; it explains nothing.

The problem is that the mutate filter's convert option does not accept any type other than string, integer and float (what a jumble, since in Elasticsearch a string is "text" and many other data types are accepted). Another mess with data...
Going by intuition, you would never expect one module to use different data types than another when both belong to the same stack. :tired_face:
Another piece of nonsense is that a special trick is needed to get a geo_point.
So I have to create the mapping first (even if only in a template) and then load the CSV.
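For reference, here is a sketch of the mutate block restricted to the types the convert option actually supports in Logstash 6.x; the "text" and "geo_point" targets are simply dropped, since string fields need no conversion (the csv filter already emits strings) and geo_point can only come from the index mapping:

```
filter {
  mutate {
    convert => {
      "SF_type"          => "integer"
      "product_id"       => "integer"
      "product_price"    => "float"
      "product_quantity" => "integer"
      "client_id"        => "integer"
      "NSM_no"           => "integer"
      "AM_no"            => "integer"
      # "location" cannot be converted here; it must be mapped
      # as geo_point in an index template on the Elasticsearch side
    }
  }
}
```

Note that "integer" is the convert target to use even for fields mapped as long in Elasticsearch; the filter validates against its own type names, not Elasticsearch's.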
Suppose the index (the data from the CSV) will be 'some_data_201807'; then I should first create a template:

PUT _template/apache-template
{
  "index_patterns": ["some_data*"],
  "mappings": {
    "doc": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}

But what should the CSV contain for the location column to be treated as a geo_point during load?

headerA;headerB;location
john;doe;{"lat": 51.04232, "lon": 16.6166}
...

Something like that?
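For what it's worth, the geo_point field type accepts several input representations (object, "lat,lon" string, geohash, [lon, lat] array). With a semicolon-separated CSV the simplest is probably the plain "lat,lon" string in the column, since it needs no JSON parsing, e.g.:

```
headerA;headerB;location
john;doe;51.04232,16.6166
```

Assuming the index template maps the field as geo_point, a string value in that form should be indexed directly; the JSON-object form would first have to be parsed out of the CSV cell into a real object.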

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.