Logstash: Could not index event to Elasticsearch

I have many many errors in Logstash log:

[2019-04-09T11:24:48,037][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"applicationlog-nolevel-2019.04.09", :_type=>"applicationLog", :routing=>nil}, #<LogStash::Event:0x23524674>], :response=>{"index"=>{"_index"=>"applicationlog-nolevel-2019.04.09", "_type"=>"applicationLog", "_id"=>"RdbXAWoBMa_P-s105po6", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:9"}}}}}

I understand that there is an existing template in ES, but something is going wrong: the log messages now contain a different type for the [host] field.

I enabled the dead letter queue in logstash.yml:

dead_letter_queue.enable: true
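
For reference, a minimal pipeline roughly like the one below can read the queue back and print the failed events. The path is an assumption here; by default the queue lives under <path.data>/dead_letter_queue/<pipeline_id>:

input {
  dead_letter_queue {
    # adjust to your path.data; the plugin expects the dead_letter_queue directory
    path => "/var/lib/logstash/data/dead_letter_queue"
    commit_offsets => true
  }
}
output {
  # print the full event, including @metadata, to the console
  stdout { codec => rubydebug { metadata => true } }
}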

Then I tried to analyze the "bad" messages from the queue:

Could not index event to Elasticsearch.
status: 400,
action: ["index", {:_id=>nil, :_index=>"applicationlog-nolevel-2019.04.09",
:_type=>"applicationLog", :routing=>nil}, #<LogStash::Event:0x52502ed4>],
response: {"index"=>{"_index"=>"applicationlog-nolevel-2019.04.09",
"_type"=>"applicationLog", "_id"=>"-PQuAmoB8kmt6Ez0MzkO", "status"=>400,
"error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse [host]",
"caused_by"=>{"type"=>"illegal_state_exception",
"reason"=>"Can't get text on a START_OBJECT at 1:69"}}}}

The rest of the DLQ entry is the serialized event itself (binary, so only the field names and values are readable). The fields I can make out are:

@timestamp:             2019-04-09T12:58:59.919Z
application:            AAA
host:                   { "name" => "MY-HOST1" }   (an object, not a plain string)
instanceId:             MY-HOST1
level:                  NOLEVEL
applicationEnvironment: QA
message:                2019-04-09 08:58:59,871 INFO [stdout] (Thread-231 (client-global-threads-1377496603)) Priority: 9 Priority Level: high
tags:                   [ beats_input_codec_plain_applied, _grokparsefailure ]
log:                    { "file" => { "path" => "/AAA_LOGS/server.log" } }
@version:               1
META:                   { "type" => "applicationLog" }
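
If the problem really is the object-shaped host, I guess something like the following in the filter section would flatten it back to a plain string before the elasticsearch output. This is just a sketch of what I mean; the temporary field name is made up:

filter {
  # Beats-style events carry host as an object; keep only host.name as a string
  if [host][name] {
    mutate { add_field => { "[@metadata][host_name]" => "%{[host][name]}" } }
    mutate { remove_field => [ "host" ] }
    mutate { add_field => { "host" => "%{[@metadata][host_name]}" } }
  }
}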

In my ES template (GET /_template) there is a host_ip field with type "ip":

          "host_ip": {
            "type": "ip"
          },
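
To double-check what [host] itself is actually mapped to, I assume I can run something like this from Kibana Dev Tools (index and template names taken from the log above):

GET /_template/applicationlog*
GET /applicationlog-nolevel-2019.04.09/_mapping/field/host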

To clarify this issue: is it correct that Logstash is trying to index the message with a different type for this field than the one already in the mapping?

(Mapped as) "ip" => (now trying to index) "string"?
