Mapper_parsing_exception & failed to parse

I'm trying to use the JSON filter plugin (Json filter plugin | Logstash Reference):

filter {
  json {
    skip_on_invalid_json => true
    source => "message"
  }
}

The Logstash log is getting bombarded with messages like this one:

logstash11 | [2018-02-20T03:23:10,028][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2018.02.20", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x62f0c359>], :response=>{"index"=>{"_index"=>"logstash-2018.02.20", "_type"=>"doc", "_id"=>"Qos8sWEBfs5EPH_FiQVE", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [level]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"DEBUG\""}}}}}

via Dev Tools:

GET /logstash-2018.02.20/doc/Qos8sWEBfs5EPH_FiQVE

{
  "_index": "logstash-2018.02.20",
  "_type": "doc",
  "_id": "Qos8sWEBfs5EPH_FiQVE",
  "found": false
}
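To check how the level field was actually mapped, something like this can be run in Dev Tools as well (a sketch; the index and type names are taken from the log message above):

GET /logstash-2018.02.20/_mapping/doc/field/level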

  1. At least one of the issues here seems to be an attempt to save a string where a numerical value is expected, hence the warning.
  2. The other issue is that messages are being dropped and never make it into Elasticsearch - is that a consequence of the first issue?

Please advise.

  1. If the level field has been mapped as a number and you're trying to index documents where that field contains a non-number, then that's certainly the cause of this (see the sketch after this list).
  2. Yes. Events will be dropped unless you've configured the dead letter queue feature.
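If the existing numeric mapping on level should stay as it is, one possible workaround is to rename the conflicting field in the Logstash filter before events reach the Elasticsearch output. A minimal sketch, reusing the JSON filter from the original config; the target name log_level is purely hypothetical:

filter {
  json {
    skip_on_invalid_json => true
    source => "message"
  }
  mutate {
    # move the string level ("DEBUG", "INFO", ...) out of the way of the
    # existing numeric mapping; "log_level" is a hypothetical field name
    rename => { "level" => "log_level" }
  }
}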
  1. Ok, I was right.
  2. Thanks, Dead Letter Queues | Logstash Reference [6.2] | Elastic (see the sketch below).
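For reference, a rough sketch of enabling the dead letter queue and later replaying the dropped events with the dead_letter_queue input plugin; the paths below are assumptions and depend on the installation:

# logstash.yml
dead_letter_queue.enable: true
path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue"

# separate pipeline that reprocesses dead-lettered events
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"
    pipeline_id => "main"
    commit_offsets => true
  }
}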
