Mapping error while writing to index

Hello everyone,

I am trying to use Logstash to read from an InfluxDB via the http_poller input. This seems to work when writing to a file, but when I write to an Elasticsearch index I get the following error:

[2020-12-18T08:59:14,281][WARN ][logstash.outputs.elasticsearch][pipeline-influx][fcc50545203dc122f4f424dcfa67856254236a5f68502e79e4186915fd06e02f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"powermax", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x78e0add1>], :response=>{"index"=>{"_index"=>"powermax-2020.12.18-000001", "_type"=>"_doc", "_id"=>"dpHcdHYBSaOtOdw6KMDe", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"unknown parameter [norms] on mapper [values] of type [null]"}}}}
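For context, the InfluxDB `/query` endpoint returns JSON roughly shaped like this (a sketch — the series name `Array` comes from my query, but the column names and values shown here are only illustrative placeholders):

```json
{
  "results": [
    {
      "statement_id": 0,
      "series": [
        {
          "name": "Array",
          "columns": ["time", "value"],
          "values": [["2020-12-18T08:59:00Z", 42]]
        }
      ]
    }
  ]
}
```

Note that `values` is an array of arrays, and the mapper error above complains about exactly that field (`mapper [values] of type [null]`).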

Here is my Logstash config:

# Manage Read from influx DB

input {
  http_poller {
    urls => {
      test1 => "http://server:8086/query?pretty=true&db=inspectit&q=SELECT%20*%20FROM%20%22Pmax%22.%22autogen%22.%22Array%22"
    }
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
  }
}

filter {
  split {
    field  => "results"
  }
  split {
    field  => "[results][series]"
  }
}
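Since the split filters still leave `[results][series][values]` as an array of arrays, I wonder whether flattening it into named fields would avoid the mapping error. A sketch of what I have in mind (assuming the arrays are the problem — the field handling here is untested):

```
filter {
  split { field => "results" }
  split { field => "[results][series]" }
  # one event per row of the values array
  split { field => "[results][series][values]" }
  ruby {
    code => '
      columns = event.get("[results][series][columns]")
      row     = event.get("[results][series][values]")
      if columns && row
        # map each column name to the value at the same position
        columns.each_with_index { |name, i| event.set(name, row[i]) }
      end
      event.remove("results")
    '
  }
}
```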

output {
   #stdout { codec => rubydebug }
   stdout { codec => json }
   #file {
   #  path => ["/var/log/logstash/debug_stream_pmax.log"]
   #  flush_interval => 500
   #  #codec => plain { charset => "UTF-8" }
   #  codec => json
   #}
   elasticsearch {
      ilm_rollover_alias => "pmax"
      hosts => ["https://host1:9200","https://host2:9200","https://host3:9200"]
      user => "elastic"
      password => "...."
      ssl => true
      cacert => "/etc/pki/CA/certs/....cer"
      codec => rubydebug
   }
}

Writing to the file works, writing to the index does not. What am I doing wrong?

Kind regards
Boris
