Logstash does not index events to Elasticsearch due to an issue with the host field

Hi all,

I'm having an issue with my Logstash. I upgraded my infrastructure from ELK 5.6 to ELK 6.8, and now from ELK 6.8 to ELK 7.8. We are upgrading the Filebeat agents, but Logstash is reporting this error on events coming from the upgraded agents:

> [2020-12-09T16:18:37,881][WARN ][logstash.outputs.elasticsearch][main][86d5828b720e271f87c05b5155cc2e6bbf9b07e6dd48dad4987f094b18eb47e4] **Could not index event to Elasticsearch**. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2020.12.09", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x5b8252c1>], :response=>{"index"=>{"_index"=>"filebeat-2020.12.09", "_type"=>"_doc", "_id"=>"j34VSHYBtWLU2PFlNMg0", "status"=>400, "error"=>{**"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [keyword]** in document with id 'j34VSHYBtWLU2PFlNMg0'. Preview of field's value: '{hostname=xxx, os={kernel=xxx, codename=xxx, name=xxx, family=redhat, version=6.9, platform=redhat}, containerized=false, ip=[xxx, xxx], name=filebeat-ESB-service, id=3be1ad0512b084e75683731c00000026, mac=[xxxxx], architecture=x86_64}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:1270"}}}}}

In my pipelines.yml there is this:

  if !([host]) {
    mutate { add_field => { "host" => "%{beat.hostname}" } }
  }
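
A side note on that conditional: from Beats 6.3 onward, events use ECS field names, so `beat.hostname` no longer exists (the hostname now lives under `[host][name]` and `[agent][name]`), which means `%{beat.hostname}` gets interpolated literally. A sketch of a version-tolerant fallback — the field names here assume the default ECS layout of 7.x events, so adjust them to your data:

```
filter {
  # Only add a fallback hostname when [host] is truly absent.
  # [agent][name] is the ECS successor of beat.hostname (assumed present).
  if ![host] and [agent][name] {
    mutate { add_field => { "[host][name]" => "%{[agent][name]}" } }
  }
}
```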

After removing this, the error disappeared; however, data from the 7.8 agents is still not arriving.

I tried all these solutions:

  1. Added this to my pipeline (in the filter section):
      mutate {
        remove_field => [ "[host]" ]
      }

Nothing changed.

  2. Tried, again in the pipeline's filter section, to convert host to a string (because in Filebeat's index pattern host is configured as a string):
      mutate {
        convert => { "host" => "string" }
      }

Nothing changed.
  3. In the output section I set this configuration for Filebeat:
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"

This produced other errors.

  4. I asked the sysadmins to set this in filebeat.yml:
       processors:
          - drop_fields:
              fields: ["host"]

But I suspect nothing will change in this case either.

Could you help me solve this?

Thanks a lot
S.

Hi,

Can you provide the mapping configuration for the index concerned?

Do you see any mismatch between the current type of the field and the type of the incoming data?
Did you change anything related to this field in Filebeat (apart from what you tried in order to solve the issue)? And what errors did you get when trying to delete the field in Filebeat?
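
If it helps, the actual index mapping (as opposed to the Kibana index pattern) can be fetched with something like this — the index name is taken from your error message, and `localhost:9200` is an assumption about where your cluster listens:

```
curl -s 'http://localhost:9200/filebeat-2020.12.09/_mapping?pretty'
```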

Hi @grumo35,

here is how the field is configured in the index pattern (I don't know if this covers your first question):

I noticed from the log that the host coming from the beat is an object, while the field is mapped as keyword (shown as a string in the index pattern); this is why I tried to mutate it into a string. I changed nothing about this on the Filebeat side. When I delete that part of the pipeline I no longer get errors, but documents from that Filebeat are still not written to the index.

Could you try to delete the index and recreate it with a new mapping?

There might be some issues related to the upgrades.
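
Alternatively, as a stopgap in Logstash, you could flatten the ECS host object back to a plain string so it matches the old keyword mapping. A sketch, assuming the events carry `[host][name]`; the operations are split into separate `mutate` blocks so they run in a predictable order:

```
filter {
  if [host][name] {
    # stash the hostname, drop the object, then re-add host as a string
    mutate { rename => { "[host][name]" => "[@metadata][hostname]" } }
    mutate { remove_field => [ "host" ] }
    mutate { add_field => { "host" => "%{[@metadata][hostname]}" } }
  }
}
```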

Maybe I solved it... it was the index template inherited from the previous version that contained the wrong mapping.

I deleted it, and documents from the 7.8 beats started arriving.
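
In case it helps others: a leftover legacy template can be inspected and removed with the `_template` API. The template name below is an assumption — list the templates first to find the stale one on your cluster:

```
# list legacy templates matching filebeat (pattern is an assumption)
curl -s 'http://localhost:9200/_template/filebeat*?pretty'

# delete the stale template so the 7.8 one takes over (name is an example)
curl -s -XDELETE 'http://localhost:9200/_template/filebeat-6.8.0'
```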

So you had an index template that was overriding the mapping with wrong data types, or the data types changed between the upgrades.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.