How to fix the host field issue in Filebeat 7.17.4 without drop_fields

Hi Team,

I'm stuck on an issue: if I drop the host field with drop_fields, indexing works; otherwise it fails.

Working configuration:

processors:
  - drop_fields:
      fields: ["beat", "offset", "source", "type", "input_type", "host"]
      ignore_missing: true

Not working configuration:

processors:
  - drop_fields:
      fields: ["beat", "offset", "source", "type", "input_type"]
      ignore_missing: true

The error:

Could not index event to Elasticsearch. status: 400, action:
"type" => "mapper_parsing_exception",
"reason" => "failed to parse field [host] of type [text] in document with id ''. Preview of field's value: '{name=hostname}'",
"caused_by" => {
  "type" => "illegal_state_exception",
  "reason" => "Can't get text on a START_OBJECT at 1:468"
}

Please help with this.
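The error suggests the index mapping expects `host` to be `text` (a scalar), while Filebeat 7.x sends `host` as an object (`{name=hostname}`), so the two conflict. One way to keep the host information without the object, as a sketch (the flat field name `host_name` is my own placeholder, not something from your config), is to copy the scalar `host.name` out before dropping the object:

```yaml
processors:
  # Copy the scalar host.name into a flat field that won't clash with the mapping
  - copy_fields:
      fields:
        - from: host.name
          to: host_name
      fail_on_error: false
      ignore_missing: true
  # Then drop the conflicting host object along with the other fields
  - drop_fields:
      fields: ["beat", "offset", "source", "type", "input_type", "host"]
      ignore_missing: true
```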

I'm not getting any response from the Elasticsearch community either. If someone has worked on Filebeat configuration before, please let me know.

Can you check the <indexname>/_mapping/host endpoint in Elasticsearch and see what it returns? It looks like something has gone wrong there.
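For reference, the field mapping can also be queried like this — a sketch assuming a local cluster on the default port, with `my-index` as a placeholder for your actual index name:

```shell
# Ask Elasticsearch how the host field is currently mapped
# "my-index" is a placeholder; substitute your real index name
curl -s 'http://localhost:9200/my-index/_mapping/field/host?pretty'
```

If the response shows `"type": "text"`, that mapping will conflict with the host object Filebeat sends.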

Here are a few more problems I'm hitting in Filebeat:

First, I'm using output.kafka:, and it has never allowed me to send a custom host field from filebeat.inputs:. Because of this I'm blocked from passing the host name.
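One workaround I can sketch (untested against a Kafka output; the path, field name, and value below are placeholders) is to attach a custom field per input via `fields:`. Note that a field literally named `host` combined with `fields_under_root: true` would collide with Filebeat's own host object, so a distinct name is safer:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log      # placeholder path
    # Custom metadata attached to every event from this input
    fields:
      host_name: my-host-01     # placeholder value
    fields_under_root: true     # put host_name at the event root instead of under fields.*
```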

Second, even if I ignore the host problem, I can't run two YML files in one Beat; it fails with this error:

Exiting: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (path.data).
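That lock error is expected: each running Filebeat instance needs its own `path.data` (and ideally its own `path.logs`). If you want two instances rather than one merged config, a sketch (config file names and data directories are placeholders):

```shell
# Run two independent Filebeat instances with separate data directories
filebeat -c /etc/filebeat/app1.yml --path.data /var/lib/filebeat-app1 &
filebeat -c /etc/filebeat/app2.yml --path.data /var/lib/filebeat-app2 &
```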

To overcome this problem, I tried to merge the two YML files into one using filebeat.config.inputs:, but it doesn't work in my case.
Below are the config details:

filebeat.config.inputs:
  enabled: true
  path: inputs.d/*.yml
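One thing that commonly trips people up with `filebeat.config.inputs` is that each file under `inputs.d/` must contain a bare list of inputs, not a `filebeat.inputs:` key, and a relative `path` is resolved against the Filebeat config directory (writing it as `${path.config}/inputs.d/*.yml` makes the base directory explicit). A sketch of one such file, with placeholder paths:

```yaml
# inputs.d/app1.yml — a bare list of inputs, not wrapped in filebeat.inputs:
- type: log
  paths:
    - /var/log/app1/*.log   # placeholder path
  processors:
    - drop_fields:
        fields: ["beat", "offset", "source", "type", "input_type"]
        ignore_missing: true
```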

Any suggestions?
