How to remove unneeded fields while sending log data from Filebeat

Sir,

It worked when placed at the top level of the configuration, with the indentation you suggested.

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - drop_fields:
      fields: ["agent", "ecs", "host", "log"] 

For example, specifying "host" removed all of the following:

"host": {
    "containerized": false,
    "ip": [
      "192.168.1.5",
      "fe80::cd8f:a19c:a6b2:2628"
    ],
    ...
}

As per the Filebeat documentation:

Where are processors valid?

At the top-level in the configuration. The processor is applied to all data collected by Filebeat.

Under a specific input. The processor is applied to the data collected for that input.

But when I configured it under a specific input, it gave errors or did not work.
In any case, the removal of fields needs to be applied to all input files, so placing it at the top level of the configuration is fine in my case.
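In case it helps others, this is roughly how I understand the per-input form should look, with the processors key indented under the input itself (the input path here is just an example, not my real one):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # example path only
    # processors at this level apply only to this input
    processors:
      - drop_fields:
          fields: ["agent", "ecs", "host", "log"]
```

Getting that indentation wrong (e.g. aligning processors with filebeat.inputs instead of under the input) seems to be what causes the errors.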

Another observation: whatever fields I removed using Filebeat were the extra ones added by Filebeat itself, e.g. host. But if a field with the same name also appears inside the log file, it does not get removed, and removing those was what I actually needed. Is it because Filebeat treats each log row as a single entity?

So I removed the unneeded fields from the log file at the ingest node pipeline, like this:

{
  "remove": {
    "field": "kvmsg"
  }
}
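For context, that remove processor sits inside the full pipeline definition, which I created from Kibana Dev Tools like the sketch below (the pipeline name is just an example):

```json
PUT _ingest/pipeline/my-logs-pipeline
{
  "description": "drop fields from the log line that are not required at index time",
  "processors": [
    {
      "remove": {
        "field": "kvmsg"
      }
    }
  ]
}
```

The pipeline is then referenced from the Filebeat output (e.g. output.elasticsearch.pipeline) or an index default_pipeline setting, so it runs on every document before indexing.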

Is this the correct method for removing unneeded fields, whether they are added by Filebeat or present in the log files to be indexed?
Thanks and regards
shini