How to remove unwanted fields when sending log data from Filebeat

While sending data from a log file via Filebeat, through an ingest pipeline, to an index in Elasticsearch, some additional fields that are not present in the log file are also being populated.

These fields are mostly related to the host machine from which the log is sent. They appear as shown below:

  "ecs": {
    "version": "1.11.0"
  },
  "host": {
    "containerized": false,
    "ip": [ ... ],
    "mac": [ ... ],
    "hostname": "shi-Latitude-3510",
    "architecture": "x86_64",
    "os": {
      "platform": "ubuntu",
      "version": "20.04.3 LTS (Focal Fossa)",
      "family": "debian",
      "name": "Ubuntu",
      "kernel": "5.11.0-40-generic",
      "codename": "focal",
      "type": "linux"
    },
    "id": "3104710ca45f478eadec013d783b2b1c",
    "name": "shi-Latitude-3510"
  },
  "agent": {
    "id": "058ff2af-04fe-470d-bbe9-9a87caad3719",
    "name": "shi-Latitude-3510",
    "type": "filebeat",
    "version": "7.15.0",
    "hostname": "shi-Latitude-3510",
    "ephemeral_id": "d56a8699-657a-4c34-96d3-0b379ced4e7b"
  },
  "log": {
    "offset": 144730231,
    "file": {
      "path": "/home/shi/logfortifull/"
    }
  }

I need to remove them all.
For testing, I tried to remove one of them (ecs.version) using the following Filebeat configuration:

  paths:
    #- /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
    - /home/shi/logfortifull/
  json.keys_under_root: true
  json.message_key: log
  encoding: utf-8
  type: "otherfgdrp-virs"
  processors:
    - drop_fields:
        fields: ["ecs.version"]

But it gives the following error and does not remove the field. The field ecs.version is still present in the index and is searchable from Kibana as well.

2021-11-30T05:05:36.395+0530	DEBUG	[processors]	processing/processors.go:128	Fail to apply processor client{drop_fields={"Fields":["ecs.version"],"IgnoreMissing":false}}: failed to drop field [ecs.version]: key not found
2021-11-30T05:05:36.395+0530	DEBUG	[processors]	processing/processors.go:203	Publish event: {

How can I remove these fields? Is there any way of removing them at the Filebeat stage itself, e.g. using host.*?

Thanks and regards

I think it has to do with your structure. See the example below.

You have two spaces before processors; that should be at the beginning of the line.

processors:
  - drop_fields:
      fields: ["ecs.version"]
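As a side note, the "key not found" failure in your log happens because drop_fields fails by default when the field is absent from an event. It also accepts an ignore_missing option (you can see it as IgnoreMissing in the error message) to suppress that:

```yaml
processors:
  - drop_fields:
      fields: ["ecs.version"]
      ignore_missing: true   # do not fail when the field is absent from an event
```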


It worked when placed at the top level of the configuration, with the indentation you suggested.

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - drop_fields:
      fields: ["agent", "ecs", "host", "log"]

For example, specifying host removed all of the following:

"host": {
    "containerized": false,
    "ip": [ ... ],
    ...
}

As per the Filebeat documentation:

Where are processors valid?

At the top-level in the configuration. The processor is applied to all data collected by Filebeat.

Under a specific input. The processor is applied to the data collected for that input.
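The two placements described in the documentation can be sketched side by side (the input path is the one from this thread; the field lists are illustrative):

```yaml
# Top level: applied to all events collected by Filebeat
processors:
  - drop_fields:
      fields: ["agent", "ecs", "host", "log"]
      ignore_missing: true

filebeat.inputs:
- type: log
  paths:
    - /home/shi/logfortifull/
  # Input level: applied only to events from this input
  processors:
    - drop_fields:
        fields: ["ecs.version"]
        ignore_missing: true
```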

But when I configured it under a specific input, it gave errors or did not work.
However, the removal of fields needs to be applied to all input files, so in my case adding it at the top level of the configuration is fine.

Another observation: whatever fields I removed using Filebeat were the extra ones added by Filebeat itself, e.g. host. But if a field with the same name also appears in the log file itself, it does not get removed, and that was what I required. Is it because Filebeat considers each log row as a single entity?

I removed the fields from the log file that were not required at the ingest node pipeline, like this:

      "remove": {
        "field": "kvmsg"
      }
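For reference, a remove processor like the one above sits inside an ingest pipeline definition along these lines (runnable from the Kibana Dev Tools console; the pipeline name here is hypothetical):

```json
PUT _ingest/pipeline/drop-unwanted-fields
{
  "description": "Remove fields that should not be indexed",
  "processors": [
    {
      "remove": {
        "field": "kvmsg",
        "ignore_missing": true
      }
    }
  ]
}
```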

Is this a correct method for removing unwanted fields, whether they are added by Filebeat or present in the log files to be indexed?
Thanks and regards

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.