"exclude_lines" not working with docker input

Greetings,
My Elasticsearch is getting spammed by Filebeat, and I cannot manage to exclude certain lines from my logs (the reverse proxy ones).

EDIT: I'm fairly sure my exclude configuration is not being applied at all, because even when I tested with the catch-all pattern ".*", logs still ended up in my ES.

I have tried the following config with exclude_lines:

filebeat.inputs:
- type: docker
  combine_partial: true
  containers:
    path: "/var/lib/docker/containers"
    stream: all
    ids:
      - "*"
  exclude_lines: ['LOG: host_lookup_failed MAIN', 'no host name found for IP address', 'SMTP connection from']
  processors:
    - add_docker_metadata: ~
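
(A note for anyone hitting the same wall: the debug output further down shows autodiscover runners being stopped, which hints that the containers may be picked up by filebeat.autodiscover templates rather than by this static input. In that case, exclude_lines on the static input never sees those events; it would have to live on the template's input config instead. A rough sketch, assuming the docker autodiscover provider; adjust to your actual template:

filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - config:
            - type: docker
              combine_partial: true
              containers.ids:
                - "${data.docker.container.id}"
              exclude_lines: ['LOG: host_lookup_failed MAIN', 'no host name found for IP address', 'SMTP connection from']
)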

and this one with a drop_event processor:

filebeat.inputs:
- type: docker
  combine_partial: true
  containers:
    path: "/var/lib/docker/containers"
    stream: all
    ids:
      - "*"
  processors:
    - add_docker_metadata: ~
    - drop_event:
        when:
          contains:
            message: ['LOG: host_lookup_failed MAIN', 'no host name found for IP address', 'SMTP connection from']
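
(One thing worth double-checking in the drop_event variant: the contains condition matches a single string per field, and whether it accepts a list depends on the Beats version; if the list silently fails to match, nothing is dropped. A more defensive sketch with the same substrings combines several contains conditions under or:

  processors:
    - add_docker_metadata: ~
    - drop_event:
        when:
          or:
            - contains.message: 'LOG: host_lookup_failed MAIN'
            - contains.message: 'no host name found for IP address'
            - contains.message: 'SMTP connection from'
)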

but the results stay the same.

Any suggestions?

Could you please share some Filebeat debug logs from when it is parsing incoming events that should be dropped? (./filebeat -e -d "*")

Hey, thank you for your reply!
I am running Filebeat as a Docker container as well. Should I get inside the container to fetch those logs?

Anyway, if that's what you meant, here is what came up:

},
  "beat": {
    "version": "6.5.4",
    "name": "besmtp2",
    "hostname": "besmtp2"
  },
  "source": "/var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log",
  "message": " 4490   SMTP connection from [10.2.9.119] lost D=0s",
  "prospector": {
    "type": "docker"
  },
  "input": {
    "type": "docker"
  }
}
2019-11-19T10:41:17.786Z	DEBUG	[harvester]	log/log.go:102	End of file reached: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log; Backoff now.
2019-11-19T10:41:17.788Z	DEBUG	[publisher]	memqueue/ackloop.go:160	ackloop: receive ack [40: 0, 4]
2019-11-19T10:41:17.788Z	DEBUG	[publisher]	memqueue/eventloop.go:535	broker ACK events: count=4, start-seq=357, end-seq=360

2019-11-19T10:41:17.788Z	DEBUG	[publisher]	memqueue/ackloop.go:128	ackloop: return ack to broker loop:4
2019-11-19T10:41:17.788Z	DEBUG	[publisher]	memqueue/ackloop.go:131	ackloop:  done send ack
2019-11-19T10:41:17.789Z	DEBUG	[acker]	beater/acker.go:64	stateful ack	{"count": 4}
2019-11-19T10:41:17.789Z	DEBUG	[registrar]	registrar/registrar.go:345	Processing 4 events
2019-11-19T10:41:17.789Z	DEBUG	[registrar]	registrar/registrar.go:315	Registrar state updates processed. Count: 4
2019-11-19T10:41:17.789Z	DEBUG	[registrar]	registrar/registrar.go:400	Write registry file: /usr/share/filebeat/data/registry
2019-11-19T10:41:17.793Z	DEBUG	[registrar]	registrar/registrar.go:393	Registry file updated. 3 states written.
^C2019-11-19T10:41:18.105Z	DEBUG	[service]	service/service.go:50	Received sigterm/sigint, stopping
2019-11-19T10:41:18.106Z	INFO	beater/filebeat.go:449	Stopping filebeat
2019-11-19T10:41:18.106Z	INFO	[autodiscover]	cfgfile/list.go:118	Stopping 3 runners ...
2019-11-19T10:41:18.106Z	DEBUG	[autodiscover]	cfgfile/list.go:129	Stopping runner: input [type=docker, ID=9044332143424930136]
2019-11-19T10:41:18.106Z	INFO	input/input.go:149	input ticker stopped
2019-11-19T10:41:18.106Z	INFO	input/input.go:167	Stopping Input: 9044332143424930136
2019-11-19T10:41:18.106Z	DEBUG	[autodiscover]	cfgfile/list.go:129	Stopping runner: input [type=docker, ID=18244818695767697176]
2019-11-19T10:41:18.107Z	DEBUG	[autodiscover]	cfgfile/list.go:129	Stopping runner: input [type=docker, ID=8520339488523961116]
2019-11-19T10:41:18.108Z	INFO	input/input.go:149	input ticker stopped
2019-11-19T10:41:18.108Z	INFO	input/input.go:167	Stopping Input: 8520339488523961116
2019-11-19T10:41:18.107Z	INFO	log/harvester.go:275	Reader was closed: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log. Closing.
2019-11-19T10:41:18.107Z	INFO	input/input.go:149	input ticker stopped
2019-11-19T10:41:18.108Z	INFO	input/input.go:167	Stopping Input: 18244818695767697176
2019-11-19T10:41:18.108Z	DEBUG	[publish]	pipeline/client.go:148	client: closing acker
2019-11-19T10:41:18.109Z	DEBUG	[publish]	pipeline/client.go:150	client: done closing acker
2019-11-19T10:41:18.109Z	DEBUG	[publish]	pipeline/client.go:154	client: cancelled 0 events
2019-11-19T10:41:18.108Z	DEBUG	[publish]	pipeline/client.go:148	client: closing acker
2019-11-19T10:41:18.109Z	DEBUG	[publish]	pipeline/client.go:150	client: done closing acker
2019-11-19T10:41:18.109Z	DEBUG	[publish]	pipeline/client.go:154	client: cancelled 0 events
2019-11-19T10:41:18.108Z	DEBUG	[harvester]	log/harvester.go:510	Stopping harvester for file: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log
2019-11-19T10:41:18.110Z	DEBUG	[harvester]	log/harvester.go:520	Closing file: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log
2019-11-19T10:41:18.110Z	DEBUG	[harvester]	log/harvester.go:390	Update state: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log, offset: 4050931208
2019-11-19T10:41:18.110Z	DEBUG	[harvester]	log/harvester.go:531	harvester cleanup finished for file: /var/lib/docker/containers/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0/379da428bbfa7888d233dae319af7fbbfe188843879a003fe3d8a86b34782ae0-json.log
2019-11-19T10:41:18.111Z	DEBUG	[publish]	pipeline/client.go:148	client: closing acker
2019-11-19T10:41:18.111Z	DEBUG	[publish]	pipeline/client.go:150	client: done closing acker
2019-11-19T10:41:18.111Z	DEBUG	[publish]	pipeline/client.go:154	client: cancelled 0 events

Yes, that is what I meant. However, I am also interested in the log messages from before the event is published, which is when Filebeat decides whether to filter out or forward an event. Could you please share a bit more of your logs?

Is that better?

Publish event: {
  "@timestamp": "2019-11-20T10:17:05.456Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.5.4"
  },
  "stream": "stderr",
  "message": "12270 LOG: smtp_connection MAIN",
  "docker": {
    "container": {
      "id": "fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2",
      "name": "smtp-cluster_smtp_1",
      "image": "namshi/smtp:latest",
      "labels": {
        "com": {
          "docker": {
            "compose": {
              "oneoff": "False",
              "project": "smtp-cluster",
              "service": "smtp",
              "version": "1.24.0",
              "config-hash": "1fdbecb642a6add2409edb105bb3877103205ca57ea43e82a672ffb6c46a8ee0",
              "container-number": "1"
            }
          }
        }
      }
    }
  },
  "beat": {
    "version": "6.5.4",
    "name": "besmtp1",
    "hostname": "besmtp1"
  },
  "source": "/var/lib/docker/containers/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2-json.log",
  "tags": [
    "Filebeat-smtp"
  ],
  "input": {
    "type": "docker"
  },
  "prospector": {
    "type": "docker"
  },
  "host": {
    "name": "besmtp1"
  },
  "offset": 4278353124
}
2019-11-20T10:17:06.053Z	DEBUG	[publish]	pipeline/processor.go:308	Publish event: {
  "@timestamp": "2019-11-20T10:17:05.456Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.5.4"
  },
  "offset": 4278353226,
  "message": "12270   SMTP connection from [10.2.8.114] lost D=0s",
  "source": "/var/lib/docker/containers/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2-json.log",
  "prospector": {
    "type": "docker"
  },
  "input": {
    "type": "docker"
  },
  "stream": "stderr",
  "tags": [
    "Filebeat-smtp"
  ],
  "docker": {
    "container": {
      "labels": {
        "com": {
          "docker": {
            "compose": {
              "project": "smtp-cluster",
              "service": "smtp",
              "version": "1.24.0",
              "config-hash": "1fdbecb642a6add2409edb105bb3877103205ca57ea43e82a672ffb6c46a8ee0",
              "container-number": "1",
              "oneoff": "False"
            }
          }
        }
      },
      "id": "fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2",
      "name": "smtp-cluster_smtp_1",
      "image": "namshi/smtp:latest"
    }
  },
  "beat": {
    "name": "besmtp1",
    "hostname": "besmtp1",
    "version": "6.5.4"
  },
  "host": {
    "name": "besmtp1"
  }
}
2019-11-20T10:17:06.049Z	DEBUG	[acker]	beater/acker.go:64	stateful ack	{"count": 20}
2019-11-20T10:17:06.052Z	DEBUG	[publish]	pipeline/processor.go:308	Publish event: {
  "@timestamp": "2019-11-20T10:17:05.456Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.5.4"
  },
  "prospector": {
    "type": "docker"
  },
  "beat": {
    "name": "besmtp1",
    "hostname": "besmtp1",
    "version": "6.5.4"
  },
  "host": {
    "name": "besmtp1"
  },
  "input": {
    "type": "docker"
  },
  "offset": 4278353002,
  "stream": "stderr",
  "message": "12270   no host name found for IP address 10.2.8.114",
  "tags": [
    "Filebeat-smtp"
  ],
  "docker": {
    "container": {
      "name": "smtp-cluster_smtp_1",
      "image": "namshi/smtp:latest",
      "labels": {
        "com": {
          "docker": {
            "compose": {
              "oneoff": "False",
              "project": "smtp-cluster",
              "service": "smtp",
              "version": "1.24.0",
              "config-hash": "1fdbecb642a6add2409edb105bb3877103205ca57ea43e82a672ffb6c46a8ee0",
              "container-number": "1"
            }
          }
        }
      },
      "id": "fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2"
    }
  },
  "source": "/var/lib/docker/containers/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2/fd5616f6c389df136cb1a5c623682fc1b8992ee2698fb7c28bb2dcad8345ced2-json.log"
}

Any idea what the problem is?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.