I'm not sure when this started, but my Filebeat has stopped collecting the logs of the Elasticsearch pods I deployed using ECK.
In total I have these three pods:
elastic-es-masterdata-100-0
elastic-es-masterdata-100-1
elastic-es-masterdata-100-2
Running kubectl logs against elastic-es-masterdata-100-0 returns plenty of output:
{"type": "server", "timestamp": "2020-12-28T09:55:14,039Z", "level": "INFO", "component": "o.e.x.s.a.TokenService", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "refresh keys" }
{"type": "server", "timestamp": "2020-12-28T09:55:14,304Z", "level": "INFO", "component": "o.e.x.s.a.TokenService", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "refreshed keys" }
{"type": "server", "timestamp": "2020-12-28T09:55:14,341Z", "level": "INFO", "component": "o.e.l.LicenseService", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "license [05fd3886-5ac3-4f63-af8d-98ed7adf3828] mode [basic] - valid" }
{"type": "server", "timestamp": "2020-12-28T09:55:14,342Z", "level": "INFO", "component": "o.e.x.s.s.SecurityStatusChangeListener", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "Active license is now [BASIC]; Security is enabled" }
{"type": "server", "timestamp": "2020-12-28T09:55:14,351Z", "level": "INFO", "component": "o.e.h.AbstractHttpServerTransport", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "publish_address {10.101.8.112:9200}, bound_addresses {0.0.0.0:9200}", "cluster.uuid": "vJDbksYdQoqxG-J2XIxFCg", "node.id": "bmuLp9vgRISUXQ92x3SHrg" }
{"type": "server", "timestamp": "2020-12-28T09:55:14,351Z", "level": "INFO", "component": "o.e.n.Node", "cluster.name": "elastic", "node.name": "elastic-es-masterdata-100-0", "message": "started", "cluster.uuid": "vJDbksYdQoqxG-J2XIxFCg", "node.id": "bmuLp9vgRISUXQ92x3SHrg" }
So everything looks fine on the Elasticsearch side.
However, on the Filebeat pod that runs on the same node, and should therefore be responsible for collecting these logs, I see the following:
2020-12-28T09:55:00.108Z INFO log/harvester.go:302 Harvester started for file: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log
2020-12-28T09:55:00.108Z INFO log/harvester.go:302 Harvester started for file: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log
2020-12-28T09:55:00.108Z INFO log/harvester.go:302 Harvester started for file: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log
2020-12-28T09:55:00.108Z INFO log/harvester.go:302 Harvester started for file: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log
2020-12-28T09:55:00.108Z INFO log/harvester.go:302 Harvester started for file: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log
2020-12-28T09:55:15.185Z ERROR fileset/factory.go:103 Error creating input: Can only start an input when all related states are finished: {Id: native::50472987-66305, Finished: false, Fileinfo: &{elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log 0 416 {980206655 63744746098 0x64d0ce0} {66305 50472987 1 33184 0 0 0 0 0 4096 0 {1609149298 980206655} {1609149298 980206655} {1609149298 980206655} [0 0 0]}}, Source: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log, Offset: 29213, Timestamp: 2020-12-28 09:55:14.754030991 +0000 UTC m=+418.853525927, TTL: -1ns, Type: container, Meta: map[], FileStateOS: 50472987-66305}
}', won't start runner: Can only start an input when all related states are finished: {Id: native::50472987-66305, Finished: false, Fileinfo: &{elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log 0 416 {980206655 63744746098 0x64d0ce0} {66305 50472987 1 33184 0 0 0 0 0 4096 0 {1609149298 980206655} {1609149298 980206655} {1609149298 980206655} [0 0 0]}}, Source: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log, Offset: 29213, Timestamp: 2020-12-28 09:55:14.754030991 +0000 UTC m=+418.853525927, TTL: -1ns, Type: container, Meta: map[], FileStateOS: 50472987-66305}
2020-12-28T09:55:15.193Z ERROR fileset/factory.go:103 Error creating input: Can only start an input when all related states are finished: {Id: native::50472987-66305, Finished: false, Fileinfo: &{elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log 0 416 {980206655 63744746098 0x64d0ce0} {66305 50472987 1 33184 0 0 0 0 0 4096 0 {1609149298 980206655} {1609149298 980206655} {1609149298 980206655} [0 0 0]}}, Source: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log, Offset: 29213, Timestamp: 2020-12-28 09:55:14.754030991 +0000 UTC m=+418.853525927, TTL: -1ns, Type: container, Meta: map[], FileStateOS: 50472987-66305}
}', won't start runner: Can only start an input when all related states are finished: {Id: native::50472987-66305, Finished: false, Fileinfo: &{elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log 0 416 {980206655 63744746098 0x64d0ce0} {66305 50472987 1 33184 0 0 0 0 0 4096 0 {1609149298 980206655} {1609149298 980206655} {1609149298 980206655} [0 0 0]}}, Source: /var/log/containers/elastic-es-masterdata-100-0_elastic-system_elasticsearch-e7afb8b67f88b2a036ffe91b427b6dd1c0dd3a0d0589372dba510ede9f2aa74d.log, Offset: 29213, Timestamp: 2020-12-28 09:55:14.754030991 +0000 UTC m=+418.853525927, TTL: -1ns, Type: container, Meta: map[], FileStateOS: 50472987-66305}
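For context, Filebeat runs as a DaemonSet, and each pod picks up the node it runs on via the downward API; that is what scopes the autodiscover provider in the config further down to pods on this node. A minimal sketch, assuming the standard Filebeat Kubernetes manifest (my actual DaemonSet wiring may differ slightly):

env:
  - name: NODE_NAME
    valueFrom:
      fieldRef:
        fieldPath: spec.nodeName  # node name injected into the pod, consumed as ${NODE_NAME} in the config below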
Does anyone have an idea what is going on? Log collection for all other pods is working fine.
Below is my Filebeat config:
filebeat.yml: |-
  output.elasticsearch.hosts: ['https://${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
  output.elasticsearch.protocol: "https"
  output.elasticsearch.ssl.verification_mode: "none"
  output.elasticsearch.username: ${ELASTICSEARCH_USERNAME}
  output.elasticsearch.password: "${ELASTICSEARCH_PASSWORD}"
  output.elasticsearch.index: "%{[kubernetes.namespace]}-filebeat-%{+xxxx.ww}"
  setup.template.enabled: true
  setup.template.name: "filebeat-%{[agent.version]}"
  setup.template.pattern: "*-filebeat-*"
  setup.template.order: 150
  setup.template.overwrite: true
  setup.ilm.enabled: false
  filebeat.autodiscover.providers:
    - type: kubernetes
      node: ${NODE_NAME}
      hints.enabled: true
      hints.default_config:
        type: container
        paths: ["/var/log/containers/*-${data.kubernetes.container.id}.log"]
        multiline.pattern: '^[[:space:]]'
        multiline.negate: false
        multiline.match: after
        exclude_lines: ["^\\s+[\\-`('.|_]"] # drop asciiart lines
  processors:
    - add_host_metadata:
        netinfo.enabled: false
    - add_cloud_metadata:
    - add_kubernetes_metadata:
        host: ${NODE_NAME}
        matchers:
          - logs_path:
              logs_path: "/var/log/containers/"
    - decode_json_fields:
        fields: ["message"]
        process_array: true
        max_depth: 1
        target: ""
        overwrite_keys: true
        add_error_key: true
    - drop_event: # namespaces to be excluded from logging
        when.or:
          - equals.kubernetes.namespace: "test"
          - equals.kubernetes.namespace: "default"