Filebeat v7.3.2 - Failed to publish events caused by: EOF

I'm getting these errors about 20-30 minutes after starting Filebeat, and they keep recurring from then on:

Sep 20 15:42:39 silver.smartabase.com filebeat[7615]: WARN beater/filebeat.go:368 Filebeat is unable to load the Ingest Node pipeline...warning.
Sep 20 16:00:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: EOF
Sep 20 16:00:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: client is not connected
Sep 20 16:00:12 silver.smartabase.com filebeat[7615]: ERROR pipeline/output.go:121 Failed to publish events: client is not connected
Sep 20 16:20:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: EOF
Sep 20 16:20:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: client is not connected
Sep 20 16:20:11 silver.smartabase.com filebeat[7615]: ERROR pipeline/output.go:121 Failed to publish events: client is not connected
Sep 20 16:50:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: EOF
Sep 20 16:50:10 silver.smartabase.com filebeat[7615]: ERROR logstash/async.go:256 Failed to publish events caused by: client is not connected
Sep 20 16:50:12 silver.smartabase.com filebeat[7615]: ERROR pipeline/output.go:121 Failed to publish events: client is not connected

Running the latest v7 on RHEL 7.7:
filebeat version 7.3.2 (amd64), libbeat 7.3.2 [5b046c5a97fe1e312f22d40a1f05365621aad621 built 2019-09-06 13:49:32 +0000 UTC]
Linux silver 3.10.0-1062.1.1.el7.x86_64 #1 SMP Tue Aug 13 18:39:59 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

I've restarted the service a few times over 2 days, and the results are consistent...

Any help would be greatly appreciated... Thanks

With debugging enabled:
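
(The post doesn't say how debug output was turned on; a minimal sketch, assuming it was enabled in filebeat.yml rather than via the -d "*" CLI flag:)

logging.level: debug
logging.selectors: ["*"]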

DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [registrar] registrar/registrar.go:356 Processing 1 events
DEBUG [registrar] registrar/registrar.go:326 Registrar state updates processed. Count: 1
DEBUG [registrar] registrar/registrar.go:411 Write registry file: /var/lib/filebeat/filebeat/data.json (4)
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [transport] transport/client.go:218 handle error: EOF
DEBUG [transport] transport/client.go:131 closing
ERROR logstash/async.go:256 Failed to publish events caused by: EOF
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [logstash] logstash/async.go:116 close connection
DEBUG [logstash] logstash/async.go:116 close connection
ERROR logstash/async.go:256 Failed to publish events caused by: client is not connected
ERROR pipeline/output.go:121 Failed to publish events: client is not connected
INFO pipeline/output.go:95 Connecting to backoff(async(tcp://listener-eu.logz.io:5015))
DEBUG [logstash] logstash/async.go:111 connect
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
INFO pipeline/output.go:105 Connection to backoff(async(tcp://listener-eu.logz.io:5015)) established
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [publisher] memqueue/ackloop.go:160 ackloop: receive ack [10: 0, 4]
DEBUG [publisher] memqueue/eventloop.go:535 broker ACK events: count=1, start-seq=6, end-seq=6

DEBUG [publisher] memqueue/eventloop.go:535 broker ACK events: count=3, start-seq=111, end-seq=113

DEBUG [publisher] memqueue/ackloop.go:128 ackloop: return ack to broker loop:4
DEBUG [publisher] memqueue/ackloop.go:131 ackloop: done send ack
DEBUG [acker] beater/acker.go:64 stateful ack {"count": 4}
DEBUG [registrar] registrar/registrar.go:356 Processing 4 events
DEBUG [registrar] registrar/registrar.go:326 Registrar state updates processed. Count: 4
DEBUG [registrar] registrar/registrar.go:411 Write registry file: /var/lib/filebeat/filebeat/data.json (4)
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.

One more example:

DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [registrar] registrar/registrar.go:356 Processing 1 events
DEBUG [registrar] registrar/registrar.go:326 Registrar state updates processed. Count: 1
DEBUG [registrar] registrar/registrar.go:411 Write registry file: /var/lib/filebeat/filebeat/data.json (4)
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [transport] transport/client.go:218 handle error: EOF
DEBUG [transport] transport/client.go:131 closing
ERROR logstash/async.go:256 Failed to publish events caused by: EOF
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [logstash] logstash/async.go:116 close connection
DEBUG [logstash] logstash/async.go:116 close connection
ERROR logstash/async.go:256 Failed to publish events caused by: client is not connected
ERROR pipeline/output.go:121 Failed to publish events: client is not connected
INFO pipeline/output.go:95 Connecting to backoff(async(tcp://listener-eu.logz.io:5015))
DEBUG [logstash] logstash/async.go:111 connect
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.
INFO pipeline/output.go:105 Connection to backoff(async(tcp://listener-eu.logz.io:5015)) established
DEBUG [logstash] logstash/async.go:159 4 events out of 4 events sent to logstash host listener-eu.logz.io:5015. Continue sending
DEBUG [publisher] memqueue/ackloop.go:160 ackloop: receive ack [20: 0, 4]
DEBUG [publisher] memqueue/eventloop.go:535 broker ACK events: count=3, start-seq=164, end-seq=166

DEBUG [publisher] memqueue/eventloop.go:535 broker ACK events: count=1, start-seq=14, end-seq=14

DEBUG [publisher] memqueue/ackloop.go:128 ackloop: return ack to broker loop:4
DEBUG [publisher] memqueue/ackloop.go:131 ackloop: done send ack
DEBUG [acker] beater/acker.go:64 stateful ack {"count": 4}
DEBUG [registrar] registrar/registrar.go:356 Processing 4 events
DEBUG [registrar] registrar/registrar.go:326 Registrar state updates processed. Count: 4
DEBUG [registrar] registrar/registrar.go:411 Write registry file: /var/lib/filebeat/filebeat/data.json (4)
DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 4 states written.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/cron; Backoff now.
DEBUG [harvester] log/log.go:102 End of file reached: /var/log/messages; Backoff now.

It seems to be an issue with the connection to Logstash.

Please provide the config files for both Filebeat and Logstash.

Thank you.
Logstash is managed by logz.io, an external provider.
The logs are arriving fine at the destination, so it doesn't look like a connection problem; the errors persist even though delivery is working.
Also, a "sister" VM, set up at the same time in the same way via Ansible (with only a few different packages installed for its different use case), does NOT exhibit this behaviour, which is quite puzzling... filebeat.yml is identical on both VMs...
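
(The actual filebeat.yml wasn't posted, so the following is only a sketch of what the Logstash output section presumably looks like, reconstructed from the host and port visible in the debug output; any TLS or logz.io-specific token settings are omitted because they don't appear anywhere in the thread.)

output.logstash:
  hosts: ["listener-eu.logz.io:5015"]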

Can you also check the metrics written periodically by Filebeat?

If enabled, Filebeat periodically logs its internal metrics that have changed in the last period. For each metric that changed, the delta from the value at the beginning of the period is logged. Also, the total values for all non-zero internal metrics are logged on shutdown. The default is true.

logging.metrics.enabled: true

The period after which to log the internal metrics. The default is 30s.

logging.metrics.period: 30s
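
(For reference, the equivalent nested form of these two settings in filebeat.yml; the values here are just the defaults quoted above:)

logging:
  metrics:
    enabled: true
    period: 30s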
