Hi Team,
We are testing Filebeat 5.6.2 in our development setup. Logs are shipped over SSL to a Logstash instance hosted by logz.io. We have noticed duplicate log entries in the Elasticsearch index, and at around the same time we see the following errors in the Filebeat logs:
2018-02-09T03:13:42Z ERR Failed to publish events caused by: EOF
2018-02-09T03:13:42Z INFO Error publishing events (retrying): EOF
2018-02-09T04:13:52Z ERR Failed to publish events caused by: EOF
2018-02-09T04:13:52Z INFO Error publishing events (retrying): EOF
2018-02-09T05:13:42Z ERR Failed to publish events caused by: EOF
2018-02-09T05:13:42Z INFO Error publishing events (retrying): EOF
2018-02-09T06:14:02Z ERR Failed to publish events caused by: EOF
2018-02-09T06:14:02Z INFO Error publishing events (retrying): EOF
2018-02-09T07:13:43Z ERR Failed to publish events caused by: EOF
2018-02-09T07:13:43Z INFO Error publishing events (retrying): EOF
2018-02-09T08:13:43Z ERR Failed to publish events caused by: EOF
2018-02-09T08:13:43Z INFO Error publishing events (retrying): EOF
2018-02-09T09:13:43Z ERR Failed to publish events caused by: EOF
2018-02-09T09:13:43Z INFO Error publishing events (retrying): EOF
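For reference, the relevant part of our filebeat.yml looks roughly like the sketch below; the listener host, port, and certificate path are placeholders rather than our exact values:

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/app/*.log        # placeholder path

output.logstash:
  # Placeholder listener endpoint for the hosted Logstash.
  hosts: ["listener.example.logz.io:5015"]
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca.crt"]
  # Seconds Filebeat waits for a response from Logstash before timing
  # out (default 30); timed-out batches are retried, which is how
  # duplicates can appear on the Elasticsearch side.
  timeout: 30
  # Maximum number of events sent to Logstash in a single batch.
  bulk_max_size: 2048
```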
We have tried the following to troubleshoot this issue:
- Checked whether the log entries were generated twice at the source; we did not find any duplicates in the source log files.
- Checked for any network-related issue that could drop the acknowledgement from Logstash, which would cause Filebeat to resend an already-delivered batch.
- As suggested on the Logstash community forum, we checked the client_inactivity_timeout value of the Logstash beats input; it is set to 5 minutes, so it does not appear that Logstash is closing the TCP connection due to inactivity (a sketch of where that setting lives is included after this list).
- The logz.io team checked whether they are seeing any errors on the Logstash side, but found nothing related to connection timeouts.
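For context, this is roughly what a Logstash beats input with that timeout looks like. We do not control logz.io's actual pipeline configuration, so the port and certificate paths below are placeholders; only the client_inactivity_timeout of 300 seconds reflects the value we were told:

```
input {
  beats {
    port => 5015                                            # placeholder port
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"    # placeholder
    ssl_key => "/etc/pki/tls/private/logstash.key"          # placeholder
    # Idle connections are closed by the beats input after this many
    # seconds; 300 (5 minutes) is the value reported for our listener.
    client_inactivity_timeout => 300
  }
}
```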
Can you please help us resolve this issue?
Thanks