Filebeat outputs the following error:
ERROR pipeline/output.go:92 Failed to publish events: temporary bulk send failure
filebeat version 6.2.2 (amd64), libbeat 6.2.2
The error occurs on log files with long messages (tens of lines). On other files Filebeat works fine.
I've tried playing with the bulk_max_size option, but even setting bulk_max_size: 1 and reading just one log file did not get rid of the problem.
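For reference, this is roughly what I mean - a minimal sketch of the output section I was tuning (the host is a placeholder, not my real setup):

```yaml
# filebeat.yml (sketch) - the Elasticsearch output section where
# bulk_max_size controls how many events go into a single bulk request.
output.elasticsearch:
  hosts: ["localhost:9200"]  # placeholder host
  bulk_max_size: 1           # tried values down to 1; the error still occurred
```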
Can you share your filebeat config?
I would recommend having a look at the Elasticsearch logs; perhaps you will see more details about the error there.
There is no point in viewing the Elasticsearch logs - the errors occur on the Filebeat side.
As I mentioned above, only log files with long messages cause these errors, and Filebeat cannot handle them correctly.
Normally the bulk send failure is caused by an error on the Elasticsearch side. Knowing which error Elasticsearch returns would be helpful here.
Alas, there are no errors on the Elasticsearch side.
I would definitely have expected some errors on the Elasticsearch side, as this error means one or more items from the bulk request were rejected by Elasticsearch.
Can you share your full Filebeat config file and run Filebeat with the debug log level enabled? This should give us some more insight into what exactly happens when ES rejects part of the bulk request.
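A minimal sketch of how debug logging can be enabled in filebeat.yml (the selectors value shown here enables all debug components, which can be very verbose):

```yaml
# filebeat.yml (sketch) - enable debug logging so the bulk request
# handling and any Elasticsearch rejections are logged in detail.
logging.level: debug
logging.selectors: ["*"]  # all components; narrow this down if the output is too noisy
```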
It turned out the problem was not the long log messages themselves - Filebeat could handle such files. In my case, the problematic log files were simply large, about 18 MB. After I cleared these files, Filebeat handled them correctly.
So, dear developers, please pay attention to this bug.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.