Filebeat outputs the following error:
ERROR pipeline/output.go:92 Failed to publish events: temporary bulk send failure
filebeat version 6.2.2 (amd64), libbeat 6.2.2
The error occurs on log files with long messages (tens of lines each). On other files Filebeat works fine.
I've tried to play with the bulk_max_size option, but even setting bulk_max_size: 1 and reading just one log file did not make the problem go away.
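For reference, here is roughly the relevant part of my config (a minimal sketch; the path and host are placeholders for my real setup):

```yaml
filebeat.prospectors:            # 6.2 syntax; renamed to filebeat.inputs in later versions
  - type: log
    paths:
      - /var/log/myapp/*.log     # placeholder path to the problematic log files

output.elasticsearch:
  hosts: ["localhost:9200"]      # placeholder host
  bulk_max_size: 1               # reduced from the default of 50 while testing
```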
There is no point in checking the Elasticsearch logs; the errors occur on the Filebeat side.
As I mentioned above, only log files with long messages cause these errors, and Filebeat cannot handle them correctly.
I would have definitely expected some errors on the Elasticsearch side, as this error means one or more items from the bulk request were rejected by Elasticsearch.
Can you share your full Filebeat config file and run Filebeat with the debug log level enabled? This should give us more insight into what exactly happens when ES rejects part of the bulk request.
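For example (a sketch; the selector names are just the ones I would start with), you can pass debug selectors on the command line:

```
filebeat -e -d "elasticsearch,publish"
```

or set the log level in filebeat.yml:

```yaml
logging.level: debug
logging.selectors: ["elasticsearch", "publish"]
```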
It turned out the long log messages were not the problem; Filebeat can handle such files. In my case the affected log files were simply large, about 18 MB. After I cleared these files, Filebeat handled them correctly.
So, dear developers, please pay attention to this bug.