Bulk Size error on filebeat

Hi there
I'm seeing this in the filebeat log file, which I found just by accident while debugging an issue:

2017-05-28T18:35:26Z INFO Error publishing events (retrying): 413 Request Entity Too Large

In the configuration file there is a setting with a comment that says not to modify it:

# The internal queue size for bulk events in the processing pipeline.
# Do not modify this value.
#bulk_queue_size: 0

How can I get rid of those errors? I also guess I'm losing data because of this, so how can I fix it?

filebeat-5.4.0-1.x86_64

Thanks in advance
Regards

Hello,

I presume you are using the Elasticsearch output. This error means that filebeat is sending a bulk request with a body that is too large for Elasticsearch to handle (i.e. you have long log messages). You could try setting a lower bulk_max_size in the output configuration (the default is 50).
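
A minimal sketch of what that could look like in filebeat.yml, assuming a single Elasticsearch host on localhost; the host and the value 25 are just illustrative, adjust them to your setup:

output.elasticsearch:
  hosts: ["localhost:9200"]
  # Maximum number of events to bundle into a single bulk request.
  # Lower than the default of 50 so each request body stays small
  # enough for Elasticsearch to accept.
  bulk_max_size: 25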

Cheers

Hi Thiago, yes, the output is ES. I'll try changing that value, maybe to 100? My concern was about the comment
that says NOT to change that value, and I don't know why it's there.

Thanks
Regards

The setting that I am referring to is bulk_max_size, which is different from bulk_queue_size (the one with the warning). Also, you should lower it rather than increase it, because the issue is how much data filebeat is trying to push into ES in a single request.
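
To make the distinction concrete, here is roughly where the two settings live in filebeat.yml; the warned-against bulk_queue_size stays commented out, while bulk_max_size sits under the output section (the host and the value 25 are placeholders):

# General section -- internal pipeline queue, leave untouched:
#bulk_queue_size: 0

# Elasticsearch output section -- this is the knob to lower:
output.elasticsearch:
  hosts: ["localhost:9200"]
  bulk_max_size: 25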

Ok, thank you, buddy!
