Filebeat pushes events at a very slow rate

Hi all
I have a problem with one of my Filebeat servers.
That server is configured to receive F5 logs, and I installed Filebeat on it to ship those logs to Elasticsearch.
With the previous setup everything worked fine and logs were pushed at a normal rate, but with the new server the push rate is odd: roughly every 10-15 minutes a chunk of a few thousand events gets pushed to Elasticsearch.
I have tried everything I can think of to fix the issue, but nothing works: I have increased the RAM of the Filebeat server, increased the number of Logstash workers, and increased Filebeat's internal queue.
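
Roughly, the settings I touched look like this (the values below are just examples, not my exact config):

```yaml
# filebeat.yml - internal memory queue size (example value only)
queue.mem:
  events: 8192        # raised from the 4096 default

# logstash.yml - pipeline worker count (example value only)
pipeline.workers: 8   # raised from the default of one worker per CPU core
```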

Please help!

Thanks for your time.

Hi @lusynda :slightly_smiling_face:

I'm gonna start with the obvious: something is different on your new server, so that's where I'd start looking.

From your description, I can only assume that your new server is producing too few logs, so Filebeat doesn't send the messages until the queue is completely full.
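
If that is what's happening, the memory queue's flush settings control how long Filebeat waits before shipping a batch. Something like this (example values, adjust to your load) forces it to flush on a timer instead of waiting for a full queue:

```yaml
# filebeat.yml - flush the memory queue on a timer instead of waiting
# for a full batch (example values)
queue.mem:
  events: 4096
  flush.min_events: 512   # ship as soon as 512 events are buffered
  flush.timeout: 1s       # ...or after 1 second, whichever comes first
```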

Can you describe the rate and load of your events? How many events per second? How big is each event, in KB?

Also check that the expected log level is enabled on your new server. It might be that you were logging DEBUG before and now you are logging INFO :sweat_smile:

Thanks for your replies.

The new server can't be producing too few logs, since I view the log files directly and there is a lot of new log data there.

Currently, about every 10-15 minutes roughly 4,000 events get ingested at once. I'm not sure about the size of each event, but on the old server everything was ingested without any problems.

The logs I've been getting are F5 WAF logs that we receive from an rsyslog server.
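
The Filebeat side is just a plain log input on the files rsyslog writes out, roughly like this (the paths are placeholders, not the real ones):

```yaml
# filebeat.yml - input tailing the F5 WAF logs written by rsyslog
# (placeholder paths)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/f5/*.log
```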

Try removing the folder with the data files to see if that helps: Directory layout | Filebeat Reference [7.16] | Elastic
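
If you're not sure where that folder is, it's whatever path.data points to in filebeat.yml (the default depends on how Filebeat was installed):

```yaml
# filebeat.yml - location of the registry/data files
# (for example, DEB/RPM installs default to /var/lib/filebeat)
path.data: /var/lib/filebeat
```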

But if you are using the same Filebeat and the same Elasticsearch, and the only thing that changed is your server, I strongly suggest looking at the new server, or at the new network topology, firewall, or permissions there, for potential issues.

Is there a possibility that the size of each event is too large for Filebeat to process?
I have just checked the logs: each event is about 4 KB.
Could that be the problem? I think that is a bit large, and there are about 1,000,000 events each day.
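
Doing the rough math on those numbers (back-of-the-envelope only):

```latex
4\,\mathrm{KB/event} \times 1{,}000{,}000\ \mathrm{events/day} \approx 4\,\mathrm{GB/day},
\qquad
\frac{1{,}000{,}000\ \mathrm{events}}{86{,}400\ \mathrm{s/day}} \approx 12\ \mathrm{events/s}
```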

Somehow I fixed the problem:
I changed the queue type in Filebeat from mem to disk, and then everything works fine.
I don't really know how or why it works, but at least it works.
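
For anyone who hits this later, the change was basically this in filebeat.yml (the size is just the value I picked, tune it for your disk):

```yaml
# filebeat.yml - replace the in-memory queue with the disk queue
# (only one queue type can be configured at a time)
queue.disk:
  max_size: 10GB   # example size; pick something that fits your disk
```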
