How to control the log flow

Hi Team,

During performance testing we receive many log files at once, and the logs lag in our environment; it takes too much time to get back to the normal flow. So our concern is that we need some way to control the log flow from Filebeat or Logstash while the application team is doing the performance testing.

Suppose the application team pushes 1k logs while doing the performance testing. Kibana then needs to receive the whole 1k of logs, but it doesn't receive them all because it can't handle that much load at a time.

In that case we would like to receive the logs in batches, for example some number of logs in one 10-minute window and another batch in the next 10 minutes. That is how we would like to control the logs while doing performance testing.

The log counts given above are just an example.

This is the log flow: Filebeat/Metricbeat -> AWS Kafka -> Logstash -> Elasticsearch -> Kibana to visualise the logs.
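
For illustration, the kind of control we have in mind is something like the Logstash throttle filter below. This is only a rough sketch: the key, counts and period are placeholder values we made up, not something we have tested.

    filter {
      # Tag any events beyond 1000 within a 600-second window, per host.
      throttle {
        key          => "%{host}"
        before_count => -1
        after_count  => 1000
        period       => "600"
        max_age      => 1200
        add_tag      => "throttled"
      }
      # Drop the tagged overflow. Note this discards the extra events
      # rather than delaying them, which may not be what we want.
      if "throttled" in [tags] {
        drop { }
      }
    }

Since this drops logs instead of spacing them out, we are not sure it fits our case, which is why we are asking.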

We would like to know whether we can control the log flow. Could you please advise?
Thanks, balaji

hello,
Kibana is not really "receiving" logs, it just displays what's available in Elasticsearch. Normally when you ingest a big batch of logs, the bottleneck is Elasticsearch's processing of the documents, so you will have to start from there and improve the ingest rate.
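Also, since Kafka already sits between your Beats and Logstash, you don't necessarily need to drop anything: Kafka can buffer the burst while Logstash consumes at a steadier pace. As a rough sketch (the broker address, topic and group id below are placeholders, and the numbers are only for illustration), you could reduce how much the Kafka input pulls per poll:

    input {
      kafka {
        bootstrap_servers => "kafka:9092"   # placeholder broker address
        topics            => ["app-logs"]   # placeholder topic name
        group_id          => "logstash"
        consumer_threads  => 1              # fewer threads -> steadier consumption
        max_poll_records  => "100"          # fewer records per poll (default "500")
      }
    }

Lowering pipeline.batch.size and pipeline.workers in logstash.yml smooths the flush rate into Elasticsearch in the same way; the backlog then waits in Kafka instead of overwhelming the cluster.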
On the Elasticsearch side there are some suggestions here: Tune for indexing speed | Elasticsearch Guide [8.3] | Elastic, but for more detailed information I suggest the Elasticsearch section of this forum.
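
For example, two of the settings that guide calls out for heavy ingest are the refresh interval and the number of replicas. A sketch, assuming an index named my-logs-index (a placeholder):

    PUT /my-logs-index/_settings
    {
      "index": {
        "refresh_interval": "30s",
        "number_of_replicas": 0
      }
    }

A longer refresh_interval makes Elasticsearch refresh (and make documents searchable) less often, and removing replicas cuts the indexing work per document; both should be reverted once the performance test is done.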
