Efficiency (Processing Time) of Filebeat and Logstash

I have a couple of questions regarding Filebeat and Logstash.

  1. How can we find the exact time Filebeat takes to process a certain set of logs, i.e. to send them from the log file to Logstash?

  2. How can we find the time Logstash takes to send those logs to Elasticsearch?

  3. How can we find out the number of records present at any given time in Logstash's persistent queue?

Hi!

Regarding question 1, Filebeat prints a monitoring log line every 30s that gives you an overview of how many events/logs have been shipped to the outputs. An example:

2020-07-16T11:09:08.667+0300	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":88,"time":{"ms":6}},"total":{"ticks":359,"time":{"ms":52},"value":359},"user":{"ticks":271,"time":{"ms":46}}},"info":{"ephemeral_id":"0e4cacc1-264f-445f-a893-2403f151b170","uptime":{"ms":150035}},"memstats":{"gc_next":48353472,"memory_alloc":24227664,"memory_total":65300608,"rss":827392},"runtime":{"goroutines":43}},"filebeat":{"harvester":{"files":{"2872277f-453a-4b46-9416-a2e696cef4dd":{"size":290}},"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":1}},"pipeline":{"clients":2,"events":{"active":4117,"retry":50}}},"registrar":{"states":{"current":2}},"system":{"load":{"1":1.8828,"15":2.2056,"5":2.0996,"norm":{"1":0.1177,"15":0.1378,"5":0.1312}}}}}}

By looking at the events field in this output, you can get a sense of Filebeat's throughput.
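If you don't want to wait for the 30s log line, the same internal metrics can also be read on demand from Filebeat's local HTTP monitoring endpoint (enabled with `http.enabled: true` in `filebeat.yml`, default port 5066). Below is a minimal sketch, assuming that endpoint is enabled on localhost:5066; the exact field names can vary between Filebeat versions:

```python
import json
import urllib.request

# Assumes filebeat.yml has the HTTP monitoring endpoint enabled, e.g.:
#   http.enabled: true
#   http.port: 5066
STATS_URL = "http://localhost:5066/stats"

with urllib.request.urlopen(STATS_URL) as resp:
    stats = json.load(resp)

# Same metric tree as the 30s log line above; field names can differ by version.
pipeline_events = stats.get("libbeat", {}).get("pipeline", {}).get("events", {})
output_events = stats.get("libbeat", {}).get("output", {}).get("events", {})

print("pipeline events (active/published/...):", pipeline_events)
print("output events (acked/failed/total/...):", output_events)
```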

As for the other two questions, I think you should redirect them to the Logstash forum so the Logstash folks can answer them.
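For what it's worth, both of those questions can usually be inspected through the Logstash node stats API on port 9600. A rough sketch, assuming a pipeline named `main` and default API host/port; note that the exact field names (for example `events_count` vs. `events` under `queue`) differ between Logstash versions, and `duration_in_millis` measures time spent processing events rather than network time to Elasticsearch specifically:

```python
import json
import urllib.request

# Logstash node stats API (default host/port; adjust if the API settings were changed).
NODE_STATS_URL = "http://localhost:9600/_node/stats/pipelines/main"

with urllib.request.urlopen(NODE_STATS_URL) as resp:
    stats = json.load(resp)

pipeline = stats["pipelines"]["main"]

# Question 2: pipeline event counters and time spent processing them.
# Per-output timings are also available under plugins -> outputs in the same response.
events = pipeline.get("events", {})
print("events in/out:", events.get("in"), events.get("out"))
print("duration_in_millis:", events.get("duration_in_millis"))

# Question 3: persistent queue depth (only meaningful when queue.type: persisted).
queue = pipeline.get("queue", {})
print("queue type:", queue.get("type"))
print("events in queue:", queue.get("events_count", queue.get("events")))
```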

C.

Can't we just find out the total time Filebeat took to process a certain set of logs?

For example, I have 100 log files in a directory, and each log file contains 10,000 log lines (assume that no file will be added, deleted, or modified). I start Filebeat, pointing it at the log directory.

At the end, when all files have been processed, I want the number of files processed, the number of log lines processed, the number of log lines sent successfully, the number of log lines that failed, and the total time taken to process all the log files.

It would be even better if we could see these statistics while Filebeat is still processing the log files.

I don't think such a detailed end-of-run summary is available out of the box.
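That said, you can approximate part of it yourself by polling the metrics while Filebeat runs. A minimal sketch, assuming the HTTP monitoring endpoint is enabled on localhost:5066 (as in the earlier example); it derives elapsed time, acked/failed event counts, and open harvesters from the counters, which is not the per-file breakdown you are asking for:

```python
import json
import time
import urllib.request

STATS_URL = "http://localhost:5066/stats"  # requires http.enabled: true in filebeat.yml

def snapshot():
    """Take one reading of the counters we care about."""
    with urllib.request.urlopen(STATS_URL) as resp:
        s = json.load(resp)
    out = s.get("libbeat", {}).get("output", {}).get("events", {})
    harvester = s.get("filebeat", {}).get("harvester", {})
    return {
        "acked": out.get("acked", 0),        # lines acknowledged by the output (Logstash)
        "failed": out.get("failed", 0),      # lines that failed to send
        "open_files": harvester.get("open_files", 0),
    }

start = time.time()
previous = snapshot()
while True:
    time.sleep(10)
    current = snapshot()
    print(f"elapsed {time.time() - start:.0f}s, "
          f"acked {current['acked']}, failed {current['failed']}, "
          f"open files {current['open_files']}")
    # Stop once no harvesters are open and the counters have stopped moving.
    if current["open_files"] == 0 and current == previous:
        break
    previous = current
```

The elapsed time here is measured from when the script starts polling, so it is only a rough proxy for Filebeat's own processing time.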

Feel free to open an enhancement GitHub issue explaining the use case and why it is important.

C.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.