Filebeat falling behind on a big log file

(Tat Dat Pham) #1

I'm using filebeat 1.2.1.

I have one IIS log file (6 GB) and many other log files (~2000 files/day, 10 MB each).
For example: right now, the newest log I can see in Kibana is from 7 hours ago.
I'm using RabbitMQ, and when I checked it, the queue was empty (no backlog).

Performance on the Logstash instance is great.

So how should I configure filebeat to handle a big log file?

This is my config: Filebeat-config

Thanks so much
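For reference, a minimal filebeat 1.x config sketch for a high-volume file might look like the following. The paths, hosts, and tuning values here are assumptions for illustration, not the poster's actual settings, and would need to be adapted:

```yaml
# Sketch of a filebeat 1.x config for high-volume files.
# All paths, hosts, and values are assumptions; tune for your setup.
filebeat:
  prospectors:
    - paths:
        - 'C:\inetpub\logs\LogFiles\*\*.log'   # assumed IIS log location
      input_type: log
  spool_size: 4096    # events buffered before flushing to the output
  idle_timeout: 5s    # flush even when the spooler is not full
output:
  logstash:
    hosts: ["localhost:5044"]   # assumed Logstash endpoint
    bulk_max_size: 2048         # events per batch sent to Logstash
```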

(Steffen Siering) #2

How can you tell Logstash performance is great?

There are many factors affecting filebeat performance. Just sending files to /dev/null on a physical machine, I was able to process around 95k events per second.

filebeat throughput depends on disk IO (unless the files are still buffered in OS caches) and on downstream performance: if sending directly to Elasticsearch, on Elasticsearch indexing performance; if sending to Logstash, on processing time within Logstash plus the performance of everything even further downstream. This is because the outputs generate back-pressure when they cannot keep up, which slows down event generation in filebeat (as we don't want to drop any events).
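The /dev/null test mentioned above can be sketched as follows. This is an assumed test setup, not an official procedure: point the console output at /dev/null and measure the rate. The path below is a placeholder for the files you want to test against:

```yaml
# filebeat-test.yml: sketch for measuring raw filebeat read/publish throughput.
# The path is an assumption; point it at a copy of your real log files.
filebeat:
  prospectors:
    - paths:
        - /var/log/test/*.log
      input_type: log
output:
  console:
    pretty: false
```

Run it with something like `filebeat -c filebeat-test.yml | pv -Warl > /dev/null` (assuming the `pv` tool is installed) to see the line rate. Because nothing is sent downstream, this isolates filebeat's own throughput from any Logstash or Elasticsearch bottleneck.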

somewhat related:

The second link includes some tips for measuring filebeat throughput to /dev/null and to Logstash.

(system) #3