Can I write grok expressions to enrich log files in Filebeat before sending to Logstash / Elasticsearch

Hi

My use case is to ship log files from various applications to Elasticsearch so that I can view them in Kibana.

I wanted to know whether Filebeat can be configured with grok expressions so that each application team can manage their log parsing on their end and the central logging system / deployment is unaffected. If it can, then the need for Logstash is questionable.

-Rohit

You can add fields with static values, but grok-style extraction of fields isn't supported in Filebeat.
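
As a rough sketch of what that looks like (assuming a Filebeat 1.x-style filebeat.yml; the paths, field names, and Logstash host below are just placeholders), you can attach static fields per prospector and leave the actual grok parsing to Logstash:

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log
      # Static key/value pairs attached to every event read from these files
      fields:
        app: myapp
        env: production
      # Put the custom fields at the top level of the event instead of under "fields"
      fields_under_root: true

output:
  logstash:
    hosts: ["logstash.example.com:5044"]
```

The application teams could own the `fields` section of their prospectors, but any grok patterns would still have to live in the central Logstash configuration.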

Thanks @magnusbaeck

Can you tell me one more thing: if the Logstash and Filebeat output plugins can be configured with max_retries set to -1, then there shouldn't be any need for a queue. Can you explain why a queue is needed when the load increases beyond Elasticsearch's capacity? I am referring to https://www.elastic.co/guide/en/logstash/current/deploying-and-scaling.html
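
For reference, this is roughly the setting I am referring to, a sketch of the Filebeat Logstash output section (the host is a placeholder):

```yaml
output:
  logstash:
    hosts: ["logstash.example.com:5044"]
    # Retry failed batches indefinitely instead of dropping them after the default 3 attempts
    max_retries: -1
```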

Thanks
Rohit

Having a queue isn't strictly needed, but it helps get the data off the shippers as quickly as possible and is a good way of scaling the Logstash instances that feed ES (useful if Logstash rather than ES is the bottleneck of the system).
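
As a rough sketch of the broker-based setup from the deploying-and-scaling guide (assuming a Redis list as the queue; the hosts and the key name are placeholders, and the two parts below would be two separate Logstash pipeline configurations), the shipping tier pushes events onto the queue and one or more indexing Logstash instances pull from it at whatever rate ES can absorb:

```
# Shipping tier: accepts events from Filebeat and queues them in Redis
input {
  beats {
    port => 5044
  }
}
output {
  redis {
    host => ["redis.example.com"]
    data_type => "list"
    key => "filebeat"
  }
}

# Indexing tier: drains the queue as fast as Elasticsearch allows
input {
  redis {
    host => "redis.example.com"
    data_type => "list"
    key => "filebeat"
  }
}
output {
  elasticsearch {
    hosts => ["es.example.com:9200"]
  }
}
```

Because the indexers read from the queue at their own pace, a burst from the applications piles up in Redis instead of overwhelming Elasticsearch, and you can add more indexer instances if Logstash itself becomes the bottleneck.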