Architecture suggestion

I have a set of applications and have Filebeat sending their logs to Logstash, where they will be aggregated.

Is using Logstash to aggregate logs of different formats a good idea? Since logs from multiple servers need to be aggregated, is it possible to run more than one Logstash instance for horizontal scalability?

What are other options?

For horizontal scalability I would have Filebeat send the data to Kafka (more complex but more powerful) or Redis (very simple, but only a basic queue). All Beats have these outputs. You can then run multiple Logstash instances that pick the data up from the queue.
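For the Kafka route, Filebeat's built-in `output.kafka` is all you need on the shipper side. A minimal sketch of `filebeat.yml` — the log path, broker hosts, and topic name are placeholders you would adjust for your environment:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log    # hypothetical application log path

output.kafka:
  # Placeholder broker list; list at least two for availability
  hosts: ["kafka1:9092", "kafka2:9092"]
  topic: "filebeat-logs"        # hypothetical topic name
  required_acks: 1              # wait for leader acknowledgement
  compression: gzip             # reduce network traffic
```

With this in place, adding Logstash capacity is just a matter of starting more instances that consume from the same topic.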

If I use Kafka to aggregate events from multiple Filebeat instances, do I need to write custom consumers to parse the events, or are there configuration-driven plugins like grok, as in Logstash?
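No custom consumers are needed: Logstash ships a `kafka` input plugin, so the pipeline stays configuration-driven and filters like `grok` work exactly as they would with a Beats input. A minimal pipeline sketch — topic, consumer group, hosts, index name, and the grok pattern are assumptions to adapt:

```
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"  # placeholder brokers
    topics            => ["filebeat-logs"]          # hypothetical topic
    group_id          => "logstash"                 # shared group: Kafka balances
                                                    # partitions across instances
    codec             => "json"                     # Filebeat publishes JSON events
  }
}

filter {
  grok {
    # Example pattern only; replace with one matching your log format
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://es1:9200"]        # placeholder Elasticsearch host
    index => "logs-%{+YYYY.MM.dd}"
  }
}
```

Because all Logstash instances join the same consumer group, Kafka distributes the topic's partitions among them, which is what gives you the horizontal scaling.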
