I need to send different logs to different output destinations, but why doesn't Filebeat support that?

@argb What kind of outputs do you want to use at the same time? Redis/ES/Kafka?

We do understand that Filebeat's single-output limitation is problematic for some use cases. We currently choose to support only one output for several reasons.

Beats is meant to be a lightweight shipper that gets logs off your machine as fast as possible, and sending to a single output helps us with that goal.

In the current design, when Filebeat starts it creates a single pipeline into which all events from your different inputs are sent. Because all events go to a single output, we can maximize the batch size of the events sent to the remote system, and most of the time larger batches result in better throughput.
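For illustration, here is a minimal filebeat.yml sketch of that design (paths, hosts, and the batch setting are example values, not recommendations), with multiple inputs all feeding the one configured output:

```yaml
# Minimal sketch: two inputs, one output. Paths and hosts are placeholders.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log
  - type: log
    paths:
      - /var/log/nginx/access.log

# Only one output section may be enabled; events from every input
# are batched together and shipped to it.
output.elasticsearch:
  hosts: ["localhost:9200"]
  # Larger batches generally mean better throughput.
  bulk_max_size: 1600
```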

Having a single output also helps with our delivery guarantee: when the output stops accepting events, we simply stop reading the files until the output is available again. The files on disk act as a queuing system.

Adding multiple outputs to the mix adds a lot of complexity to the logic. A few examples:

  • Should each event be sent to every output?
  • If one output is down, what should we do: halt everything, or keep sending to the outputs that are still responsive?
  • If conditionals are used to route some events to a single output, what are the delivery guarantees for each output?

These are just a few of the questions that explain why Beats doesn't go there by design.
This is the current situation today; we might add support for multiple outputs someday, but first we need a good story for the problems that can arise.

In fact, Logstash and Beats can work together to achieve your goal: install Filebeat on your server, send events to one or more Logstash instances, and use Logstash with multiple outputs (or multiple pipelines) to route the events to the different systems. A sketch of this setup is shown below.
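As a rough sketch of that setup, Filebeat could attach a custom field to events from a given input (the `log_type` name here is purely an example) via the input's `fields` option, and a Logstash pipeline could then use a conditional to pick the output. Hosts, ports, and topic names below are placeholders:

```conf
# Minimal Logstash pipeline sketch: receive from Filebeat, route by a field.
input {
  beats {
    port => 5044
  }
}

output {
  # Route events that Filebeat tagged with fields.log_type == "audit" to Kafka.
  if [fields][log_type] == "audit" {
    kafka {
      bootstrap_servers => "kafka:9092"
      topic_id => "audit-logs"
    }
  } else {
    # Everything else goes to Elasticsearch.
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}
```

This keeps Filebeat simple and lightweight while Logstash takes care of fan-out, per-output buffering, and routing decisions.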