I saw today that in v5 you implemented Kafka as an output, which is great!
My question is: will I be able to route logs to different topics with Beats according to fields in the log?
This is currently not planned. An important point to note is that Filebeat does not process the log lines, so it is not aware of their content. If you need that, the recommendation is to use LS: do the processing there and then send the events to Kafka.
Did you have in mind that a single log file would be sent to different topics, or is there one topic per log file (or multiple log files per topic)? I ask because you could use `fields` in the prospector to define the Kafka topic; routing in LS would then be much easier.
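As a sketch of that idea (the paths and field name here are placeholders, not anything from your setup), the prospector could tag each file's events with a custom field that LS later uses for routing:

```yaml
# filebeat.yml (sketch): tag events per prospector with a custom field
filebeat.prospectors:
  - paths: ["/var/log/nginx/*.log"]
    fields:
      kafka_topic: nginx        # hypothetical field name, picked for illustration
  - paths: ["/var/log/app/*.log"]
    fields:
      kafka_topic: app
```

In LS you could then reference that field when choosing the Kafka topic (assuming the LS Kafka output supports field interpolation in its topic setting).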
Note: please check whether LS supports routing to different Kafka topics.
Thanks @ruflin. I'm using Docker containers that send their logs through STDOUT or a TCP/UDP connection, and I'm not planning to use files.
In the Kafka output there is an option to set the topic either globally per Beat or based on the document type. For Filebeat, configure `document_type` in the prospector and set `use_type: true`.
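Putting those two settings together, a minimal config could look like the sketch below (the hosts, paths, and type name are placeholders; the exact option layout may differ between Filebeat versions):

```yaml
# filebeat.yml (sketch): route events to a Kafka topic named after the document type
filebeat.prospectors:
  - paths: ["/var/log/app/*.log"]
    document_type: app_logs     # this value becomes the Kafka topic name

output.kafka:
  hosts: ["kafka:9092"]         # placeholder broker address
  use_type: true                # take the topic from the document type
```

With `use_type: true`, each prospector's `document_type` determines the topic, so different prospectors can feed different Kafka topics without any LS processing in between.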