I am considering putting Kafka in front of my Logstash instances, so I tried enabling this on a 5.0 platform running on Windows in a proof-of-concept setup.
I am using Filebeat and Winlogbeat to send logs.
Enabling Kafka output in filebeat and winlogbeat was pretty straightforward.
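For reference, this is roughly what my Beats-side config looks like — a minimal sketch, assuming a single broker at `kafka1:9092` and a topic named `beats` (both placeholders, not from a real deployment):

```yaml
# filebeat.yml / winlogbeat.yml — Kafka output (Beats 5.x)
output.kafka:
  hosts: ["kafka1:9092"]     # assumption: your Kafka broker address
  topic: "beats"             # assumption: topic both Beats write to
  partition.round_robin:
    reachable_only: false    # distribute across all partitions
  required_acks: 1           # wait for leader ack only
  compression: gzip
```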
Specifying the Kafka input in Logstash made me wonder: do I have to convert the input from Kafka to JSON myself with a json filter, or is there a smarter way to deserialize from Kafka into JSON?
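To make the question concrete, here is a sketch of the two approaches I mean — assuming the Logstash 5.x kafka input plugin, the same placeholder broker/topic names as above:

```ruby
# Option A: let the input codec deserialize the Beats JSON directly
input {
  kafka {
    bootstrap_servers => "kafka1:9092"   # assumption: broker address
    topics            => ["beats"]       # assumption: topic Beats writes to
    codec             => "json"          # parse each Kafka message as JSON on input
  }
}

# Option B: plain input plus an explicit json filter on the message field
filter {
  json {
    source => "message"
  }
}
```

Option A is what I am hoping is the "smarter way"; Option B is the filter/json route I mentioned.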
There don't seem to be many examples of this, nor much documentation or guidance. Is anyone using it the way I intend?