I am using Filebeat to read log files and output them to Kafka. My log file contains lines like this:
:2018/06/28 10:59:37 {"data":"test message","topic":"account_4"}
and Filebeat reads the whole line into the message field of the event.
What I want to do is read the topic value nested inside that message field and use it for the Kafka output in filebeat.yml (i.e. output.kafka), but this is not working for me.
Dynamic topic selection is supported (docs), but the field you use to specify the topic needs to exist in the event (i.e. you need to have the data in a structured format first).
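For reference, dynamic topic selection in the output looks roughly like this (a minimal sketch; the broker address is a placeholder and fields.log_topic is just the field name used in the docs example):

```yaml
output.kafka:
  hosts: ["kafka1:9092"]            # placeholder broker list
  # The topic is resolved per event from a field that must already
  # exist in the event at publish time.
  topic: '%{[fields.log_topic]}'
```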
So before you can reference topic, you need to parse it out of the message field. Prior to Filebeat 6.3.0 this would not have been possible unless your data was pure JSON, but now there is a dissect processor.
You can try using the dissect processor followed by the decode_json_fields processor: first separate the leading timestamp in the message field from the JSON content, then parse the JSON, and finally you'll be able to reference the topic value contained in your message.
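Untested, but something along these lines should work for the sample line you posted (the dissect key names date, time and json are arbitrary, and target: "" merges the decoded JSON keys, including topic, into the root of the event so the output can reference it):

```yaml
processors:
  # Split ":2018/06/28 10:59:37 {...}" into the timestamp parts and the raw JSON.
  # The last key captures the remainder of the line, spaces included.
  - dissect:
      tokenizer: ':%{date} %{time} %{json}'
      field: "message"
      target_prefix: "dissect"
  # Parse the extracted JSON string and merge its keys (data, topic)
  # into the root of the event.
  - decode_json_fields:
      fields: ["dissect.json"]
      target: ""

output.kafka:
  hosts: ["kafka1:9092"]            # placeholder broker list
  topic: '%{[topic]}'               # now resolvable per event
```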