When events are created, I send them to a Kafka topic.
Now I want to consume that same topic and write the events to both Elasticsearch and an HDFS directory.
For Elasticsearch this can be done simply with Logstash (Kafka input, Elasticsearch output). In that case the Kafka messages should be JSON-formatted.
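For the Elasticsearch path, a minimal Logstash pipeline sketch might look like the following. The broker address, topic name, ES endpoint, and index pattern are all placeholders, not values from my setup:

```
# Hypothetical pipeline: consume JSON events from Kafka, index into Elasticsearch.
input {
  kafka {
    bootstrap_servers => "kafka:9092"   # assumed broker address
    topics => ["events"]                # assumed topic name
    codec => "json"                     # messages are JSON-formatted
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]  # assumed ES endpoint
    index => "events-%{+YYYY.MM.dd}"        # daily indices
  }
}
```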
For HDFS I have no good idea, and that is my question.
Basically I want to use the Kafka HDFS connector,
but as far as I can tell it only supports Avro, and Logstash does not support an Avro stream.
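For reference, this is roughly the kind of Kafka Connect HDFS sink configuration I have in mind (Confluent's kafka-connect-hdfs). The hostnames, topic, and flush size are placeholders, and the Avro converters assume a running Schema Registry:

```
# Hypothetical HDFS sink connector config (standalone properties file)
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=events
hdfs.url=hdfs://namenode:8020
flush.size=1000
# Avro serialization via the Schema Registry (assumed address)
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter.schema.registry.url=http://schema-registry:8081
```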
What about the Logstash HDFS output plugin? Is that enterprise-ready?
The event volume is not small, so Avro is acceptable but JSON does not seem viable, because of file size and network load.
What would you suggest for this situation?