Hi,
I am looking for a solution to load data, in the form of messages carrying Add, Update, and Delete actions, from a Kafka stream into Elasticsearch. Of the options I have explored, Logstash seems the simplest, but I have some questions if an experienced member can answer:
How can we perform the add, update, or delete action based on a parameter value in the message?
Will performance simply follow the Kafka stream's throughput, or should we do some buffering to manage it? Is a batching option available?
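On the batching side, as far as I understand Logstash already batches events internally per pipeline, and the batch size and delay can be tuned in logstash.yml. A sketch of the relevant settings (the values here are illustrative, not tuned recommendations):

```
# logstash.yml -- illustrative values only
pipeline.workers: 4        # worker threads pulling batches through filters/outputs
pipeline.batch.size: 500   # events collected per worker before flushing to outputs
pipeline.batch.delay: 50   # ms to wait for a full batch before flushing anyway
```

The Kafka input's consumer also has its own settings (e.g. consumer_threads), so throughput tuning spans both layers.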
Any reference or sample Logstash config would be very helpful.
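For context, here is a minimal sketch of the kind of pipeline I have in mind. The broker, topic, index name, and the message fields `action` and `id` are placeholders for illustration, not from a working setup:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker
    topics => ["doc-changes"]               # placeholder topic
    codec => json
  }
}

filter {
  # Map the message's control field to the values the elasticsearch
  # output understands (index / update / delete). Stored in @metadata
  # so it is not indexed along with the document.
  if [action] == "Add" {
    mutate { add_field => { "[@metadata][es_action]" => "index" } }
  } else if [action] == "Update" {
    mutate { add_field => { "[@metadata][es_action]" => "update" } }
  } else if [action] == "Delete" {
    mutate { add_field => { "[@metadata][es_action]" => "delete" } }
  }
  mutate { remove_field => ["action"] }     # drop the control field itself
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # placeholder host
    index => "my-index"                     # placeholder index
    document_id => "%{[id]}"                # assumes each message carries an id
    action => "%{[@metadata][es_action]}"   # per-event index / update / delete
    doc_as_upsert => true                   # so updates work if the doc is missing
  }
}
```

Is this roughly the right approach, or is there a better pattern for action routing?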
thanks
vipin