Hi,
Hi,
When implementing ELK for a large-scale application, do I need to include Beats to take data from the RDBMS and pass it through Kafka before Logstash comes into play?
I want to know which components are needed, since I want to integrate with a J2EE application that currently uses an RDBMS.
None of the Beats managed by Elastic support JDBC, but Logstash has a jdbc input.
So you can use Logstash with the JDBC input to pull from the DB, and then use the Kafka output to write the data to Kafka. In the middle you can use filters if you need to transform the data at all.
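For illustration, here is a minimal sketch of such a pipeline. The connection string, credentials, table, column, and topic names are placeholders you would adapt to your own database and Kafka cluster:

```
input {
  jdbc {
    # JDBC driver jar and class; adjust for your database vendor
    jdbc_driver_library => "/path/to/mysql-connector-java-8.0.33.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/appdb"
    jdbc_user => "app_user"
    jdbc_password => "secret"
    # Pull only rows changed since the last run; :sql_last_value is
    # tracked by the plugin between runs
    statement => "SELECT * FROM orders WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    # Cron-style schedule: poll the database every 5 minutes
    schedule => "*/5 * * * *"
  }
}

filter {
  # Optional transformations go here, e.g. renaming a field
  mutate {
    rename => { "order_id" => "id" }
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "orders"
    codec => json
  }
}
```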
I have one more query. I'm planning to use ELK for full-text searching. Since Kafka is for stream processing, is it required for offline data? I mean, if the Logstash .conf is scheduled to run at a certain interval.