Is Beats mandatory when using the Logstash JDBC plugin?

Hi,
When I implement ELK for a large-scale application, do I need to include Beats to take data from an RDBMS and pass it through Kafka before Logstash comes into play?
I want to know which components need to be used, since I want to integrate with a J2EE application that currently uses an RDBMS.

Thanks,
Preethi

None of the Beats managed by Elastic support JDBC, but Logstash has a jdbc input.

So you can use Logstash with the jdbc input to pull from the database, then use the kafka output to write the data to Kafka. In between you can use filters if you need to transform the data at all.

No Beats required for this.
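As a minimal sketch of that pipeline, something like the following Logstash configuration would pull rows over JDBC and publish them to Kafka. The connection string, credentials, query, and topic name are all hypothetical placeholders for your own setup:

```
input {
  jdbc {
    # Hypothetical connection details -- replace with your own
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # Use :sql_last_value to fetch only rows changed since the last run
    statement => "SELECT * FROM orders WHERE updated_at > :sql_last_value"
    # Optional: re-run the query every five minutes (cron syntax)
    schedule => "*/5 * * * *"
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "rdbms-events"
    codec => json
  }
}
```

The `schedule` option on the jdbc input is what lets Logstash poll the database at an interval, rather than running the query once and exiting.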

Thanks for the reply.

I have one more query. I'm planning to use ELK for full-text searching. Since Kafka is for stream processing, is it required for offline data, i.e. if the Logstash .conf is scheduled to run at certain intervals?

Thanks

I don't understand the question. Perhaps you could re-phrase it?

  1. Does Kafka need to be used to reduce latency when loading offline data, or is it unnecessary? Is it useful only for real-time data streaming?

  2. To load offline data from an RDBMS, do I need to schedule the Logstash .conf file?

Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.