I'm not understanding the integration use case for Logstash, i.e. whether I should integrate it or not. I want to run analytics on my transactions data, API requests, platform usage, and as many other things as I can get data for.
I've already built my application using Node.js & MySQL, and now I want to use Elasticsearch & Kibana for analytics. So I want to ask how I should integrate them:
Should I make REST API calls to put my data into Elasticsearch?
Or should I sync the MySQL DB & Elasticsearch with some script?
For which part should I use only the Elasticsearch REST API, and for which part should I integrate Logstash?
Can someone guide me or link me to a good article on these use cases?
But if you are OK with, say, 5 minutes of latency, then Logstash is a good option IMO.
Just create a Logstash pipeline with a jdbc input plugin and an elasticsearch output plugin, scheduled to run every 5 minutes (via the jdbc plugin's schedule parameter), and you should be OK.
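A minimal pipeline sketch could look like this; the MySQL connection details, the transactions table, and the updated_at tracking column are assumptions about your schema, not anything from your post:

```
# Sketch only: paths, credentials, table and column names are assumptions.
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/myapp"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    schedule => "*/5 * * * *"   # cron-style: run every 5 minutes
    statement => "SELECT * FROM transactions WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "transactions"
    document_id => "%{id}"   # reuse the MySQL primary key so updates overwrite the same document
  }
}
```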
If you have specific questions about building that, please ask in the #logstash channel, which is a better fit for that.
Thank you, David Pilato.
Yes, I do not need real-time queries. I also do not want my application to slow down while updating Elasticsearch for every little change.
> I also do not want my application to slow down while updating Elasticsearch for every little change.
I don't think that will be the case IMO.
Specifically, if you index asynchronously in Elasticsearch and don't block any thread.
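For example, here is a sketch with the official Node.js client (assuming the 8.x @elastic/elasticsearch API, a local cluster, and a hypothetical transactions index) where the document is indexed without awaiting the response on the request path:

```ts
import { Client } from "@elastic/elasticsearch";

// Assumed local cluster; in production use your own node URL and auth.
const es = new Client({ node: "http://localhost:9200" });

// Hypothetical shape of a transaction your app already holds in memory.
interface Transaction {
  id: string;
  userId: string;
  amount: number;
  createdAt: string;
}

// Fire-and-forget indexing: the HTTP call runs asynchronously,
// so the Node.js event loop (and your API response) is not blocked.
function indexTransaction(tx: Transaction): void {
  es.index({
      index: "transactions",
      id: tx.id,
      document: tx,
    })
    .catch((err) => console.error("failed to index transaction", err));
}
```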
You can also write to something like Kafka and then read from Kafka with Logstash.
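On the application side that could look like the sketch below with kafkajs (the broker address and the "transactions" topic are assumptions); Logstash's kafka input plugin would then consume that topic and ship the events to Elasticsearch:

```ts
import { Kafka } from "kafkajs";

// Assumed local broker; point this at your real Kafka cluster.
const kafka = new Kafka({ clientId: "myapp", brokers: ["localhost:9092"] });
const producer = kafka.producer();

export async function publishTransaction(tx: { id: string; amount: number }): Promise<void> {
  // Connecting per call keeps the sketch simple; a real app would connect once at startup.
  await producer.connect();
  // Each event is a JSON message on a hypothetical "transactions" topic.
  await producer.send({
    topic: "transactions",
    messages: [{ key: tx.id, value: JSON.stringify(tx) }],
  });
}
```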
Querying the database every 5 minutes will put some pressure on it. Also, if your model is complex (e.g. split across multiple tables), it might be hard to write a single query that fetches the whole object.
On the other hand, when your application already holds the full object in memory, it's efficient and immediate to serialize it to JSON and send it to Elasticsearch.