Our application keeps all its transactional data in an AWS RDS instance. I would like to provide business intelligence over this transactional data using Elasticsearch and Kibana, so my first task is to figure out a way to get the data out of the MySQL RDS instance. From my research, it seems my best option is to set up Logstash with the JDBC input plugin to send data to Elasticsearch.
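For reference, here is a minimal sketch of what such a pipeline might look like. The connection string, credentials, table name (`sales_transactions`), and primary-key field (`transaction_id`) are placeholders for illustration, not values from my setup:

```conf
input {
  jdbc {
    # Path to the MySQL JDBC driver jar (must be downloaded separately)
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://my-rds-endpoint:3306/mydb"
    jdbc_user => "readonly_user"
    jdbc_password => "secret"
    statement => "SELECT * FROM sales_transactions"
  }
}

output {
  elasticsearch {
    hosts => ["http://my-es-host:9200"]
    index => "sales"
    # Using the table's primary key as the document id makes re-runs
    # idempotent (rows are updated in place rather than duplicated)
    document_id => "%{transaction_id}"
  }
}
```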
Our data is primarily sales transactions that happen over the course of the day. What is the best strategy for sending this data to Elasticsearch? If I send all the data at the end of the day, then my business intelligence is always working with stale data. Ideally I would like to keep the data in Elasticsearch close to real time.
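One approach I am considering, assuming the table has (or can be given) an `updated_at` timestamp column, is to run the JDBC input on a schedule and only pull rows changed since the last run via the plugin's built-in `:sql_last_value` tracking. A sketch of the relevant input options (column and path names are placeholders):

```conf
input {
  jdbc {
    # ...driver and connection settings as above...
    schedule => "* * * * *"   # cron syntax: poll once a minute
    statement => "SELECT * FROM sales_transactions
                  WHERE updated_at > :sql_last_value
                  ORDER BY updated_at"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    # Persists the last-seen value between Logstash restarts
    last_run_metadata_path => "/var/lib/logstash/.sales_last_run"
  }
}
```

With a one-minute schedule this is polling rather than true streaming, but it should keep Kibana within about a minute of the source data. Is this the kind of design people use in practice, or is there a better pattern?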
Any advice on the strategy and setup would be helpful.
FYI: we use AWS, so Logstash, Elasticsearch, and Kibana will all be in AWS.
Thanks for your response. Knowing the mechanics will definitely help. I was also asking for design-related advice: since my application generates transactional data throughout the day, my question was what the best design is for moving that data to Elasticsearch.