Streaming transactional data from AWS RDS instance to Elasticsearch

Hello,

Our application keeps all its transactional data in an AWS RDS instance. I would like to provide business intelligence over this transactional data using Elasticsearch and Kibana. So my first task is to figure out a way to get this transactional data out of the MySQL RDS instance where it is stored. From my research, it seems my best option is to set up Logstash with the JDBC input plugin to send the data to Elasticsearch.

Our data is primarily sales transactions that happen over the course of the day. What is the best strategy for sending this data to Elasticsearch? If I send all the data at the end of the day, then my business intelligence is always using old data. Ideally I would like to keep the data in Elasticsearch near real time.

Any advice on the strategy and setup would be helpful.

FYI: we use AWS, so Logstash, Elasticsearch, and Kibana will all be in AWS.

Thanks,

Waqar

The JDBC input has scheduling options, so you can run it every minute, for example - https://www.elastic.co/guide/en/logstash/6.0/plugins-inputs-jdbc.html#_scheduling
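
A minimal sketch of such a pipeline is below. The driver path, RDS endpoint, credentials, table name, and index name are all placeholders rather than values from your setup:

```
input {
  jdbc {
    # MySQL Connector/J jar; path and version are placeholders
    jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.45.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://my-rds-endpoint:3306/salesdb"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    # cron-style schedule: run the statement once a minute
    schedule => "* * * * *"
    statement => "SELECT * FROM sales_transactions"
  }
}

output {
  elasticsearch {
    hosts => ["http://my-es-host:9200"]
    index => "sales-transactions"
  }
}
```

With `schedule => "* * * * *"` the statement runs once a minute, so Kibana would only ever be about a minute behind the database.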

Hi Mark,

Thanks for your response. Knowing the mechanics will definitely help. I was also asking for design-related advice. Since my application generates transactional data throughout the day, my question was: what is the best design for moving that data to Elasticsearch?

Thanks,

Waqar

I don't see any reason, from the Logstash perspective, why you shouldn't run the query frequently (using the schedule feature described above).
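
To keep the index near real time without re-reading the whole table on every run, one common approach is to fetch only the rows added since the previous run using the plugin's `sql_last_value` / `tracking_column` support, and to use the primary key as the Elasticsearch document id so re-sent rows overwrite themselves instead of creating duplicates. A sketch, again with placeholder connection details and assuming a hypothetical `sales_transactions` table with an auto-increment primary key column named `id`:

```
input {
  jdbc {
    jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.45.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://my-rds-endpoint:3306/salesdb"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    schedule => "* * * * *"
    # Only fetch rows added since the last run; :sql_last_value is
    # maintained by Logstash and persisted in last_run_metadata_path.
    statement => "SELECT * FROM sales_transactions WHERE id > :sql_last_value ORDER BY id ASC"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    last_run_metadata_path => "/var/lib/logstash/.jdbc_last_run"
  }
}

output {
  elasticsearch {
    hosts => ["http://my-es-host:9200"]
    index => "sales-transactions"
    # Use the primary key as the document id so repeated rows update
    # the existing document rather than duplicating it.
    document_id => "%{id}"
  }
}
```

If transactions can be updated after they are inserted, the usual variation is to track an `updated_at` timestamp column instead (`tracking_column_type => "timestamp"`), so modified rows are picked up as well.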
