Elasticsearch to Oracle Table

Hello Team

I have a requirement where I have to move selected data from an Elasticsearch index to an Oracle table (RDBMS), in real time.

Could you please suggest methods, examples, links, blogs, or open-source tools I could use to fulfil this requirement?


I'm not aware of any magic tool that would do this in real time, on the fly.

I'd probably try to do that from the application which is actually writing to Elasticsearch. Is that an application you wrote? Something else?

Hi David

I have not written the application code that sends data to Elasticsearch. I know it comes via Fluentd to Elasticsearch.

This Elasticsearch cluster holds our critical data, which needs to be filtered; only a portion of that data is needed in Oracle for further analysis.

We are expecting GBs of data daily in Elasticsearch. This volume cannot be inserted into Oracle directly because of our infra setup, and Oracle's capability to handle so many inserts is also doubtful. Elasticsearch is fast, and since our data does not involve any transactions, we use Elasticsearch for our application logging.

Can you suggest any methodology to move data from ES to an Oracle table, in real time or with, say, a one-hour delay?

You could use an ingest pipeline attached to the indices you want to extract data from to add a timestamp indicating when the data was indexed. You can then write a script/application that periodically extracts data for a specific time period and loads it into Oracle.
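A minimal sketch of the first half of that suggestion, assuming the official `elasticsearch` Python client; the pipeline name `add-indexed-at`, the field name `indexed_at`, and the index pattern `app-logs-*` are illustrative, not from the thread:

```python
# Sketch: ingest pipeline that stamps each document with the time it was
# indexed, so a later export job can query by time window.

def indexed_at_pipeline() -> dict:
    """Pipeline body for a `set` processor recording the ingest timestamp."""
    return {
        "description": "Record when each log document was indexed",
        "processors": [
            {
                "set": {
                    "field": "indexed_at",
                    # _ingest.timestamp is the time the node received the doc
                    "value": "{{_ingest.timestamp}}",
                }
            }
        ],
    }

def install_pipeline(es) -> None:
    """Create the pipeline and make it the default for the log indices.
    `es` is an elasticsearch.Elasticsearch client (connection not shown)."""
    es.ingest.put_pipeline(id="add-indexed-at", body=indexed_at_pipeline())
    es.indices.put_settings(
        index="app-logs-*",
        body={"index.default_pipeline": "add-indexed-at"},
    )
```

Setting `index.default_pipeline` means Fluentd does not have to be changed; every document written to the matching indices goes through the pipeline automatically.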

What do you need Oracle for? What is the use case that Elasticsearch cannot cover?

Okay. How do I put data into Oracle from that ingest pipeline? Can the ingest pipeline's output be directed to Oracle directly?

No, you cannot write to Oracle from an ingest pipeline. You need a separate application that periodically queries Elasticsearch and exports the data.
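A hedged sketch of such a "separate application", assuming the `elasticsearch` and `oracledb` Python clients and an `indexed_at` timestamp on each document; the index name, field names, table name, and window size are all assumptions for illustration:

```python
# Sketch: one run of an hourly export job that pulls documents indexed in
# the last hour from Elasticsearch and bulk-inserts them into Oracle.
from datetime import datetime, timedelta, timezone

def window_query(start: datetime, end: datetime) -> dict:
    """ES range query for documents stamped indexed_at within [start, end)."""
    return {
        "query": {
            "range": {
                "indexed_at": {
                    "gte": start.isoformat(),
                    "lt": end.isoformat(),
                }
            }
        }
    }

def rows_from_hits(hits: list) -> list:
    """Flatten ES hits into (timestamp, level, message) tuples for Oracle."""
    return [
        (h["_source"]["@timestamp"],
         h["_source"]["level"],
         h["_source"]["message"])
        for h in hits
    ]

def export_last_hour(es, ora_conn) -> None:
    """One export run; schedule it hourly (cron, etc.).
    `es` is an Elasticsearch client, `ora_conn` an oracledb connection."""
    end = datetime.now(timezone.utc)
    start = end - timedelta(hours=1)
    resp = es.search(index="app-logs-*", body=window_query(start, end),
                     size=10000)
    rows = rows_from_hits(resp["hits"]["hits"])
    with ora_conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO app_logs (log_ts, level, message) "
            "VALUES (:1, :2, :3)",
            rows,
        )
    ora_conn.commit()
```

For more than 10,000 documents per window you would page through results (e.g. with `search_after` or the scroll API) instead of a single `search` call; `executemany` keeps the Oracle round-trips down, which matters given the insert-volume concern above.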

Oracle holds our major application data, which is OLTP-based.

Elasticsearch has our logging data from application / middleware servers.

Whatever success/failure logging information we have is in ES. We have to join this data with user information and build reports for business purposes. Hence we have this architecture in place.

Okay. That will require a separate effort from the development team.

Thank You for your response.

Why not do that on the Elasticsearch side?

You can enrich every log with the user information and store that in Elasticsearch.
Then do your analysis in Kibana, in real time, whatever the size of the data, and without needing to read tons of data again and write it elsewhere.
