How to put Elasticsearch data into an Oracle table

Hi

Could you please help me with my requirement of putting Elasticsearch index data into an Oracle table?

Is any additional Logstash plugin required to achieve this?

I am aware of the CSV method, but it is too time-consuming and requires installing a lot of software on the machine to achieve end-to-end results.

Thanks
Tushar Nemade

There is a jdbc output plugin written by a third party. It is not supported by Elastic, and the person who wrote it has not modified it in over a year, so it is not clear that they are still supporting it either.

Hi

Someone called angryangel wrote it, and he no longer supports it for the newer versions of Logstash / Elasticsearch.

Is there anything else from Logstash itself?

Thanks
Tushar Nemade

Not that I know of.

Hi @tusharnemade - @Badger is correct. logstash-output-jdbc is not a supported plugin (cf. the support matrix). It has not been updated in years, and the entire repository has been archived, which indicates the owner has no intention of doing anything further with it.

Out of curiosity, why would you need to push Elasticsearch data into a database?

Hi

We have application metrics and log data being pushed into Elasticsearch from multiple apps.

We need to filter those and put them into an Oracle table for business purposes, which is where Oracle comes into the picture.

Thanks for the clarifications @tusharnemade - Logstash will not be able to push data into the Oracle database.

However, you could consider an architecture where your client applications send data to a middleware (e.g. Kafka). Both Elasticsearch and Oracle could then consume the same data from Kafka. For example:
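As a rough sketch of the Oracle-side consumer (illustrative only, not a supported integration), something like the Python script below could read JSON events from Kafka and insert them into Oracle using the kafka-python and python-oracledb client libraries. The topic name, consumer group, table schema, and connection details are all assumptions you would replace with your own.

```python
# Minimal sketch: consume JSON events from a Kafka topic and insert them into Oracle.
# Assumptions (replace with your own): topic "app-logs", consumer group "oracle-writer",
# a table APP_LOGS(APP, LOG_LEVEL, MESSAGE, EVENT_TS) with VARCHAR2 columns,
# and the broker address / database credentials shown below.
import json

import oracledb                   # pip install oracledb
from kafka import KafkaConsumer  # pip install kafka-python

INSERT_SQL = """
    INSERT INTO app_logs (app, log_level, message, event_ts)
    VALUES (:1, :2, :3, :4)
"""

def main():
    consumer = KafkaConsumer(
        "app-logs",                                   # assumed topic name
        bootstrap_servers="kafka:9092",               # assumed broker address
        group_id="oracle-writer",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    conn = oracledb.connect(
        user="app_user",                              # assumed credentials and DSN
        password="change_me",
        dsn="dbhost:1521/ORCLPDB1",
    )
    cur = conn.cursor()

    batch = []
    for msg in consumer:
        event = msg.value
        # Apply your filtering here if only some events belong in Oracle.
        batch.append((
            event.get("app"),
            event.get("level"),
            event.get("message"),
            event.get("@timestamp"),
        ))
        if len(batch) >= 500:                         # insert and commit in batches
            cur.executemany(INSERT_SQL, batch)
            conn.commit()
            batch.clear()

if __name__ == "__main__":
    main()
```

On the other side, your applications (or Logstash, via its Kafka output) would publish the same JSON events to the topic, and Elasticsearch could ingest them in parallel, for example through the Logstash Kafka input.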

This is just a suggestion; there could be other approaches, and it might require more architectural thinking to address this use case :slight_smile:

Appreciate your response, @ropc.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.