Hello everyone, I am trying to import millions of rows from Postgres into Elasticsearch using Logstash and the jdbc input plugin. However, I have to pull data from 15+ tables, so a single join query is not a good solution as it takes too long to process. I have tried using the jdbc_streaming filter and splitting my join query across multiple streaming blocks, but with no success. Any idea how I can import the rows performantly? Any suggestion would be helpful.
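For reference, here is a minimal sketch of the kind of pipeline I tried. The table names, columns, connection details, and index name are placeholders, not my real schema:

```
input {
  jdbc {
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    # Page through the main table instead of joining everything up front
    statement => "SELECT id, customer_id FROM orders"
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
  }
}

filter {
  # One jdbc_streaming block per related table -- with 15+ tables
  # this means 15+ extra queries per event, which is where it slows down
  jdbc_streaming {
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    statement => "SELECT * FROM order_items WHERE order_id = :id"
    parameters => { "id" => "id" }
    target => "items"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "orders"
  }
}
```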
Thanks in advance