Logstash ECK Job

My ECK implementation runs Logstash with a JDBC driver and over 100 inputs, each pushing to its own index (1:1). We're looking at performing the initial crawl of each index via a Logstash job; once the job completes, the plan is to update the standard long-running Logstash resource. We've observed high CPU/RAM utilization when trying to crawl all 100 databases at once, so moving toward a soft-provisioning process would be better.
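For reference, the one-shot job is just a plain Kubernetes Job wrapped around the Logstash image, roughly like this (the names, image tag, ConfigMap, and PVC are placeholders from our setup, not anything ECK generates):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: logstash-initial-crawl        # placeholder name
spec:
  backoffLimit: 2
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: logstash
          image: docker.elastic.co/logstash/logstash:8.13.4   # kept in lockstep with the long-running resource
          args: ["-f", "/usr/share/logstash/pipeline/crawl.conf"]
          volumeMounts:
            - name: pipeline
              mountPath: /usr/share/logstash/pipeline
            - name: data
              mountPath: /usr/share/logstash/data    # shared with the long-running Logstash
      volumes:
        - name: pipeline
          configMap:
            name: crawl-pipeline      # placeholder; holds the JDBC pipeline (sketched below)
        - name: data
          persistentVolumeClaim:
            claimName: logstash-data  # placeholder PVC
```

Running the JDBC input without a `schedule` makes the pipeline execute its statement once and exit, which is what lets a Job model the initial crawl.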
The Logstash job needs to share the same data directory as the long-running Logstash resource so that the last update time and metadata persist.
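Concretely, that means pointing each JDBC input's `last_run_metadata_path` at the shared volume. A trimmed sketch of one of the ~100 pipelines, in the ConfigMap the Job mounts (driver, connection string, and index name are placeholders):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: crawl-pipeline                # mounted by the Job above
data:
  crawl.conf: |
    input {
      jdbc {
        jdbc_driver_library => "/usr/share/logstash/drivers/driver.jar"   # placeholder
        jdbc_driver_class => "org.postgresql.Driver"                      # placeholder
        jdbc_connection_string => "jdbc:postgresql://db-01:5432/app"      # placeholder
        jdbc_user => "logstash"
        statement => "SELECT * FROM events WHERE updated_at > :sql_last_value"
        use_column_value => true
        tracking_column => "updated_at"
        tracking_column_type => "timestamp"
        # no `schedule` => run the statement once, then the pipeline exits;
        # state lands on the shared volume so the long-running instance resumes from it
        last_run_metadata_path => "/usr/share/logstash/data/.jdbc_last_run_db-01"
      }
    }
    output {
      elasticsearch {
        hosts => ["https://elasticsearch-es-http:9200"]   # ECK service: <cluster-name>-es-http
        user => "elastic"
        password => "${ES_PASSWORD}"                      # injected by hand, see the next sketch
        ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca.crt"]
        index => "db-01"                                  # 1:1 input-to-index
      }
    }
```

One wrinkle: the job and the long-running resource can't run against the same `path.data` at the same time, since Logstash drops a `.lock` file there; the sequential hand-off above sidesteps that.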
While building out the Logstash job resource, we're running into issues accessing the Elastic license and having to pass every variable to the job by hand.
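To make the second issue concrete, this is the wiring we're currently doing manually, referencing the secrets ECK creates for an Elasticsearch cluster (assuming a cluster named `elasticsearch`; the managed Logstash resource gets the equivalent of this automatically via `elasticsearchRefs`):

```yaml
# excerpt of the Job's pod spec -- all manual plumbing
containers:
  - name: logstash
    env:
      - name: ES_PASSWORD
        valueFrom:
          secretKeyRef:
            name: elasticsearch-es-elastic-user    # ECK-created: <cluster-name>-es-elastic-user
            key: elastic
    volumeMounts:
      - name: es-certs
        mountPath: /usr/share/logstash/config/certs
        readOnly: true
volumes:
  - name: es-certs
    secret:
      secretName: elasticsearch-es-http-certs-public   # ECK-created CA/HTTP certs
```

The pipeline then consumes `${ES_PASSWORD}` as in the snippet above. Multiply this by every credential, cert, and variable across 100 inputs and it gets unwieldy fast, and none of it covers the license.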
My question: Is this even doable? If so, is there an easy way to pass everything the Logstash job needs to it, similar to how the ECK operator passes these things to the Logstash resource?
TIA