As you can see, currently any new reporting job waits until the previous one has completed, which makes things quite slow. I also tried posting jobs as different Elastic users, thinking there might be a separate job queue per user. However, the queue appears to be global across all users: if I add jobs as a separate user on the same server as above, they have to wait until all 7 jobs created by the 'reporting' user have finished before processing starts.
Any ideas on how to make this asynchronous? Or at least a workaround to speed this up?
There is no way to run multiple reporting jobs at the same time; this is done to minimize the impact of reporting on the Kibana server. You could create multiple Kibana instances, though, and each instance will then have its own reporting queue, iirc.
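For illustration, here is a minimal sketch of what that multi-instance setup might look like with Docker Compose. Everything in it is an assumption for the example: the service names, ports, image tag, and the `elasticsearch` hostname are placeholders, not from this thread. Jobs submitted through `kibana-a` and `kibana-b` would land in separate per-instance queues and so could run in parallel.

```yaml
# Hypothetical sketch: two Kibana containers sharing one Elasticsearch
# cluster. Each Kibana process keeps its own reporting queue.
version: "3"
services:
  kibana-a:
    image: docker.elastic.co/kibana/kibana:6.5.0   # version is illustrative
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports:
      - "5601:5601"
  kibana-b:
    image: docker.elastic.co/kibana/kibana:6.5.0
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports:
      - "5602:5601"   # second instance exposed on a different host port
```

You would then send half the report-generation requests to port 5601 and half to 5602.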
Also, you can schedule reporting jobs with Watcher; that way you can run them overnight and have them all available in the morning.
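As a rough sketch of the Watcher approach: you copy the report's POST URL from Kibana's Share menu and wrap it in a watch with a schedule trigger and a webhook action. The watch ID, host, credentials, and the `...` in the path below are placeholders you would fill in from your own setup.

```json
PUT _watcher/watch/nightly_report
{
  "trigger": {
    "schedule": { "daily": { "at": "02:00" } }
  },
  "actions": {
    "generate_report": {
      "webhook": {
        "host": "kibana.example.com",
        "port": 5601,
        "method": "POST",
        "path": "/api/reporting/generate/...",
        "auth": {
          "basic": { "username": "reporting", "password": "<password>" }
        }
      }
    }
  }
}
```

The jobs still run one at a time, but since they fire overnight the serialization no longer matters by morning.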
Thanks for the reply.
This is definitely a consideration; however, running multiple instances of Kibana would be quite heavy. My container instance uses 264 MiB of memory while idling, and almost 1 GB of disk space.
Is there any way to isolate the reporting queue to run as its own containerised service?