Is bulk forecasting available in the Machine Learning module?

Hi Team,

Kindly let me know if I can get a bulk forecast of Metricbeat parameters like CPU, memory, and disk using the Machine Learning module of Kibana.

I am able to see only search-based forecasts.

Kibana version: 6.1.2
Elastic version: 6.1.2
Logstash Version: 6.1.2

Shridhar S Malagi


Yes, you should be able to run a forecast on Metricbeat parameters like CPU with the ML plugin in Kibana. Forecasting has been available since 6.1, both for single-metric jobs and for multi-metric jobs with 'split by' fields.

Do you see the 'Forecast' button when viewing the results of your jobs in the Single Metric Viewer? Here is an example from a multi-metric job I created, looking at CPU utilization:


Hi Peter,

Thanks for your reply. I am able to see the forecast option in the ML job I created for the CPU utilization metric.

My requirement is to get a bulk forecast of the CPU or RAM utilization values for multiple hostnames and send it by mail to the respective stakeholders for future capacity planning.

Currently I can only see a forecast for the selected host, and I cannot find an option for a bulk forecast across all hosts/servers.

Also, please help me understand where I can get the "confidence value" for the selected forecast. Currently I can see the confidence model as a yellow-colored area, but no values such as "95% confident".

Attaching the sample screenshot.

Shridhar S Malagi


Just to clarify your requirements: do you want a forecast for CPU/RAM utilization per host (which is what you already have, judging by the screenshot), or one forecast for all hosts combined? If you are interested in a forecast for CPU/RAM across all hosts, you would need to create a different job without the split on host/server.

The bounds in the forecast data, shown in the yellow shaded area, use a confidence interval of 95%. This value is fixed in our analytics and cannot currently be altered.

Note that all our results are stored in an Elasticsearch index, so you could search against it directly if, for example, you wanted to create a dashboard to use in a report.
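As a rough sketch of what such a direct search could look like (the job name `cpu_by_host` and forecast id `my_forecast_id` are hypothetical, and the `result_type`/`partition_field_value`/`forecast_prediction` field names follow the 6.x ML results conventions; check your own documents in `.ml-anomalies-*` before relying on them):

```python
import json

def forecast_results_query(job_id, forecast_id):
    """Build a search body for the per-host forecast values stored by ML.

    Forecast documents carry result_type "model_forecast"; the split
    field value (e.g. the hostname) appears in partition_field_value,
    and the predicted value with its 95% bounds is in
    forecast_prediction / forecast_lower / forecast_upper.
    """
    return {
        "size": 1000,
        "query": {
            "bool": {
                "filter": [
                    {"term": {"job_id": job_id}},
                    {"term": {"forecast_id": forecast_id}},
                    {"term": {"result_type": "model_forecast"}},
                ]
            }
        },
        # One block of rows per host, in time order
        "sort": [
            {"partition_field_value": {"order": "asc"}},
            {"timestamp": {"order": "asc"}},
        ],
    }

# Hypothetical ids; run this body against the .ml-anomalies-* index pattern
body = forecast_results_query("cpu_by_host", "my_forecast_id")
print(json.dumps(body, indent=2))
```

From the hits you could then group `forecast_prediction` by `partition_field_value` to build the per-host report.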



Hi, I would like to run one forecast for all the hosts combined, but the forecasted utilization values should be grouped by hostname.

If I have 20 servers, I will run one forecast, and the result should be the predicted utilization values for all the individual hosts, segregated/grouped by hostname:

Host1 Avg Utilization 89, predicted utilization for next one month : 98
Host2 Avg Utilization 98, predicted utilization for next one month : 119
Host3 Avg Utilization 85, predicted utilization for next one month : 102

Also, thanks for sharing the information regarding the confidence level.

Shridhar - if you have a multi-metric job (split per host), then invoking a forecast automatically runs predictions across all hosts. For example, here's a job that performs an analysis per country code (roughly 200 unique country codes in total):

Once the job has run and you run the forecast, you can see that individual country codes have their own forecast results. For example, for country code US:

versus the forecast for country code VN:

Hi richcollier,

It is close to my use case and expectation. But the challenge in your example is that I need to enter the hostnames (country codes) in the search box and then forecast them manually.

How can I schedule a forecast for all the hosts and get the forecast report in bulk?



I know it seems like you are only invoking forecasting on one series when you are in the Kibana UI, but in actuality, the forecast job bulk forecasts all of the individual time series present in the ML job. In fact, there is no way to forecast only one series in a multi-metric ML job.

You can also prove it to yourself by looking at the _forecast API endpoint:

There is no way to specify only one time series; the Kibana ML UI simply calls this endpoint.
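A minimal sketch of calling that endpoint yourself, for example from a scheduled script (the job name `cpu_by_host` is hypothetical, and the `_xpack/ml` path prefix follows the 6.x API; only `duration` is passed in the body):

```python
import json
import urllib.request

def build_forecast_request(job_id, duration="30d",
                           base_url="http://localhost:9200"):
    """Build a POST request for the 6.x ML _forecast endpoint.

    The forecast runs across every time series (every host) in the
    job; the API has no parameter to restrict it to a single series.
    """
    url = "{}/_xpack/ml/anomaly_detectors/{}/_forecast".format(base_url, job_id)
    body = json.dumps({"duration": duration}).encode("utf-8")
    # Request with a data payload defaults to the POST method
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

req = build_forecast_request("cpu_by_host", duration="30d")
# urllib.request.urlopen(req)  # requires a live 6.x cluster with ML enabled
print(req.full_url)
```

Running this on a schedule (e.g. from cron) and then querying the stored forecast results would give you the bulk report across all hosts.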

In other words, you are already bulk forecasting.

Hi richcollier,
Thanks for all this information.

It really helped me.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.