Hello everyone, I'm trying to build a solution to monitor a group of virtual machines. I have installed Metricbeat on these machines, and what I want to do is send their metrics to my ELK stack deployed on a Kubernetes cluster.
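To make the shipping side concrete, this is roughly the kind of output configuration I mean on each VM (the Elasticsearch hostname, user, and module settings below are just placeholders for my actual values):

```yaml
# metricbeat.yml (sketch) – ships system metrics from the VM to the
# Elasticsearch exposed by the Kubernetes cluster; values are placeholders
metricbeat.modules:
  - module: system
    metricsets: ["cpu", "memory", "filesystem"]
    period: 30s

# every event carries host.name, which is what any per-VM filtering
# in Kibana or in a role query would key on
output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  username: "metricbeat_writer"   # dedicated ingest user, not the VM owner's account
  password: "${ES_PWD}"           # injected from the Beats keystore or an env var
```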
The problem is that I want each of my virtual machines to see only its own metrics. I want to provide a Kibana dashboard specific to each VM's metrics.
What I'm asking is whether this is possible: do I need to install Kibana on each and every one of these machines, or can it be done through my Kibana on Kubernetes?
To restate it: I have ELK deployed on my Kubernetes cluster. I want to install Beats on a group of virtual machines and provide each VM with a Kibana dashboard showing only its own metrics. What I want to know is whether I should install Kibana on each VM, meaning the Beats send metrics to my Elasticsearch deployed on K8s and each VM then reads them back through its locally installed Kibana. I think this method consumes a lot of resources, so is it even possible or worth doing?
The other method is that each VM accesses the Kibana deployed on K8s with its own credentials: every VM gets unique credentials and can view only its own metrics, so I don't install Kibana on the VMs at all. Is that actually possible to do?
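To illustrate what I have in mind for the second option, assuming Elasticsearch security is enabled and the per-VM restriction is done with document level security on host.name (the role name, index pattern, and hostname below are placeholders; I assume the same role could also be created through the Kibana Roles UI or the security API instead of roles.yml):

```yaml
# roles.yml style sketch – a read-only role that only matches one VM's events;
# "vm01" and "metricbeat-*" are placeholders
vm01_metrics_viewer:
  indices:
    - names: [ "metricbeat-*" ]
      privileges: [ "read", "view_index_metadata" ]
      query: '{"term": {"host.name": "vm01"}}'
```

Each VM's user would then get a role like this (plus Kibana access to a space or dashboard scoped to that VM), so it only ever sees its own metrics.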