Hey all,
I am currently building a POC with Metricbeat. My goal is to scrape Prometheus metrics and send them over to the ELK Stack.
Currently I am running into an error that I cannot solve: "Unable to decode response from prometheus endpoint".
Basically I have a Prometheus server in the cloud. I try to access the Prometheus metrics via Metricbeat, which is running locally in a Docker container
(docker run --net=host -v ~/Metricbeat/metricbeat.yml:/usr/share/metricbeat/metricbeat.yml docker.elastic.co/beats/metricbeat:7.2.0)
with a custom YML that then sends the data to Elasticsearch and Kibana.
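The mounted metricbeat.yml looks roughly like the sketch below; the hosts, period, and Elasticsearch/Kibana addresses are placeholders, not my real values:

```yaml
metricbeat.modules:
  - module: prometheus
    metricsets: ["collector"]
    period: 10s
    # placeholder: in my setup this points either to the cloud Prometheus
    # (reached via sshuttle) or to the local mock server
    hosts: ["prometheus-host:9090"]
    metrics_path: /metrics

output.elasticsearch:
  hosts: ["localhost:9200"]   # placeholder

setup.kibana:
  host: "localhost:5601"      # placeholder
```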
I have set up a local mock server that exports the same data as the Prometheus in the cloud that I am having the issue with.
To check whether there is a problem with the structure of the data, I run curl from inside the Docker container where Metricbeat is running, first against the cloud Prometheus, then against the local Prometheus. To avoid running into the issue of too much data (https://github.com/elastic/beats/issues/11912), I added a query to get only a few results.
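The check looks roughly like this; the hostnames and the match[] filter are only examples, not the exact query I used:

```sh
# from inside the Metricbeat container, against the cloud Prometheus (reached via sshuttle)
curl -s 'http://cloud-prometheus:9090/federate?match[]=up'

# the same check against the local mock Prometheus
curl -s 'http://localhost:9090/federate?match[]=up'
```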
As you can see, the structure is exactly the same.
Also, when I point Metricbeat to the local Prometheus, it works just fine, as you can see here in Elasticsearch:
But when I point Metricbeat to the Prometheus in the cloud, it still gives the error instead of the metrics.
I can only think of a few possible causes now:
Bug in Metricbeat
Some sort of problem with sshuttle (I need it to connect to the server, but I can get the data when I curl the server from the Metricbeat Docker container, so I don't think it's the problem)