I have two VMs. One of them is running elasticsearch and kibana, the other is running metricbeat. I am trying to get the 2nd VM to send its data to the 1st. I thought that the only configuration was to put the IP of the 1st machine in the metricbeat.yml file.
output.elasticsearch:
  hosts: ["1stVMIP:9200"]
This doesn't seem to be getting the data to elasticsearch though. Is there other configuration I need to do? Maybe something on the elasticsearch side to let it accept data from the 2nd VM?
After some time working on the network configuration between my two VMs, I can now get a response from curl, but the data from metricbeat is still not showing up in Kibana.
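A quick way to confirm reachability from the Metricbeat VM is to curl the Elasticsearch HTTP port directly (here `1stVMIP` stands in for the first VM's IP, as in the config above):

```shell
# Run from the metricbeat VM. A JSON response containing
# "cluster_name" and version info means the port is reachable.
curl http://1stVMIP:9200
```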
After a bit of additional debugging I have finally gotten it to work. I will provide a summary of what I did in case anyone else has this issue:
After configuring the network between my VMs, ping would work, but curl would not. I found a source online that said to run "iptables -F", which flushes (clears) all firewall rules. At this point I could curl, but data was still not getting into Elasticsearch. I checked the Metricbeat logs and found a line saying "Ping request failed... server gave HTTP response to HTTPS client", so I commented the protocol setting back out in metricbeat.yml.
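For reference, the relevant part of metricbeat.yml now looks roughly like this (a sketch with `1stVMIP` as a placeholder; the key point is that the protocol line is commented out so Metricbeat talks plain HTTP):

```yaml
output.elasticsearch:
  hosts: ["1stVMIP:9200"]
  # protocol: "https"   # leave commented out unless Elasticsearch is actually serving HTTPS
```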
Also, at some point during my troubleshooting I changed network.host in the elasticsearch.yml file to the machine's IP. When I tried resetting it to the default of localhost, Kibana would say that login was disabled. I do not know why this is the case, but the system is now working with network.host set to the IP.
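One thing worth knowing here: when network.host is set to a non-loopback address, Elasticsearch applies its production bootstrap checks, which require discovery settings to be configured. A minimal elasticsearch.yml sketch for a one-node setup like this (again with `1stVMIP` as a placeholder) might look like:

```yaml
# elasticsearch.yml (sketch; 1stVMIP stands in for the first VM's IP)
network.host: 1stVMIP
http.port: 9200
# Binding to a non-loopback address triggers the production bootstrap checks;
# for a single-node cluster this satisfies the discovery requirement:
discovery.type: single-node
```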