Can anyone provide a reference on how to add Elasticsearch data nodes to an existing cluster using either of the following:
Ambari
the command line (CLI)
FYI, all nodes are already configured for the cluster, and Elasticsearch was recently installed on several nodes that did not have it. I don't know whether the master node automatically recognizes the new nodes after a restart of the service or whether the process is more involved. Perhaps Apache Metron has a unique configuration.
However, it should be as simple as starting another Elasticsearch process on another node with its config pointing at the rest of the cluster. I'm not sure why Metron would be doing anything special.
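To illustrate, here is a minimal sketch of what the new node's elasticsearch.yml might contain before you start the service. The cluster name, node name, and hostnames are placeholders, and the discovery setting depends on the Elasticsearch version you are running (discovery.zen.ping.unicast.hosts on 2.x–6.x, discovery.seed_hosts on 7.x and later):

```yaml
# elasticsearch.yml on the new node -- a sketch, not a drop-in file.
cluster.name: metron                 # must match the existing cluster's name exactly
node.name: es-data-04                # any unique name for the new node (placeholder)
node.master: false                   # data-only node (pre-7.9 style role flags)
node.data: true
network.host: 0.0.0.0                # or the node's routable address
# Point discovery at existing master-eligible nodes (hostnames are placeholders):
discovery.zen.ping.unicast.hosts: ["es-master-01", "es-master-02"]
```

Once the process starts with a matching cluster name and reachable discovery hosts, the master should admit the node and begin relocating shards to it.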
Ambari is HDP's cluster manager, and Apache Metron is packaged with the Hortonworks Data Platform. Within Ambari you can see the number of Elasticsearch data nodes configured, but there is no feature for adding an Elasticsearch data node the way you can add a generic DataNode to a typical Hadoop cluster.
Therefore my assumption is that you have to tweak some configuration file and perhaps restart the Elasticsearch service for it to recognize the node? That is what I am trying to get an answer to.
I thought about that. I was under the assumption that since this is an Elasticsearch configuration question, this would be the better forum. All of ELK is packaged and deployed with Metron, but if you think I should start with the Metron forum, then I will.
We aren't in control of how Metron deploys this on top of Hadoop, so while we can tell you how to add a node the standard way, it sounds like something custom is happening?
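For reference, the standard way comes down to starting the Elasticsearch service on the new node and confirming it has joined. A rough example follows; the hostname and port are placeholders, and if Ambari manages the Elasticsearch service for you, restart it through Ambari rather than systemd:

```sh
# Start (or restart) Elasticsearch on the new node, e.g. for a package install:
sudo systemctl start elasticsearch

# From any node, confirm the new data node appears in the cluster:
curl -s 'http://es-master-01:9200/_cat/nodes?v'

# Watch cluster health while shards rebalance onto the new node:
curl -s 'http://es-master-01:9200/_cluster/health?pretty'
```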
Also FYI we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out!