I have worked out everything on my local machine: parsed my logs with Logstash, stored them in Elasticsearch, and built dashboards for my data in Kibana.
Now I'm trying to mock up a production scenario where Logstash is installed on one server and Elasticsearch on another.
Currently, the only thing I did was specify in the Logstash output that the host should be the IP address of the Elasticsearch server:
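Something roughly like this (the IP is just a placeholder, and the exact option name may differ between Logstash versions):

```
output {
  elasticsearch {
    host => "192.168.1.50"    # placeholder: IP address of the Elasticsearch server
  }
}
```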
and, in elasticsearch.yml on the Elasticsearch server, I changed network.host to the IP address of the Logstash server.
I don't think it works. Can someone tell me what other settings I need to change, or point me to some guides on setting this up? Thanks a million in advance.
and, in elasticsearch.yml on the Elasticsearch server, I changed network.host to the IP address of the Logstash server.
No, don't do that. The network.* settings must point to addresses held by interfaces on the machine running Elasticsearch. You don't need to do anything special to have Logstash connect to ES. Just make sure Logstash can connect to port 9300 on the ES machine (or port 9200 if you're using HTTP, which I'd recommend anyway) and that they have the same cluster name configured.
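For example, if the Elasticsearch machine's own address were 192.168.1.50 (a placeholder), its elasticsearch.yml would look something like this:

```
# elasticsearch.yml on the Elasticsearch server
network.host: 192.168.1.50   # an address that belongs to THIS machine (or 0.0.0.0 to bind to all interfaces)
cluster.name: my-cluster     # placeholder; only needs to match Logstash's setting when using the node/transport protocol
```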
Should I have Elasticsearch and Logstash on different machines when deploying the ELK stack in production (as I've read somewhere on the internet), or can they be installed on the same server with the following Logstash output:
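Something like this (a sketch; the option names depend on the Logstash version):

```
output {
  elasticsearch {
    host => "localhost"     # Elasticsearch running on the same server
    protocol => "http"      # use the HTTP interface on port 9200
  }
}
```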