Currently, I am using Filebeat and Metricbeat to get logs from Kubernetes into Elastic Cloud. I would now like to test this setup with Logstash. Can somebody guide me on how to pull the data? I am aware of setting up the pipeline config in the cloud console, but I would like to know whether I need to change the output setting in the config file, i.e. filebeat -> logstash -> cloud.
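What I have in mind is something like this in filebeat.yml (a rough sketch; the hostname is a placeholder):

    # Disable the direct Elastic Cloud output...
    # cloud.id: "..."
    # cloud.auth: "..."

    # ...and send events to Logstash's Beats port instead
    output.logstash:
      hosts: ["my-logstash-host:5044"]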
Alternatively, is there any way to use Logstash alone to get input from Kubernetes directly?
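As far as I know Logstash has no native Kubernetes input, so I imagine this would mean running Logstash on each node and tailing the container log files directly. A rough sketch, assuming a Docker runtime where the container logs are readable as JSON lines under /var/log/containers:

    input {
      file {
        path => "/var/log/containers/*.log"   # node-level container logs
        codec => "json"                        # Docker writes one JSON object per line
      }
    }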
I would like to find the best way to log the data, so I am testing all the available features.
I assume that in the pipeline section provided in the cloud console, we should enter the required input and output. Then, from the terminal, execute the pipeline with the following command:
bin/logstash -f ###.conf
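For example, I assume ###.conf would contain something roughly like this (the cloud_id and cloud_auth values are placeholders), listening for Filebeat and forwarding to Elastic Cloud:

    input {
      beats {
        port => 5044        # Filebeat's output.logstash points here
      }
    }

    output {
      elasticsearch {
        cloud_id   => "<deployment-cloud-id>"    # placeholder
        cloud_auth => "elastic:<password>"       # placeholder
        index      => "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"
      }
    }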
But I am not sure how to redirect Filebeat to use Logstash, or how to execute the pipeline from there.
Trying to get input from Filebeat. I wanted to confirm: is the host address provided in the input section the Elasticsearch address? And in the output section, is the generalized index format correct, or should we specify the absolute index name?
Also, the pipeline is saved in the cloud, so how do I proceed from there?
The host in the input section is not the Elasticsearch address; it is the address your Logstash server listens on for incoming Beats connections. You need to make sure Filebeat is pointed at your Logstash server and that Logstash is listening for it.
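To illustrate with placeholder values: in the Logstash pipeline, the Beats input binds to an interface on the Logstash machine, while in filebeat.yml the hosts entry points at that machine:

    # Logstash side -- listen for incoming Beats connections
    input {
      beats {
        host => "0.0.0.0"   # bind address on the Logstash machine, not Elasticsearch
        port => 5044
      }
    }

    # Filebeat side (filebeat.yml) -- ship to the Logstash server
    output.logstash:
      hosts: ["logstash.example.com:5044"]   # placeholder hostname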
@mitchdavis OK, in that case, can I install Logstash in Elastic Cloud and point it at the Elasticsearch host address? And since I would configure the pipeline there, how can I stash my first event?
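So far the only test I know is the "stash your first event" example from the Logstash docs, which just pipes stdin to stdout on the machine where Logstash is installed:

    bin/logstash -e 'input { stdin { } } output { stdout {} }'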