I have an Elasticsearch 6.2 cluster (3 master nodes, 6 data nodes) up and running. I'd like to send data from my S3 bucket to Elasticsearch so that I can view it in Kibana dashboards. I know I can do this via Logstash or AWS Lambda, and I'm trying with Logstash. Is there any documentation for achieving this (sending data from S3 to Elasticsearch via Lambda/Logstash)?
I came to know that I can install Logstash on a different server than Elasticsearch (correct me if I'm wrong). When I tried to install Logstash on a CentOS machine by entering 6.x in /etc/yum.repos.d/logstash.repo, Logstash 6.4.2 was installed. If I change it to 6.2, I get a 404 Not Found error. Can you please help me with this?
The default distribution of most of our stack products changed with the 6.3 release. When looking at the differences between, say, 6.2 and 6.5, you absolutely need to check the documentation for the specific version rather than just editing the URL. The same applies to the yum repository: Elastic publishes it per major series (6.x), so there is no 6.2 repository, which is why changing the path gives you a 404. To stay on an older 6.x release, keep the 6.x repo and pin the package version at install time instead.
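As a sketch, the repo file would stay on the 6.x path (per Elastic's packaging docs), and you would pin the minor version when installing, e.g. `sudo yum install logstash-6.2.4` (the exact patch version you want is up to you):

```
# /etc/yum.repos.d/logstash.repo -- the repo covers the whole 6.x series
[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
```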
In terms of documentation for using Logstash to read from S3, the answer is absolutely yes: searching for "logstash s3 input" turns up the docs for the S3 input plugin. Logstash uses plugins as inputs, filters, and outputs. To read from something, you use an input, which is why you'd want the S3 input here.
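To make that concrete, a minimal pipeline sketch reading from S3 and writing to Elasticsearch might look like the following. The bucket name, region, hosts, and index pattern are all placeholders you would replace with your own values:

```
# Hypothetical pipeline config -- bucket, region, hosts, and index are assumptions
input {
  s3 {
    bucket => "my-bucket"       # your S3 bucket name
    region => "us-east-1"       # the bucket's region
    # credentials can also come from an instance role or the usual AWS config
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # one of your Elasticsearch nodes
    index => "s3-logs-%{+YYYY.MM.dd}"    # daily index naming pattern
  }
}
```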