Logstash failover s3

I was planning to pull ELB logs from S3 using Logstash, and I want to use multiple Logstash instances for failover. Is there any way to configure multiple Logstash instances to pull logs from an S3 bucket with load balancing, or some other failover setup?

This is tricky to do without a single point of failure, since Logstash's s3 input doesn't support sharing state between multiple Logstash instances.

What, exactly, do you want to accomplish? What's the scenario?

I was planning to send ELB logs to S3 and from there to Elasticsearch using Logstash.
My concern is what happens if a Logstash node fails or something happens to the logs at that time. I was looking for high availability, e.g. using two Logstash nodes so that if node 1 fails we still have node 2.
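For reference, a single node in the pipeline described above could look like the sketch below. The bucket name, region, prefix, sincedb path, and Elasticsearch host are placeholders, not values from this thread; `ELB_ACCESS_LOG` is the grok pattern for classic ELB access logs shipped with Logstash's standard patterns.

```
input {
  s3 {
    bucket       => "my-elb-logs"                       # assumption: your ELB log bucket
    region       => "us-east-1"
    prefix       => "AWSLogs/"
    # Pin the state file to a known path so it can be backed up or
    # handed over to a standby node.
    sincedb_path => "/var/lib/logstash/sincedb-elb"
    interval     => 60
  }
}
filter {
  grok {
    match => { "message" => "%{ELB_ACCESS_LOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "elb-logs-%{+YYYY.MM.dd}"
  }
}
```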

Okay. Well, as I said, the s3 input can't share state between multiple Logstash instances, so whatever you do you won't really get any help from Logstash itself. Since logs are naturally buffered in S3, I wouldn't worry too much about Logstash going down. Just prepare routines and/or scripts to sync state from one Logstash instance to another, and be prepared to fire up a new instance that can start with the state of the old, dead instance.
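The state-sync routine could be as small as the sketch below: copy the s3 input's sincedb file (its progress marker) to storage the standby node can reach, and start the standby pointing at the copy. The paths and the shared-directory mechanism (NFS, rsync target, etc.) are assumptions; set `sincedb_path` explicitly in the s3 input so both nodes agree on the file's location.

```shell
#!/bin/sh
# Sketch: snapshot the s3 input's sincedb file to shared storage so a
# standby Logstash node can resume where a failed node left off.

snapshot_sincedb() {
  sincedb="$1"   # e.g. /var/lib/logstash/sincedb-elb (assumed path)
  shared="$2"    # e.g. an NFS mount both nodes can read (assumed)
  mkdir -p "$shared"
  # Copy to a temp name first, then rename, so the standby never
  # picks up a half-written file.
  cp "$sincedb" "$shared/sincedb-elb.tmp" && \
    mv "$shared/sincedb-elb.tmp" "$shared/sincedb-elb"
}

# Example run against throwaway paths. In production you would cron
# this on the active node, and start the standby with the copied file
# as its sincedb_path.
demo_db=$(mktemp)
echo "example-sincedb-state" > "$demo_db"
demo_shared=$(mktemp -d)
snapshot_sincedb "$demo_db" "$demo_shared"
cat "$demo_shared/sincedb-elb"
```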

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.