Best practices in Logstash

Hi All,

My current setup is as below,

Amazon S3 -> Logstash -> Elasticsearch -> Kibana

In Logstash, we poll files from 4 S3 buckets, one per environment. We expect a data load of at most 3 to 4 GB per day across all 4 buckets.

So my question is: which of the following configurations is the better practice?

  1. Creating a single Logstash conf file whose input section polls all 4 S3 buckets, e.g.:

    input {
      s3 { bucket => "bucket-1" }
      s3 { bucket => "bucket-2" }
      s3 { bucket => "bucket-3" }
      s3 { bucket => "bucket-4" }
    }
    filter {
      # same filter conditions for all buckets
    }
    output {
      # same Elasticsearch cluster
    }

I will be creating indices on a monthly basis. How good will performance be in this case?
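If you go with the single-pipeline option, one way to keep the environments separable inside monthly indices (a sketch only; the host, the `environment` field name, and the index pattern are my assumptions, not part of your setup) is to tag each `s3` input and interpolate that tag into the index name:

    input {
      # add_field is a standard input option; "environment" is an assumed field name
      s3 { bucket => "bucket-1" add_field => { "environment" => "prod" } }
      s3 { bucket => "bucket-2" add_field => { "environment" => "test" } }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]            # assumed host
        index => "logs-%{[environment]}-%{+YYYY.MM}"  # one index per environment per month
      }
    }

The `%{+YYYY.MM}` sprintf date pattern rolls the index monthly based on each event's `@timestamp`.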

  2. Or do I need to create 4 config files, one per bucket, and define a pipeline for each as below? In this case I would use 4 indices, one per environment. How would performance compare between the two approaches?

    - pipeline.id: Prod
      path.config: logstash_prod.conf
      queue.type: persisted
    - pipeline.id: TEST
      path.config: logstash_test.conf
      queue.type: persisted
    - pipeline.id: UAT
      path.config: logstash_uat.conf
      queue.type: persisted
    - pipeline.id: Dev
      path.config: logstash_dev.conf
      queue.type: persisted
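For the multi-pipeline option, each conf file is self-contained. A minimal sketch of what `logstash_prod.conf` could look like (the bucket name, region, host, and index name here are assumptions for illustration):

    input {
      s3 {
        bucket => "bucket-prod"   # assumed bucket name
        region => "us-east-1"     # assumed region
      }
    }
    filter {
      # same filter conditions as the other environments
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # assumed host
        index => "logs-prod-%{+YYYY.MM}"     # per-environment monthly index
      }
    }

With `queue.type: persisted`, each pipeline also gets its own on-disk queue, so a slow or backlogged bucket cannot back-pressure the other environments.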

Please suggest which is the best practice.
