Logstash output flow to ES stops when using the pipeline-to-pipeline isolator pattern and one output fails


(Wasserman) #1

I followed the Logstash Output Isolator Pattern to push data to multiple Elastic stacks.

Logstash 6.3.2 is pushing to ES 6.3.2 and ES 6.2.3. The ES 6.2.3 cluster is production and we want to add HA while migrating to ES 6.3.2. The final step will be to upgrade the original cluster to ES 6.3.2.

In the representative configuration below, site2 is the ES 6.3.2 cluster in the same geographical region as Logstash. The ES 6.2.3 cluster is site1. Site1 had a 30-minute network outage on two nodes. Logstash logged many errors during this time, but they appeared to relate only to those two nodes. Site2 was healthy, yet it received no data during the outage. It seems the failing pipeline was impacting the healthy one. I know this feature is beta, but this pipeline mechanism was supposed to help prevent this exact problem.

Am I using this isolator pattern properly? Should I make any adjustments to my config? Is this a bug?

Thanks!

pipelines.yml:

-   pipeline.id: out_site2
    queue.type: persisted
    path.config: "/etc/logstash/conf.d.output/output_site2.conf"
    pipeline.batch.size: 500
-   pipeline.id: out_site1
    queue.type: persisted
    path.config: "/etc/logstash/conf.d.output/output_site1.conf"
    pipeline.batch.size: 500
-   pipeline.id: ingest
    queue.type: persisted
    path.config: "/etc/logstash/conf.d/*.conf"

Ingest pipeline output:

output {
  pipeline {
    send_to => [out_site1, out_site2]
  }
}

/etc/logstash/conf.d.output/output_site2.conf:

input {
  pipeline {
    address => out_site2
  }
}

output {
  elasticsearch {
     …
     hosts => ["es1b:9201", "es2b:9201", "es3b:9201"]
     …
  }
}

/etc/logstash/conf.d.output/output_site1.conf:

input {
  pipeline {
    address => out_site1
  }
}

output {
  elasticsearch {
     …
     hosts => ["es1:9201", "es2:9201", "es3:9201"]
     …
  }
}
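My working assumption about why this happened (based on how persistent queues apply back-pressure, not on anything confirmed for my setup): each downstream pipeline buffers events in its own persisted queue, but once that queue reaches queue.max_bytes (1024mb by default), its pipeline input stops accepting events, and the ingest pipeline's pipeline output then blocks for all addresses, starving the healthy site too. A sketch of raising the buffer on the site1 output pipeline so site2 can keep flowing longer during a site1 outage (the 4gb value is illustrative, not a recommendation):

```yaml
# pipelines.yml -- hypothetical variant of the config above.
# queue.max_bytes caps how much each persisted queue can buffer on disk
# before its pipeline input blocks; the default is 1024mb.
- pipeline.id: out_site2
  queue.type: persisted
  path.config: "/etc/logstash/conf.d.output/output_site2.conf"
  pipeline.batch.size: 500
- pipeline.id: out_site1
  queue.type: persisted
  queue.max_bytes: 4gb   # illustrative: buffer ~4 GB of events while site1 is unreachable
  path.config: "/etc/logstash/conf.d.output/output_site1.conf"
  pipeline.batch.size: 500
- pipeline.id: ingest
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/*.conf"
```

If that assumption is right, a larger queue only extends the isolation window; once the site1 queue fills, back-pressure still propagates to the ingest pipeline and the coupling returns.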

(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.