Consuming messages from Rabbit while output is not responsive


I've noticed that when using RabbitMQ (with ACKing) as the input and Elasticsearch as the output in Logstash, if the output becomes unresponsive for some reason (being restarted, connection loss), Logstash still consumes messages from RabbitMQ, and those messages are lost.

Is there any possibility to prevent that situation?

My pipeline config:

input {
  rabbitmq {
    id => "rabbit01"
    host => "${RABBITMQ_HOST}"
    port => "${RABBITMQ_PORT:5672}"
    user => "${RABBITMQ_USER}"
    password => "${RABBITMQ_PASSWORD}"
    passive => true
    queue => "${RABBITMQ_QUEUE_NAME_SAVE}"
  }
}

output {
  elasticsearch {
    hosts => ["${ELASTICSEARCH_URI}"]
    index => "my_index"
    document_id => "%{[gid]}"
    doc_as_upsert => true
  }
}

Plugins lists:

bash-4.2$ bin/logstash-plugin list --verbose | grep -E 'elastic|rabbit'
logstash-filter-elasticsearch (3.6.0)
logstash-input-elasticsearch (4.3.2)
logstash-input-rabbitmq (6.0.3)
logstash-output-elastic_app_search (1.0.0)
logstash-output-elasticsearch (10.1.0)
logstash-output-rabbitmq (5.1.1)

Thank you for your responses.

Persistent queues might help. With a persistent queue, Logstash buffers events to disk between the input and the rest of the pipeline, so when the Elasticsearch output is down the pipeline applies back-pressure and events survive a Logstash restart instead of being lost in memory.
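For reference, enabling the persistent queue is done in logstash.yml (or per-pipeline in pipelines.yml). A minimal sketch; the size limit here is illustrative, not a recommendation:

```yaml
# logstash.yml
# Switch the internal queue from in-memory to disk-backed.
queue.type: persisted
# Cap the on-disk queue; once full, back-pressure propagates to the
# rabbitmq input, which stops consuming instead of dropping events.
queue.max_bytes: 4gb
```

Note that a persistent queue protects against output outages and Logstash restarts, but events already handed to the queue are ACKed to RabbitMQ, so durability then depends on the queue's disk rather than on the broker.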

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.