Pipeline terminated {"pipeline.id"=>".monitoring-logstash"} after processing existing docs in the index

Hi! I'm running Elasticsearch and Logstash locally, trying to set up a pipeline that processes all ongoing events from Elasticsearch and writes them to stdout (as a POC). However, after the dockerized Logstash comes up, it processes all existing documents and then shuts down.
Is it possible to keep it alive so it processes all upcoming documents? I saw that the plugin supports a schedule option. Is that basically long polling my index? Can it "catch" all new documents instead, in a listener mode?
Here is my pipeline:

input {
  elasticsearch {
    hosts => ["docker.for.mac.localhost:9200"]
    index => "test_index"
  }
}
output {
  stdout { codec => rubydebug }
}
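
For reference, the elasticsearch input plugin's schedule option re-runs the query on a cron-style interval rather than running once and exiting; it polls, it does not listen for pushes. A minimal sketch of that, assuming the documents carry a @timestamp field so each run can be limited to a recent window (the one-minute window here is illustrative and should match the schedule interval):

input {
  elasticsearch {
    hosts => ["docker.for.mac.localhost:9200"]
    index => "test_index"
    # Re-run the query every minute instead of exiting after one pass
    schedule => "* * * * *"
    # Assumes a @timestamp field exists; fetch only documents from the last minute
    query => '{ "query": { "range": { "@timestamp": { "gte": "now-1m" } } } }'
  }
}
output {
  stdout { codec => rubydebug }
}

Because this is polling, documents can be duplicated or missed at the window boundaries, so it approximates, rather than replaces, a true listener mode.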
