How to make a Logstash pipeline run continuously (Elasticsearch input plugin)

Hey Folks,

I'm running a Logstash pipeline that reads from Elasticsearch and writes the documents to an S3 bucket.

input {
  elasticsearch {
    ssl => true
    hosts => ["<Redacted>"]
    user => "<Redacted>"
    password => "<Redacted>"
    index => "<Redacted>"
    query => '{ "query": { "query_string": { "query": "<Redacted>*" } } }'
  }
}
output {
  s3 {
    aws_credentials_file => "/etc/path/toFile/credentials.yml"
    region => "<Redacted>"
    bucket => "<Redacted>"
    additional_settings => {
      force_path_style => true
      follow_redirects => false
    }
    prefix => "%<Redacted>"
  }
}

When I run this, it processes all the documents in the index (an index alias) and then terminates.
I want to make sure this runs continuously, so that all the new docs that go into the index get re-written to s3. How can I achieve this?


You can use the schedule option to re-run the query periodically. However, the plugin does not keep track of state, so each run fetches the complete result set again and sends it to the output.
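As a minimal sketch, the schedule option takes a cron-style expression; the every-minute value below is just an example, and the other settings stand in for your redacted ones:

input {
  elasticsearch {
    hosts => ["<Redacted>"]
    index => "<Redacted>"
    query => '{ "query": { "query_string": { "query": "<Redacted>*" } } }'
    # Cron syntax: re-run the query at the start of every minute.
    # Without this option the input runs once and the pipeline exits.
    schedule => "* * * * *"
  }
}

Note that every scheduled run will re-emit all matching documents, so downstream you will get duplicates unless the query itself is narrowed (for example, to a recent time window).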

@Badger, is there a better way to do this?
I mean, to continuously ingest only the new events into S3.

I do not know of one.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.