MongoDB to Elasticsearch synchronisation using Logstash JDBC Input plugin

I have defined several pipelines; each pipeline maps one MongoDB collection to one Elasticsearch index, i.e. every MongoDB collection gets its own index in Elasticsearch.
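
For context, the per-collection layout is roughly the pipelines.yml below, with one configuration file per collection (the pipeline ids and paths are illustrative placeholders, not my real ones):

# pipelines.yml - one Logstash pipeline per MongoDB collection
- pipeline.id: activity-feed
  path.config: "/etc/logstash/conf.d/activity_feed.conf"
- pipeline.id: user-notifications
  path.config: "/etc/logstash/conf.d/user_notifications.conf"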

All my pipelines look like the following (only the collection and index names change). When I run them, I get only one document in each index. What I want is for every document in each MongoDB collection to end up in its Elasticsearch index as a separate document, keeping the same _id as the MongoDB document:

input {
  jdbc {
    jdbc_driver_library => "${MONGO_DB_DRIVER_PATH}"
    jdbc_driver_class => "${MONGO_DB_DRIVER_CLASS}"
    jdbc_connection_string => "${MONGO_DB_URL}"
    jdbc_user => "${MONGO_DB_USER}"
    jdbc_password => "${MONGO_DB_PASS}"
    schedule => "${MONGO_DB_SCHEDULE}"
    statement => "db.activityFeed.find({},{'_id': false});"
    record_last_run => true
    last_run_metadata_path => "${PLACEHOLDER_DB_DIR}${PLACEHOLDER_DB_NAME}"
  }
}

filter {
  mutate {
    copy => { "_id" => "[@metadata][_id]" }
    remove_field => ["_id"]
  }
}

output {
  elasticsearch {
    index => "activity_feed"
    hosts => "${ELASTIC_SEARCH_HOST}"
    doc_as_upsert => true
    document_id => "%{[@metadata][_id]}"
  }
  stdout { codec => rubydebug }
}
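
To isolate the _id handling, the sketch below runs the same mutate and output sections against a single hand-written event instead of the jdbc input; the generator input, the sample ObjectId value, and the metadata option on rubydebug are stand-ins I added for testing, not part of my real pipelines. It assumes the query result actually carries an _id field for the filter to copy:

input {
  # Stand-in for the jdbc input: emits one hypothetical event whose _id
  # mimics a MongoDB ObjectId, so the filter and output can be tested alone.
  generator {
    count => 1
    codec => "json"
    message => '{"_id":"507f1f77bcf86cd799439011","activity":"sample"}'
  }
}

filter {
  mutate {
    # Stash the Mongo _id in @metadata so it can be reused as the document id
    # without being indexed as a regular field, then drop it from the event.
    copy => { "_id" => "[@metadata][_id]" }
    remove_field => ["_id"]
  }
}

output {
  elasticsearch {
    hosts => "${ELASTIC_SEARCH_HOST}"
    index => "activity_feed"
    # Reuse the Mongo _id as the Elasticsearch _id so every collection
    # document maps to exactly one index document.
    document_id => "%{[@metadata][_id]}"
  }
  # metadata => true makes the copied _id visible in the console output.
  stdout { codec => rubydebug { metadata => true } }
}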
