Reading new data from Elasticsearch with Logstash and sending it to RabbitMQ

Hi, I want to push every new document added to an Elasticsearch index into a RabbitMQ queue, ideally within a second of it being indexed.
I tried using Logstash for this, but for some reason it sends all the data on every scheduled run, not just the new documents.
I saw that you can add a timestamp field and query on it to fetch only the new data, but I don't understand how to make every new document get that field automatically when it is added to the index.
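One way to stamp every incoming document server-side is an Elasticsearch ingest pipeline with a `set` processor that copies `_ingest.timestamp`, attached to the index as its `default_pipeline`. This is a sketch, not from the original post: the pipeline name `add-timestamp` and the field name `event_time` are names I made up for illustration.

```shell
# Create an ingest pipeline that writes the ingest time into event_time
curl -X PUT "http://localhost:9200/_ingest/pipeline/add-timestamp" \
  -H 'Content-Type: application/json' -d'
{
  "processors": [
    { "set": { "field": "event_time", "value": "{{_ingest.timestamp}}" } }
  ]
}'

# Apply it automatically to every document indexed into test_index
curl -X PUT "http://localhost:9200/test_index/_settings" \
  -H 'Content-Type: application/json' -d'
{ "index.default_pipeline": "add-timestamp" }'
```

After this, documents indexed into `test_index` get `event_time` without the indexing client having to set it.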

Here is the Logstash config:

input {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "test_index"
    docinfo => true
    codec => json
    schedule => "* * * * *"
  }
}

output {
  rabbitmq {
    host => "localhost"
    port => 5672  # RabbitMQ's default AMQP port
    exchange => "test"
    exchange_type => "direct"
    key => ""
    durable => true
    persistent => true
    user => "guest"
    password => "guest"
  }
}
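Assuming each document carries an ingest-time timestamp field (here called `event_time`, an assumed name), the elasticsearch input's `query` option can restrict each scheduled run to recently indexed documents. A minimal sketch, with a one-minute window matching the cron schedule:

```
input {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "test_index"
    docinfo => true
    schedule => "* * * * *"
    # Only fetch documents stamped within the last minute (the schedule interval)
    query => '{ "query": { "range": { "event_time": { "gte": "now-1m" } } } }'
  }
}
```

Note that the `schedule` option is cron-like, so the input polls at fixed intervals rather than streaming per second; the query window should be at least as wide as the polling interval to avoid missing documents.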
