Hi,
I am trying to fetch data from Elasticsearch, add some fields, and send it back to Elasticsearch in a new index.
I wanted Logstash to listen to Elasticsearch continuously through the elasticsearch input plugin, but I figured out that's not possible.
Now I have a cron job that runs Logstash every minute, but each run sends all the data again from beginning to end.
Is there a way to make it remember the last document it sent?
In the file input plugin we have start_position, sincedb_path and ignore_older, which can be set to do this.
Are there any options like these in the elasticsearch input plugin that would make Logstash remember the last document and push only the newer ones to Elasticsearch?
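The closest thing I have found so far is the schedule option together with a time-bounded query, roughly like this (just a sketch of what I mean; using msgReceivedTimestamp for the range and a one-minute window are guesses on my part, assuming that field is mapped as a date):

input {
  elasticsearch {
    hosts    => ["dxvrfpesrch001:9200"]
    user     => "user"
    password => "password"
    index    => "tradetech-ctp"
    # let Logstash re-run the query every minute instead of an external cron
    schedule => "* * * * *"
    # only fetch documents from the last minute, so each run skips older data
    query    => '{ "query": { "range": { "msgReceivedTimestamp": { "gte": "now-1m" } } } }'
  }
}

But a fixed window like that can still miss or duplicate documents if a run is delayed, so it is not really the same as sincedb_path.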
Our config looks like this:
input {
  elasticsearch {
    hosts    => ["dxvrfpesrch001:9200"]
    user     => "user"
    password => "password"
    index    => "tradetech-ctp"
  }
}
filter {
  # parse the original JSON payload
  json {
    source => "message"
  }
  # latency between the source and receive timestamps
  ruby {
    code => "event.set('Latency', event.get('sourceTimestamp') - event.get('msgReceivedTimestamp'))"
  }
  # derive unixdate (milliseconds) from timeStamp
  ruby {
    code => "event.set('unixdate', event.get('timeStamp').to_i / 1000000)"
  }
  date {
    match    => [ "unixdate", "UNIX_MS" ]
    timezone => "UTC"
  }
}
output {
  elasticsearch {
    hosts    => ["dxvrfpesrch001:9200"]
    user     => "user"
    password => "password"
    index    => "tradetech-ctp-seq"
  }
  stdout {
    codec => rubydebug
  }
}
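The only other workaround I can think of is to at least make the re-runs idempotent by reusing the original document id, so repeated runs overwrite documents instead of duplicating them (again only a sketch; I have not checked which docinfo_target default my plugin version uses):

input {
  elasticsearch {
    hosts    => ["dxvrfpesrch001:9200"]
    user     => "user"
    password => "password"
    index    => "tradetech-ctp"
    # copy the _index/_type/_id of each source document into metadata
    docinfo        => true
    docinfo_target => "[@metadata][doc]"
  }
}
output {
  elasticsearch {
    hosts    => ["dxvrfpesrch001:9200"]
    user     => "user"
    password => "password"
    index    => "tradetech-ctp-seq"
    # reuse the source id so re-processing the same document just overwrites it
    document_id => "%{[@metadata][doc][_id]}"
  }
}

That would avoid duplicates in tradetech-ctp-seq, but it still re-reads and re-processes everything on every run, which is what I am trying to avoid.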
Thank you