I want to back-fill my old data by importing my existing log file as a new file input, like so:
input {
  file {
    path => [ '/path_local_to_es/file' ]
    start_position => 'beginning'
  }
}
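One thing worth noting for the back-fill (this is a general property of the file input, not something from my running config): the file input records its read position in a sincedb file, so `start_position => 'beginning'` only takes effect for files Logstash has never seen before. A common workaround for a one-off re-read is pointing sincedb at a throwaway path:

```
input {
  file {
    path => [ '/path_local_to_es/file' ]
    start_position => 'beginning'
    # don't persist the read position, so the file is re-read
    # from the start on this back-fill run
    sincedb_path => '/dev/null'
  }
}
```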
AND
Continue to capture the tail of the very same file as it's actively being written, shipping the events through a Redis messaging queue.

On the data source:
input {
  file {
    path => [ '/path_local_to_source/file' ]
    start_position => 'end'
  }
}
filter {
...
}
output {
  redis {
    host => 'queue-to-elasticsearch'
    data_type => 'list'
    key => 'logstash'
  }
}
On the Elasticsearch host:
input {
  redis {
    host => '127.0.0.1'
    data_type => 'list'
    key => 'logstash'
  }
}
output {
  elasticsearch {
    host => '127.0.0.1'
  }
}
How do I combine the two so that I can back-fill all of the data I want from the file without duplicating the data that's already being ingested through the Redis queue? I'm OK with stopping the queue for a moment, but even then, how would I prevent losing data between the moment my local copy of the log file reaches EOF and the moment the Redis queue is turned back on?
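One idea I've been considering (a sketch only; the fingerprint filter and `document_id` option are assumptions about my Logstash/Elasticsearch versions, not part of my current setup): give every event a deterministic ID derived from its content, so that an event arriving twice, once via the back-fill and once via Redis, just overwrites the same Elasticsearch document instead of creating a duplicate. Something like:

```
filter {
  fingerprint {
    # hash the raw log line so identical lines get identical IDs
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA1"
    key => "dedupe-key"   # hypothetical HMAC key
  }
}
output {
  elasticsearch {
    host => '127.0.0.1'
    # use the fingerprint as the document ID, making ingestion idempotent
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

But I'm not sure whether this is the idiomatic way to handle the overlap window, or whether there's a cleaner approach on the queue side.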
Thanks.