I would like to move from Logstash to Filebeat as my log shipper.
I'm currently using the Logstash file input plugin to collect logs; Filebeat would do the same job, sending everything to a centralized Logstash shipper before it is written to Elasticsearch.
The problem is that if I shut down Logstash and start a fresh Filebeat instance in its place, Filebeat starts reading from the beginning of each file, producing duplicate logs in Elasticsearch.
I could compute a hash of the log content on the logstash-shipper side and use it as the Elasticsearch document_id to avoid duplicates, but I have to admit I'd prefer a simpler solution.
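For reference, the hash-based deduplication I'm describing would look roughly like this with the Logstash fingerprint filter (a sketch; field names are illustrative, and depending on the plugin version the SHA methods may require a `key` option):

```
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}
output {
  elasticsearch {
    # reindexing the same line overwrites the same document instead of duplicating it
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```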
Would you have any idea how to "bootstrap" a Filebeat instance from the Logstash file cursor (the sincedb file), maybe?
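To illustrate what I mean, here is a rough sketch of converting sincedb entries into a Filebeat registry. This is full of assumptions: it assumes the newer sincedb format that records the path as the last field (`inode dev_major dev_minor position timestamp path`), and the Filebeat 6.x JSON-array registry format with `FileStateOS` inode/device fields; both formats are version-dependent, so please correct me if the layout differs on your versions.

```python
import json
import os


def sincedb_to_registry(sincedb_text):
    """Convert Logstash file-input sincedb lines into a Filebeat 6.x-style
    registry list (assumed format; verify against your Filebeat version).

    Assumed sincedb line layout (newer logstash-input-file versions):
        inode dev_major dev_minor position timestamp path
    Older sincedb lines carry no path, so they are skipped here: without
    the path we cannot build a registry entry.
    """
    entries = []
    for line in sincedb_text.strip().splitlines():
        parts = line.split()
        if len(parts) < 6:
            continue  # old sincedb format: no path recorded
        inode, dev_major, dev_minor, position = parts[0], parts[1], parts[2], parts[3]
        path = parts[5]
        entries.append({
            "source": path,
            "offset": int(position),
            "ttl": -1,
            "type": "log",
            "FileStateOS": {
                "inode": int(inode),
                # Filebeat stores a single device number; sincedb splits it
                # into major/minor, so recombine them.
                "device": os.makedev(int(dev_major), int(dev_minor)),
            },
        })
    return entries


if __name__ == "__main__":
    sample = "2977814 0 51713 1708 1509217550.0 /var/log/app.log"
    # With Filebeat stopped, this JSON could be written as its registry file
    # so it resumes at the recorded offsets instead of re-reading from zero.
    print(json.dumps(sincedb_to_registry(sample), indent=2))
```

The idea would be to run something like this once, with both shippers stopped, before starting Filebeat for the first time.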