How to handle Logstash failures

I am using the following Logstash configuration to push log file content into Elasticsearch.

input {
  file {
    path => "/home/testuser/apps/logs/sample-collector-app.log"
    start_position => "beginning"
    type => "log"
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "testlogs"
  }
}

But I want to know how Logstash behaves in case of failure (i.e. Logstash goes down): how do I make sure that the next time I start Logstash, it won't push events that were already pushed?

My requirement is that Logstash should not push duplicate entries to Elasticsearch.

For example: there are 10 events in the log file, Logstash pushes 5 of them, and then it goes down. The next time I start Logstash, it should push starting from the 6th event, not from the beginning.

Logstash's file input does this automatically. It records how far it has read into each file in a "sincedb" file, and on restart it resumes from the recorded offset instead of re-reading the file. Note that start_position => "beginning" only applies to files the plugin has never seen before; for a file already tracked in sincedb, the saved position wins. The file input plugin's documentation describes this mechanism in detail.
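
If you want the position tracking to be explicit (for example, so it survives on a known persistent path), you can set sincedb_path yourself. Here is a minimal sketch of your config with that option added; the /var/lib/logstash/... path is just an illustrative location, any file Logstash can write to works:

input {
  file {
    path => "/home/testuser/apps/logs/sample-collector-app.log"
    # Only applies to files not yet recorded in sincedb;
    # tracked files always resume from their saved offset.
    start_position => "beginning"
    type => "log"
    # Illustrative path: where the read offset is persisted across restarts.
    sincedb_path => "/var/lib/logstash/sincedb-sample-collector"
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "testlogs"
  }
}

On restart, Logstash reads the offset from the sincedb file and continues where it left off; deleting that file forces a full re-read from the beginning. One caveat: the sincedb file is flushed periodically rather than after every event (see the sincedb_write_interval option), so after a hard crash a few events near the crash point may be sent again.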