Logstash forwarding events only on log file update

Hi,

I am new to ELK. I am using Logstash, Redis, and Elasticsearch. When I push events from the Logstash shipper to Redis and then to the Logstash indexer, querying Elasticsearch shows fewer events than the actual count. Later, if I update the input log file, the full set of events is added on top of the events already indexed, resulting in duplicate events. This happens only the first time the index is created.

For example, if my log file contains 1050 events, initially only 50 events are indexed; if I then add 1 event to the log file, I end up with 1101 events.
Following are my shipper and indexer configurations.
Shipper.conf:
input {
  file {
    path => "/root/ELK/Logs_sample/pps*"
    start_position => "beginning"
    type => "scheduling"
  }
}

output {
  redis {
    host => "10.197.10.10"
    data_type => "list"
    key => "logs_sample"
  }
}
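
One thing worth checking (a guess, not a confirmed diagnosis): the file input records how far it has read into each file in a sincedb file, and start_position => "beginning" only applies to files the plugin has never seen before. If the shipper was previously run against these files, Logstash resumes from the recorded offset, which can leave earlier events unindexed until the file grows. For repeatable testing only, the sincedb can be discarded:

```
input {
  file {
    path => "/root/ELK/Logs_sample/pps*"
    start_position => "beginning"
    type => "scheduling"
    # Testing only: pointing sincedb at /dev/null means no offsets
    # persist, so every restart re-reads the files from the beginning.
    # In production this would duplicate events on each restart.
    sincedb_path => "/dev/null"
  }
}
```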

Indexer.conf:

input {
  redis {
    host => "10.197.10.10"
    data_type => "list"
    key => "logs_sample"
  }
}

output {
  elasticsearch {
    type => "xxx"
    host => "10.10.10.10"
    port => "9200"
    index => "abc"
    protocol => "http"
  }
}

Any solution to this issue would be very helpful.

How are you adding events to the logfile?

Please edit your post and move it to the Logstash group. At this point the question is unrelated to Elasticsearch.

We generate those events manually.

What does that mean? Exactly how are you adding events to the file?

For testing purposes we run a simulator script that updates the file.

You are still not answering my question! How does the simulator script update the file? `echo "test log message" >> somefile.log`? Something else?

We have an executable Java JAR that generates the events. We just run it and the events get inserted into the file.
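
If the JAR rewrites the file rather than appending to it, that could explain the behaviour: the file input keys its read position on the file's identity (its inode), so a recreated file looks brand new and is read from the beginning, duplicating everything already shipped. A small sketch of the difference, using throwaway temp files rather than your actual logs (assumes GNU stat):

```shell
# Appending keeps a file's inode; recreating it produces a new inode.
# Logstash's file input tracks its position per inode, so a recreated
# file is treated as unseen and re-read from the start (duplicates).
LOG=$(mktemp)
echo "event 1" >> "$LOG"
inode1=$(stat -c %i "$LOG")

echo "event 2" >> "$LOG"              # append: inode unchanged
inode2=$(stat -c %i "$LOG")

mv "$LOG" "$LOG.old"                  # keep the old inode alive
echo "event 1" > "$LOG"               # recreate under the same name
inode3=$(stat -c %i "$LOG")

[ "$inode1" = "$inode2" ] && echo "append kept the inode"
[ "$inode1" != "$inode3" ] && echo "recreate changed the inode"
rm "$LOG" "$LOG.old"
```

If the JAR follows the recreate pattern, the shipper will re-ingest the whole file each time the simulator runs.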

Provided below are some sample logs:

2015/10/26 22:51:35.42 INFO : [dbdf895a3879d5] Received HTTP GET request: http://10.56.194.79:6060/pps/households/hh11/catalog
2015/10/26 22:51:35.42 ERROR: [5034671b1eea09f7e3dbdf895a3879d5] Failed processing getCatalog