Multiple logstash indexer/filter hosts causing duplicate elasticsearch documents

The setup:
logstash-forwarder (v0.4.0) -> logstash shipper (v1.5.4) -> redis (v2.8.19) -> logstash indexer (v1.5.4) -> elasticsearch (v1.7.5)

There's one logstash shipper.
The original setup had one logstash indexer, but I recently started two more to deal with bursts.

With one logstash indexer things were fine, but we would have situations where a large volume of logs would back up in Redis and the indexer could never catch up, leaving me no choice but to run a redis-cli flushdb and save the raw logs for the day when I figure out the capacity issue.

Last week I stood up two more logstash indexers with configurations identical to the first, using the redis input and shipping to elasticsearch.

At first all seemed to be going well, and it did not look like I was getting any triplicate records.
Today, though, I noticed evidence that some records are being triplicated, and others are seemingly being processed three times.

The evidence is a single document in ES that contains three comma-separated values in each of its fields, as if it had been processed three times. For example:
received_from: devel, devel, devel

Here's the input config for the logstash indexers:
input {
  redis {
    host => "10.10.1.140"
    port => 6379
    data_type => "list"
    key => "logstash"
  }
}
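
To help the indexers catch up, I've also been experimenting with batching on this input; per the logstash-input-redis docs there are batch_count and threads options (availability may depend on the plugin version). Here's the same input with those two knobs added; the values are untuned guesses:

input {
  redis {
    host => "10.10.1.140"
    port => 6379
    data_type => "list"
    key => "logstash"
    batch_count => 100   # pop up to 100 list entries per round trip to redis (untuned guess)
    threads => 2         # run two input threads on this indexer (untuned guess)
  }
}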

So, what should I do to ensure that each indexer node does not process a record that others have already processed?
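
One idea I've been weighing is to stop trying to coordinate the consumers and instead make the writes idempotent: derive the elasticsearch document id from the event content with the fingerprint filter, so the same record indexed more than once collapses onto a single document. A minimal, untested sketch; the source field, key, and host are placeholders:

filter {
  fingerprint {
    source => ["message"]   # field(s) that uniquely identify an event
    target => "fingerprint"
    method => "SHA1"
    key    => "dedupe"      # constant HMAC key; any string
  }
}

output {
  elasticsearch {
    host => "es.example.org"          # placeholder; newer output versions call this "hosts"
    document_id => "%{fingerprint}"   # re-indexing the same event overwrites rather than duplicates
  }
}

Hashing only message would collide for identical lines from different machines, so the source array would probably need the host and timestamp fields too.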

Thanks,
Terrac Skiens

Additional details, and an update

The records being triplicated were in JSON format, and I was able to stop the triplication by using the uuid filter in logstash. So, no new triplicates, but I've now got some cleanup to do with aggregations.
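
For reference, this is the general shape of what I did; the field name and output host here are placeholders. The shipper stamps each event once, before it goes into redis, and the indexers reuse that id when writing to elasticsearch, so a re-processed event overwrites itself instead of landing as a new document:

# on the shipper:
filter {
  uuid {
    target => "uuid"   # write a generated UUID into this field
  }
}

# on the indexers:
output {
  elasticsearch {
    host => "es.example.org"   # placeholder
    document_id => "%{uuid}"
  }
}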

The non-JSON records were not triplicated, but they have three values per field for the text-type fields that were processed with grok.
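
Those comma-joined values look like the same event passing through the filter stage more than once. Since logstash concatenates every file in its config directory and runs all filters against all events, I'm double-checking the indexers for stray duplicate filter files; wrapping the grok in a conditional would also limit the damage. A sketch, with a made-up "syslog" type:

filter {
  if [type] == "syslog" {   # only parse events this grok is meant for
    grok {
      match => [ "message", "%{SYSLOGLINE}" ]
    }
  }
}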

Any help or discussion on running multiple filter instances reading from a single redis host would be very welcome.

Thanks,
Terrac Skiens