Logstash es_bulk codec only processing last event

I'm trying to send a bulk index request through a Logstash pipeline using the http input with the es_bulk codec. The issue is that only the last event in the payload is passed through. Sending the same payload directly to Elasticsearch works fine.

Sample payload stored in data.json:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "field1" : "value2" }
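For context, here's a minimal sketch (plain Python, not the actual es_bulk codec internals) of how that payload should decode: each action/source line pair becomes one event, so this file should produce two events, not one.

```python
import json

# The same four lines stored in data.json, one JSON document per line
payload = (
    '{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }\n'
    '{ "field1" : "value1" }\n'
    '{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }\n'
    '{ "field1" : "value2" }\n'
)

events = []
lines = payload.splitlines()
# Pair each action line (even index) with its source line (odd index)
for action_line, source_line in zip(lines[0::2], lines[1::2]):
    action = json.loads(action_line)
    event = json.loads(source_line)
    event["@metadata"] = action["index"]  # action metadata rides along with the event
    events.append(event)

print(len(events))  # expect 2 events
```

If the codec only emits one event from this input, something is collapsing or dropping the earlier line pairs before they reach it.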

Sample logstash config:
input {
  http {
    port => 8080
    codec => es_bulk
  }
}
filter {}
output {
  file {
    path => "/tmp/logstash.txt"
  }
}

curl -i -H "Accept: application/json" -H "Content-Type:application/json" -X POST --data-binary "@data.json" localhost:8080/_bulk

Do you have \n on the end of the lines?

Yes, I'm using the --data-binary flag with a text file, so the newlines are preserved. From https://www.elastic.co/guide/en/elasticsearch/reference/2.3/docs-bulk.html: "If you're providing text file input to curl, you must use the --data-binary flag instead of plain -d. The latter doesn't preserve newlines." The same command works fine going straight to Elasticsearch, and I've captured the raw payload and everything looks right. I'm just not sure what's missing with the Logstash listener.
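For what it's worth, the newline issue the docs warn about can be simulated in memory (plain Python standing in for curl's behavior, not actual curl code): plain -d strips newlines, collapsing the payload into a single line, which would leave the codec with only one "event" to parse.

```python
# The bulk payload as --data-binary would send it: four newline-terminated lines
raw = (
    '{ "index" : { "_index" : "test", "_id" : "1" } }\n'
    '{ "field1" : "value1" }\n'
    '{ "index" : { "_index" : "test", "_id" : "2" } }\n'
    '{ "field1" : "value2" }\n'
)

# Roughly what plain -d does: newlines are stripped, lines run together
stripped = raw.replace("\n", "")

print(len(raw.splitlines()), len(stripped.splitlines()))  # 4 vs 1
```

Since --data-binary is being used here and the captured raw payload looks correct, that rules this particular failure mode out for the curl side.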