Elasticsearch output splits JSON content in two

Hi,
I'm working with Logstash and Elasticsearch.
I'm trying to insert a JSON document into an Elasticsearch index.
My output configuration is:
elasticsearch {
  action => "index"
  hosts => "54.215.249.17:9200"
  index => "tenant-1"
  document_type => "builds"
  document_id => "build-1"
  workers => 1
}
I added a file output as well.
In the file output I can see the JSON exactly as I passed it.
But when Logstash sends the HTTP call to Elasticsearch, I get a 400 error response:
{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Malformed action/metadata line [3], expected START_OBJECT or END_OBJECT but found [VALUE_STRING]"}],"type":"illegal_argument_exception","reason":"Malformed action/metadata line [3], expected START_OBJECT or END_OBJECT but found [VALUE_STRING]"},"status":400}
{:class=>"Elasticsearch::Transport::Transport::Errors::BadRequest", :backtrace=>[
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:146:in `__raise_transport_error'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:256:in `perform_request'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/client.rb:125:in `perform_request'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/actions/bulk.rb:87:in `bulk'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
"org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/common.rb:163:in `safe_bulk'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'",
"org/jruby/RubyArray.java:1653:in `each_slice'",
"/home/ubuntu/logstash-2.3.0/vendor/local_gems/378aba9b/logstash-output-elasticsearch-2.6.0-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/output_delegator.rb:114:in `multi_receive'",
"/home/ubuntu/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:305:
(removed some of the trace, the message was too long)

I ran tcpdump on port 9200 and saw that the JSON was split in two (a nested part was extracted).
The original JSON is:
{
  "data-raw": {
    "data": {
      "result": "SUCCESS",
      "id": "2016-01-01_01-01-01",
      "buildNum": 820
    }
  }
}
And what I get from the tcpdump is:
POST /_bulk HTTP/1.1
Connection: Keep-Alive
Content-Length: 157
Content-Type: text/plain; charset=ISO-8859-1
Host: 54.153.44.253:9200
User-Agent: Manticore 0.5.5
Accept-Encoding: gzip,deflate

{"index":{"_id":null,"_index":"tenant-1","_type":"builds","_routing":null}}
{"data-raw":{}}
{"result":"SUCCESS","buildNum":820,"id":"2016-01-01_01-01-01"}

You can see that the JSON was split into two parts, and it looks to me like this is the cause of the problem, but I can't find a solution.
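For comparison, here is what a well-formed _bulk body for this document would look like. The Bulk API expects newline-delimited JSON: one action/metadata line, then the entire source document on the very next line, with the nesting kept inline. This is a minimal Python sketch (not Logstash's own serialization code), reusing the index, type, and document_id from the configuration above:

```python
import json

# The document and bulk metadata from the post above (document_id "build-1").
doc = {
    "data-raw": {
        "data": {
            "result": "SUCCESS",
            "id": "2016-01-01_01-01-01",
            "buildNum": 820,
        }
    }
}
action = {"index": {"_index": "tenant-1", "_type": "builds", "_id": "build-1"}}

# json.dumps never emits literal newlines by default, so the nested
# "data-raw" object stays on the same line as the rest of the source.
bulk_body = json.dumps(action) + "\n" + json.dumps(doc) + "\n"

# A single index action should produce exactly two lines: metadata + source.
print(bulk_body.count("\n"))  # 2
```

In the tcpdump capture above the source document instead spans two lines, so Elasticsearch parses line 3 as a new action/metadata line and rejects it with exactly the "Malformed action/metadata line [3]" error shown earlier.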
In addition, if I remove the outer JSON level (the "data-raw" property):
{ "data": { "result": "SUCCESS", "id": "2016-01-01_01-01-01", "buildNum": 820 } }
the error doesn't reoccur.
Does anybody have an idea what the problem is and how to fix it?
Thanks,
Boaz