Error parsing JSON doc

Testing loading a JSON document with Logstash for the first time. I have Logstash 1.5.1 and Elasticsearch 1.6.0, and I get the error below. Any help is much appreciated.

Thanks in advance for taking a look.

json document:

{
  "SenderID": "",
  "ReceiverID": "",
  "DocTypeID": "MasterBillOfLading",
  "DocCount": 1,
  "Date": {
    "type": "generation",
    "Text": "01/26/2015 15:28"
  }
}

config file:

input {
  file {
    path => ["c:/test/mbol-header.json"]
    start_position => "beginning"
    type => "extract"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    host => "localhost"
    index => "dls"
    workers => 1
    document_type => "mbol"
  }
}

Error message (snippet):
{:timestamp=>"2015-09-18T15:51:13.505000-0500", :message=>"Trouble parsing json", :source=>"message", :raw=>"{\r", :exception=>#<LogStash::Json::ParserError: Unexpected end-of-input: expected close marker for OBJECT (from [Source: [B@4250bacb; line: 1, column: 0])
at [Source: [B@4250bacb; line: 2, column: 3]>, :level=>:warn}
{:timestamp=>"2015-09-18T15:51:13.511000-0500", :message=>"Trouble parsing json", :source=>"message", :raw=>"\r", :exception=>#<LogStash::Json::ParserError: No content to map due to end-of-input
at [Source: [B@312ae1ce; line: 1, column: 1]>, :level=>:warn}

Generally JSON is on a single line; if your message spans multiple lines like this, then you'll probably need to use the multiline filter as well.

Isn't the json codec supposed to support grabbing whole JSON documents from a stream, while json_lines assumes one document per line?

But yes, with the json filter you need to use a multiline filter or codec to join the lines of the document into a single message first.
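As a sketch of that, here is one way a multiline codec on the file input could join the pretty-printed document into a single event before the json filter parses it. The pattern assumes (as in the sample above) that each new document starts with "{" at the beginning of a line; adjust it to whatever actually delimits your documents:

```
input {
  file {
    path => ["c:/test/mbol-header.json"]
    start_position => "beginning"
    type => "extract"
    codec => multiline {
      # Any line that does NOT start a new document ("{" at column 0)
      # is appended to the previous event.
      pattern => "^\{"
      negate => true
      what => "previous"
    }
  }
}
```

One caveat: with this approach the last document in the file is only flushed when the next one begins, so a single-document file may appear to produce no event until more data arrives.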

Yep, good point.