Here's the debug log from Filebeat's publish:
{
  "@timestamp": "2017-07-06T07:12:12.629Z",
  "beat": {
    "hostname": "test",
    "name": "test",
    "version": "5.4.3"
  },
  "input_type": "log",
  "message": "2017-07-06 07:12:03.082  INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController    : Downloading Company List_Draft_HQ_v89.1_OD.xlsx_output.csv",
  "offset": 32218,
  "source": "test.log",
  "type": "log"
}
2017-07-06T07:12:12Z DBG Publish: {
  "@timestamp": "2017-07-06T07:12:12.629Z",
  "beat": {
    "hostname": "test",
    "name": "test",
    "version": "5.4.3"
  },
  "input_type": "log",
  "message": "2017-07-06 07:12:03.085  INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController    : Sending output response..",
  "offset": 32345,
  "source": "test.log",
  "type": "log"
}
However, on the Logstash side, I receive these errors:
[2017-07-06T07:12:13,806][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: 2017-07-06 07:12:03.082 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Downloading Company List_Draft_HQ_v89.1_OD.xlsx_output.csv; line: 1, column: 6]>, :data=>"2017-07-06 07:12:03.082 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Downloading Company List_Draft_HQ_v89.1_OD.xlsx_output.csv"}
[2017-07-06T07:12:13,828][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: 2017-07-06 07:12:03.085 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Sending output response..; line: 1, column: 6]>, :data=>"2017-07-06 07:12:03.085 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Sending output response.."}
[2017-07-06T07:12:13,841][DEBUG][logstash.pipeline ] filter received {"event"=>{"@timestamp"=>2017-07-06T07:12:12.629Z, "offset"=>32218, "@version"=>"1", "beat"=>{"hostname"=>"test", "name"=>"test", "version"=>"5.4.3"}, "input_type"=>"log", "host"=>"test", "source"=>"test.log", "message"=>"2017-07-06 07:12:03.082 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Downloading Company List_Draft_HQ_v89.1_OD.xlsx_output.csv", "type"=>"log", "tags"=>["_jsonparsefailure", "beats_input_codec_json_applied"]}}
[2017-07-06T07:12:13,843][DEBUG][logstash.pipeline ] filter received {"event"=>{"@timestamp"=>2017-07-06T07:12:12.629Z, "offset"=>32345, "@version"=>"1", "input_type"=>"log", "beat"=>{"hostname"=>"test", "name"=>"test", "version"=>"5.4.3"}, "host"=>"test", "source"=>"test.log", "message"=>"2017-07-06 07:12:03.085 INFO 26110 --- [nio-8090-exec-2] c.s.b.m.controller.DownloadController : Sending output response..", "type"=>"log", "tags"=>["_jsonparsefailure", "beats_input_codec_json_applied"]}}
Here is my Logstash config:
input {
  beats {
    port => "5044"
    host => "0.0.0.0"
    codec => "json"
  }
}
output {
  elasticsearch { hosts => ["10.40.11.130:9200"] }
  stdout { codec => json }
}
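In case it helps, here is what I was thinking of trying instead: dropping the json codec from the beats input (since the published message fields above look like plain text, not JSON) and parsing the lines with a grok filter. The grok pattern below is just my guess at the Spring Boot default log format, and the field names (log_timestamp, level, pid, thread, class, log_message) are placeholders I made up:

input {
  beats {
    port => "5044"
    host => "0.0.0.0"
    # no codec => "json" here -- the events carry plain-text log lines
  }
}
filter {
  grok {
    # attempt at matching "2017-07-06 07:12:03.082 INFO 26110 --- [thread] class : message"
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp}\s+%{LOGLEVEL:level}\s+%{POSINT:pid} --- \[%{DATA:thread}\] %{JAVACLASS:class}\s+: %{GREEDYDATA:log_message}" }
  }
}
output {
  elasticsearch { hosts => ["10.40.11.130:9200"] }
  stdout { codec => json }
}

Would something along these lines be the right direction?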
What am I missing? Should I add a filter?
Thanks!