I have the following multiline JSON (one JSON object per line):
"
{"businessId":"335","technicId":"41180883","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0664","sursa":"test.mbv","nr_linie":"391","mesaj":"MESSAGE PUT TIME IS 16033104"}
{"businessId":"335","technicId":"41180884","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0665","sursa":"test.mbk","nr_linie":"392","mesaj":"MESSAGE PUT DATE IS 20200902"}
{"businessId":"335","technicId":"11808853","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0665","sursa":"test.mbx","nr_linie":"401","mesaj":"ended with reason code 000000000 "}
"
I am using the following Logstash conf:
input {
  tcp {
    port => 9601
    codec => multiline {
      # any line that does not start with "{" is appended to the previous line
      pattern => "^{"
      negate => true
      what => "previous"
    }
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "next_index"
  }
  stdout {
    codec => rubydebug
  }
}
and I get the following error:
[WARN ] 2020-09-22 05:42:17.480 [[main]>worker7] json - Error parsing json {:source=>"message", :raw=>"52 <13>1 2020-09-22T05:41:45+03:00 bndikdev}, - - - \n263 <13>1 2020-09-22T05:41:45+03:00 bndikdev{"businessId" - - - "335","technicId":"41180883","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0897","sursa":"opnAAmq1.cbl","nr_linie":"940","mesaj":"TECHNICID = 41180883 "},\n232 <13>1 2020-09-22T05:41:45+03:00 bndikdev{"businessId" - - - "335","technicId":"41180883","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0897","sursa":"opnAAmq1.cbl","nr_linie":"941","mesaj":"MSGBRCH = 1000"},\n259 <13>1 2020-09-22T05:41:45+03:00 bndikdev {"businessId" - - - "335","technicId":"41180884","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0897","sursa":"opnAAmq1.cbl","nr_linie":"942","mesaj":"MSGTYPE = OPNAA "}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (byte)"52 <13>1 2020-09-22T05:41:45+03:00 bndikdev}, - - -
263 <13>1 2020-09-22T05:41:45+03:00 bndikdev{"businessId" - - - "335","technicId":"41180884","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0897","sursa":"opnAAmq1.cbl","nr_linie":"940","mesaj":"TECHNICID = 11808853 "},
232 <13>1 2020-09-22T05:41:45+03:00 bndikdev{"businessId" - - - "335","technicId":"11808853","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0897","sursa":""[truncated 320 bytes]; line: 1, column: 5]>}
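The :raw value in the error shows extra text in front of every JSON object (e.g. "52 <13>1 2020-09-22T05:41:45+03:00 bndikdev"), so the json filter is apparently not receiving the bare lines shown above. To check what actually arrives on the port, I can run a stripped-down pipeline with no filter at all (a minimal sketch, same port, stdout only):

input {
  tcp {
    port => 9601
    codec => plain
  }
}
output {
  # no filter: just print whatever the tcp input produces
  stdout {
    codec => rubydebug
  }
}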
I also tried to parse the multiline JSON using the following conf:
input {
  tcp {
    port => 9601
    codec => plain
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "next_index"
  }
  stdout {
    codec => rubydebug
  }
}
and I still get a JSON parse failure:
[WARN ] 2020-09-22 03:59:44.495 [[main]>worker0] json - Error parsing json {:source=>"message", :raw=>"250 <13>1 2020-09-22T03:59:44+03:00 bndikdev {"businessId" - - - "335","technicId":"641180883","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0856","sursa":"opnAAmq1.cbl","nr_linie":"370","mesaj":"alternative nespecificate",", :exception=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (byte)"250 <13>1 2020-09-22T03:59:44+03:00 bndikdev {"businessId" - - - "335","technicId":"641180883","teller":"progr_1 ","pid":"0024444930","timestamp":"16:03:31.0856","sursa":"opnAAmq1.cbl","nr_linie":"370","mesaj":" alternative nespecificate","; line: 1, column: 6]>}
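What am I doing wrong here? Would a codec that treats each incoming line as a separate JSON document be the right approach instead, something like this (untested sketch with the json_lines codec on the same port)?

input {
  tcp {
    port => 9601
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "next_index"
  }
  stdout {
    codec => rubydebug
  }
}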