Parsing issues of JSON within a syslog event

I have a list of syslog events in a file; each line starts with a syslog-style header, but the remainder is in JSON format, and I'm unable to find a way to parse it properly.

Syslog message format (some events carry one entry in `values`, others carry two):

11/25/2018 4:09:40 PM> Device: [Device101], Data:[{"timestamp":1543154978238,"values":[{"id":"Simulation Examples.Functions.Type1","v":95,"q":true,"t":1543154918245}]}]

11/25/2018 4:09:40 PM> Device: [Device101], Data:[{"timestamp":1543154978472,"values":[{"id":"Simulation Examples.Functions.Type1","v":43,"q":true,"t":1543154978253},{"id":"Simulation Examples.Functions.Type2","v":7.84764343E-013,"q":true,"t":1543154978253}]}]
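Assuming the lines always follow the `Device: [...], Data:[...]` layout above, the trailing payload should be plain JSON once the header is stripped. A quick way to confirm that outside Logstash is a short Python sketch (the regex and variable names here are just illustrative):

```python
import json
import re

line = ('11/25/2018 4:09:40 PM> Device: [Device101], Data:'
        '[{"timestamp":1543154978238,"values":[{"id":"Simulation Examples'
        '.Functions.Type1","v":95,"q":true,"t":1543154918245}]}]')

# Split the syslog-style header from the JSON payload.
m = re.match(r'(?P<date>.+?)> Device: \[(?P<device>[^\]]+)\], Data:(?P<json>.+)', line)

# The remainder parses as a JSON array of objects.
payload = json.loads(m.group('json'))

print(m.group('device'))             # Device101
print(payload[0]['values'][0]['v'])  # 95
```

If this parses cleanly, the JSON itself is fine and the problem is in how the field reaches the `json` filter.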

I tried the following Logstash configuration with a grok filter to strip the syslog header fields and parse the remainder as JSON, but the JSON part is not being parsed properly by the json filter:

input {
  file {
    path => ["/var/log/test.log"]
    sincedb_path => "/var/log/logstash/sincedb"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Literal square brackets must be escaped in a grok pattern.
    match => [ "message", "%{DATA:DATE}> Device: \[%{DATA:DeviceName}\], Data:%{GREEDYDATA:jsontoparse}" ]
  }
  json {
    source => "jsontoparse"
    target => "jsontoparse"
  }
}

output {
  elasticsearch {
    index => "scada-%{+YYYY.MM.dd}"
    hosts => ["localhost:9200"]
  }
}
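Since some events carry more than one entry in `values`, once the json filter works it may also help to emit one document per reading. A sketch with the split filter, assuming the parsed array ends up under `jsontoparse`:

```
filter {
  split {
    field => "[jsontoparse]"
  }
  split {
    field => "[jsontoparse][values]"
  }
}
```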

In the Grok Debugger, I am getting proper grok parsing, as follows:

{
  "DATE": "11/25/2018 4:09:40 PM",
  "jsontoparse": "[{\"timestamp\":1543154978238,\"values\":[{\"id\":\"Simulation Examples.Functions.Type1\",\"v\":95,\"q\":true,\"t\":1543154918245}]}]",
  "DeviceName": "Device101"
}

In Kibana, the JSON part shows up as in the attached picture!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.