Parsing Azure JSON multiline logs with Filebeat

Hello,
Please advise: is it possible to use Filebeat to parse multiline JSON logs with comma delimiters and send them to Elasticsearch?

The JSON files are structured as follows:

{
"time": "2017-02-28T12:04:25.7263914Z",
"resourceId": "/SUBSCRIPTIONS/...../RESOURCEGROUPS/....",
"operationName": "MICROSOFT.RESOURCES/SUBSCRIPTIONS/RESOURCEGROUPS/DELETE",
"category": "Delete",
"resultType": "Start",
"resultSignature": "Started.",
"durationMs": 0,
"callerIpAddress": "....",
"correlationId": "....",
"identity": {"authorization":{"scope":"....}},
"level": "Information",
"location": "global"
}
,
{
"time": "2017-02-28T12:04:27.3201073Z",
"resourceId": "/SUBSCRIPTIONS/.../RESOURCEGROUPS/...",
"operationName": "MICROSOFT.RESOURCES/SUBSCRIPTIONS/RESOURCEGROUPS/DELETE",
"category": "Delete",
"resultType": "Accept",
"resultSignature": "Accepted.Accepted",
"durationMs": 1586,
"callerIpAddress": "....",
"correlationId": "....",
"identity": {"authorization":{"..."}},
"level": "Information",
"location": "global",
"properties": {"statusCode":"Accepted","serviceRequestId":null}
}
,
and so on.

Filebeat generates errors all the time:

2017-02-28T15:29:30Z ERR Error decoding JSON: unexpected EOF
2017-02-28T15:29:30Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2017-02-28T15:29:30Z ERR Error decoding JSON: invalid character ',' looking for beginning of value

The above is currently not possible in Filebeat. That is why I recommended Logstash (LS), since there you can potentially do some more processing: Use logstash or filebeat for sending azure JSON logs?
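
If you would rather stay with Filebeat, one workaround is to pre-process the export before Filebeat reads it: rewrite the comma-separated objects as newline-delimited JSON (one object per line), which Filebeat's json.* decoding can handle. Below is a minimal sketch in Python, assuming the export looks like the sample above and that you can run a step before shipping; the file names are placeholders:

import json
from pathlib import Path

# Hypothetical paths -- adjust to wherever the Azure export lands
# and wherever Filebeat is configured to read from.
src = Path("azure-export.json")
dst = Path("azure-export.ndjson")

# The export is a comma-separated sequence of JSON objects, so wrapping
# it in [ ... ] (after dropping any trailing comma) turns it into a
# valid JSON array that can be parsed in one go.
raw = src.read_text(encoding="utf-8").strip().rstrip(",")
records = json.loads("[" + raw + "]")

# Write newline-delimited JSON: one object per line, which Filebeat's
# json options (e.g. json.keys_under_root) can decode line by line.
with dst.open("w", encoding="utf-8") as out:
    for record in records:
        out.write(json.dumps(record) + "\n")

Filebeat can then tail the resulting file with the json options instead of multiline settings. That said, Logstash remains the simpler route if you need further processing of these events.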
