Logstash/Filebeat pipeline to ingest JSON logs - parse array

Hi,

I'm trying to ingest JSON log files using Filebeat and Logstash. In the log file I have transactions with the following format.

I need to parse the logs and send a small set of fields from each transaction to Elasticsearch, but "legs" is an array of two JSON objects and I need to get some of the fields inside it. For example, I need to get the content of legs[0].status and add it as a new field.

I'm using the json filter in Logstash. So far the pipeline is working, but I'm not able to capture the content inside the array as mentioned.
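Roughly what my filter section looks like right now (a simplified sketch, only the relevant part):

filter {
  json {
    source => "message"
  }
  # this works for the top-level fields, but I haven't found a way to
  # reach into the legs array from here
}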

Transaction logs:
{
  "type": "transaction",
  "time": 1548106086406,
  "path": "/ri/ticketing/v2/tickets/{ticketId}",
  "protocol": "http",
  "protocolSrc": "8080",
  "duration": 29,
  "status": "success",
  "serviceContexts": [
    {
      "service": "ticketManagement",
      "monitor": true,
      "client": "a41c329e-5207-4970-a144-1a9aefcf8935",
      "org": null,
      "app": null,
      "method": "ticket",
      "status": "success",
      "duration": 29
    }
  ],
  "customMsgAtts": {},
  "correlationId": "6639465c89f30d6686821faa",
  "legs": [
    {
      "uri": "/ri/ticketing/v2/tickets/9152650204213262392",
      "status": 200,
      "statustext": "OK",
      "method": "GET",
      "vhost": null,
      "wafStatus": 0,
      "bytesSent": 2850,
      "bytesReceived": 954,
      "remoteName": "127.0.0.1",
      "remoteAddr": "127.0.0.1",
      "localAddr": "127.0.0.1",
      "remotePort": "60704",
      "localPort": "8080",
      "sslsubject": null,
      "leg": 0,
      "timestamp": 1548106086377,
      "duration": 29,
      "serviceName": "ticketManagement",
      "subject": "a41c329e-5207-4970-a144-1a9aefcf8935",
      "operation": "ticket",
      "type": "http",
      "finalStatus": "Pass"
    },
    {
      "uri": "/api/v1/troubleTicket/9152650204213262392",
      "status": 200,
      "statustext": "",
      "method": "GET",
      "vhost": null,
      "wafStatus": 0,
      "bytesSent": 1121,
      "bytesReceived": 2865,
      "remoteName": "tbapi.com.uy",
      "remoteAddr": "10.24.135.51",
      "localAddr": "10.24.34.132",
      "remotePort": "8002",
      "localPort": "44820",
      "sslsubject": "/OU=Domain Control Validated/CN=*.com",
      "leg": 1,
      "timestamp": 1548106086377,
      "duration": 25,
      "serviceName": "ticketManagement",
      "subject": "a41c329e-5207-4970-a144-1a9aefcf8935",
      "operation": "ticket",
      "type": "http",
      "finalStatus": null
    }
  ]
}

I accidentally posted this to the wrong thread. Let's try that again ...

mutate { add_field => { "foo" => "%{[legs][0][status]}" } }

will work. Note that foo will be a string, and you will have to convert it to an integer if that's what you need.
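For example, something along these lines in the filter section (the field name foo is just a placeholder; the second mutate does the string-to-integer conversion):

filter {
  # copy the status of the first leg into its own field
  mutate {
    add_field => { "foo" => "%{[legs][0][status]}" }
  }
  # convert the copied value to an integer in a separate mutate so it
  # runs after the field has been added
  mutate {
    convert => { "foo" => "integer" }
  }
}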

Hi, I was able to solve it using the following setup:

Note that it will split the transaction log into two transaction logs, one per leg, which is useful for my purposes.

input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
  split {
    field => "legs"
  }
  mutate {
    add_field => {
      "legid" => "%{[legs][leg]}"
    }
    remove_field => [ "[message]" ]
  }
  .....
}
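Since %{[legs][leg]} is copied as a string, I also convert legid afterwards when it needs to be numeric in Elasticsearch (a sketch; adjust the target type to whatever the index mapping expects):

mutate {
  convert => { "legid" => "integer" }
}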

Thanks for your reply!
