Hi @stephenb - Thank you.
Yes, I am referring to the new Elastic Serverless Forwarder. I am familiar with that documentation, and it doesn't seem to address what I'm looking for.
That documentation states that the forwarder is able to automatically discover JSON content in the payload. I am seeing that it does discover the JSON, but it keeps the entire JSON payload in the `message` field and doesn't split it out into individual JSON events.
I currently have the forwarder hooked up to a Kinesis data stream (fed by a CloudWatch Logs subscription filter), and here is an example of an ingested JSON document containing an application error message:
```json
{
  "messageType": "DATA_MESSAGE",
  "owner": "account_id",
  "logGroup": "/aws/lambda/lambda_name",
  "logStream": "2022/11/18/[$LATEST]xxx",
  "subscriptionFilters": ["test_lambda_filter"],
  "logEvents": [{
    "id": "37215743013839263",
    "timestamp": 1668811633088,
    "message": "START RequestId: ec57f841-e70b Version: $LATEST\n"
  }, {
    "id": "3721574308921578",
    "timestamp": 1668811636468,
    "message": "{\"Timestamp\":\"2022-11-18T22:47:16.3716425+00:00\",\"Level\":\"Information\",\"MessageTemplate\":\"Found credentials using the AWS SDK's default credential search\",\"Message\":\"Found credentials using the AWS SDK's default credential search\",\"SourceContext\":\"AWSSDK\",\"Environment\":\"Development\"}\n"
  }, {
    "id": "3721574313024915",
    "timestamp": 1668811638308,
    "message": "System.ArgumentException: Tenant Not Found: Going away\n at ID.Core.Data.IO.TenantContext.FetchTenantInfo(String orgId)\n at ID.Core.Lambda.LambdaBase.GetTenantInformationFromCatalog(String orgId)\n)"
  }]
}
```
I would like this document to be broken down into individual JSON fields. In the past, Functionbeat with the `decode_json_fields` processor handled all of this: after filtering on a specific field, it would break the payload up into a document with its respective fields.
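For context, the Functionbeat processor configuration I was using looked roughly like this (a sketch from memory; the exact option values may have differed in my setup):

```yaml
# Functionbeat processors section (sketch): decode the JSON string held
# in the message field and merge its keys into the document root.
processors:
  - decode_json_fields:
      fields: ["message"]   # field(s) containing the serialized JSON
      target: ""            # empty string = write decoded keys to the root
      process_array: false
      max_depth: 1
      overwrite_keys: true
      add_error_key: true
```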
I have tried using `expand_event_list_from_field` with the forwarder, pointing it at the `logEvents` field of this document. That does narrow the document down to just the contents of the `logEvents` list, but it still doesn't break it down into individual JSON events/Elasticsearch fields. The relevant part of my forwarder config is sketched below.
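This is approximately what I have in the forwarder's config.yaml (the ARN, URL, and credentials here are placeholders, and option names may vary slightly by forwarder version):

```yaml
# Elastic Serverless Forwarder config.yaml (sketch, placeholder values)
inputs:
  - type: "kinesis-data-stream"
    id: "arn:aws:kinesis:us-east-1:123456789012:stream/my-log-stream"  # placeholder ARN
    expand_event_list_from_field: "logEvents"
    outputs:
      - type: "elasticsearch"
        args:
          elasticsearch_url: "https://my-deployment.es.io:9243"  # placeholder URL
          api_key: "<redacted>"
          es_datastream_name: "logs-aws.generic-default"
```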
So: does something like the `decode_json_fields` processor exist for the Elastic Serverless Forwarder that I could point at `logEvents` to end up with multiple key:value fields in Elasticsearch?
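To make the goal concrete: from the second `logEvents` entry above, I'd want the forwarder to produce an Elasticsearch document whose fields look roughly like this (hand-written here to illustrate the shape I'm after, not actual output):

```json
{
  "Timestamp": "2022-11-18T22:47:16.3716425+00:00",
  "Level": "Information",
  "MessageTemplate": "Found credentials using the AWS SDK's default credential search",
  "Message": "Found credentials using the AWS SDK's default credential search",
  "SourceContext": "AWSSDK",
  "Environment": "Development"
}
```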