Logstash and AWS Cloudtrail

Help please. It appears that AWS CloudTrail puts multiple log records under one top-level field, like this (sample truncated):

"Records": [
			"eventName": "AssumeRole",
			"requestParameters": {
				"durationSeconds": 1500,
				"roleSessionName": "ElasticMapReduceSession",
				"roleArn": "arn:aws:iam::11111111:role/EMR_DefaultRole"
			"eventTime": "2023-05-20T19:18:27Z",
			"eventID": "5c09b4de-8447-4776-b6e9-af6b4fdb1a02",
			"sourceIPAddress": "elasticmapreduce.amazonaws.com",
			"eventSource": "sts.amazonaws.com",
			"sharedEventID": "e9194403-2ab4-4151-9eb4-23f92f606197"
			"eventName": "GetBucketAcl",
			"requestParameters": {
				"acl": "",
				"bucketName": "bucket",
				"Host": "bucket.s3.us-east-2.amazonaws.com"
			"resources": [
					"accountId": "111111111",
					"type": "AWS::S3::Bucket",
					"ARN": "arn:aws:s3:::bucket"
			"eventTime": "2023-05-20T19:18:30Z",
			"eventID": "98ffe5eb-b9f0-4767-8cb0-7cc2554e1fce",

This is how the Logstash S3 input with the cloudtrail plugin reads it in as well, and as far as I can tell it doesn't handle the fact that there are 2+ disparate log entries in this one record. It treats everything under the top-level [Records] field as one log, when it's not. Surely someone has solved for this, right?
What needs to happen is that the AssumeRole record should become a separate event from the GetBucketAcl record. How can I make that happen?

Perhaps take a look at this from our resident logstash expert

And here is split

LOL, that was absurdly simpler than I had been thinking.

Though one important thing is to delete event.original, else it is copied to every split entry, which makes the output huge.

split {
    field => "Records"
    remove_field => ["[event][original]"]
}

It would be cheaper to do the mutate+remove_field before the split, so the field is dropped once rather than from every split event. Relatedly, fixing the split filter to overwrite the source with the target was a significant performance enhancement :smile:
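For reference, a minimal sketch of that ordering (untested; assumes the [event][original] and Records field names from the example above):

```
filter {
  # Drop the large original payload once, before the array is split,
  # so it is not copied into every resulting event.
  mutate {
    remove_field => ["[event][original]"]
  }

  # Emit one event per element of the Records array.
  split {
    field => "Records"
  }
}
```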