Logstash failing to ingest a multi-line JSON log file... I think

OK. I was able to figure out several things:

First, I am having two problems at once, which makes learning a new thing rather difficult. The default behavior of Logstash is to treat each line as a separate entry. And this part of the JSON is not parsing:

  "metrics": [
    {
      "MetricName": "ResourceCount",
      "Timestamp": "2021-12-06T11:29:48.934903",
      "Value": 0,
      "Unit": "Count"
    },
    {
      "MetricName": "ResourceTime",
      "Timestamp": "2021-12-06T11:29:48.934920",
      "Value": 0.8265008926391602,
      "Unit": "Seconds"
    }
  ]

I was able to solve the multiline thing with the help of this page: Parsing array of json objects with logstash and injesting to elastic

I changed my input to:

file {
    start_position => "beginning"
    path => "/etc/logstash/sample/cctest1.log"
    sincedb_path => "/dev/null"
    codec => multiline {
        pattern => "^({|\[)\s*$"
        negate => true
        auto_flush_interval => 1
        multiline_tag => ""
        what => "previous"
    }
}

So a line that contains only a "{" or a "[" (with possibly some whitespace after it) will start a new entry.

Even with that, it still will not parse the JSON. I figure I need to do a "split", but I guess I am not understanding that well enough.

The problem looks a lot like the one I referenced above, but "split { field => "someField" }" makes no sense to me.
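
Rereading that answer, my best guess is that "someField" is just the name of the array field you want to break apart, which in my case would be "metrics" once the whole document has been parsed as JSON. So I think my filter section would need to look something like this (untested, just my reading of the docs, with the field name taken from my log above):

filter {
    # Parse the whole multiline event (one JSON document in "message")
    # into fields on the event; "metrics" becomes a nested array field.
    json {
        source => "message"
    }

    # Emit one event per element of the "metrics" array instead of a
    # single event that holds the whole array.
    split {
        field => "metrics"
    }
}

If I understand it right, the json filter turns the document into fields and split then creates one event per entry in the metrics array. Is that roughly what it should look like?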