How can I use the multiline codec for this log?

Log file -

25.08.2024 13:28:50.972 *DEBUG* [127.0.0.1 [1724326130527] GET /content/sample/global/locations.html HTTP/1.1] com.custom.global.core.models.impl.LocationCFListImpl damContentFragment name : em
25.08.2024  10:14:03.157 [pod-1] *WARN* [79.140.117.35 [1724408043138] POST /content/core-components/us/en.qbank-upload.json HTTP/1.1] org.apache.sling.servlets.post.impl.SlingPostServlet Exception while handling POST on path [/content/core-components/us/en.qbank-upload.json] with operation [org.apache.sling.servlets.post.impl.operations.ModifyOperation]
org.apache.sling.api.resource.PersistenceException: Unable to create node at /content/core-components/us/en.qbank-upload.json
    at org.apache.sling.jcr.JcrResourceProvider.create(JcrResourceProvider.java:487) [org.apache.sling.jcr.resource:3.3.2]
    at org.apache.sling.impl.AuthenticatedResourceProvider.create(AuthenticatedResourceProvider.java:201)
    at org.apache.sling.servlets.post.impl.AbstractPostOperation.run(AbstractPostOperation.java:103) [org.apache.sling.servlets.post:2.6.0]
25.08.2024 13:28:50.982 *DEBUG* [127.0.0.1 [1724326130527] GET /content/sample/global/locations.html HTTP/1.1] com.custom.global.core.models.impl.LocationCFListImpl Size : 1

And the Logstash config -

input {
        file {
              path => "/usr/share/logstash/data/logs/**/*.log"
              start_position => "beginning"
              sincedb_path => "/dev/null"
              codec => multiline {
                        pattern => "(^d+serror)|(^.+Exception: .+)|(^\s+at .+)|(^\s+... d+ more)|(^\s*Caused by:.+)"
                        #negate => true
                        what => "previous"
                      }
        }
}

filter {
  grok {
    match => {
      "message" => [
       "%{DATESTAMP:log_date} \[%{DATA:pod_id}\] \*%{LOGLEVEL:log_level}\* \[%{DATA:thread_name}\] %{JAVACLASS:logger} %{GREEDYDATA:message}(?:\n%{GREEDYDATA:stack_trace})?",
       "%{DATESTAMP:log_date} \*%{LOGLEVEL:loglevel}\* \[%{IP:client_ip} \[%{NUMBER:session_id}\] %{WORD:method} %{URIPATH:request_path}\ HTTP/%{NUMBER:http_version}\] %{JAVACLASS:logger} %{GREEDYDATA:message}"
       ]
    }
  }
  date {
    match => ["log_date", "dd.MM.yyyy HH:mm:ss.SSS"]
    target => "@timestamp"
    timezone => "Europe/Berlin"  # Adjust this to your timezone if necessary
  }
  mutate {
    remove_field => ["timestamp"]
  }
}
output {
        stdout { codec => rubydebug }
        elasticsearch {
                hosts => "elasticsearch:9200"
                index => "test-app-logs"
        }
}


Sorry, I cannot make it work. In the output, only one event is shown instead of 3 events.

 "message": [
    "25.08.2024 13:28:50.972 *DEBUG* [127.0.0.1 [1724326130527] GET /content/sample/global/locations.html HTTP/1.1] com.custom.global.core.models.impl.LocationCFListImpl damContentFragment name : em",
    "damContentFragment name : em"
  ]

And if I modify a log entry manually while the application is running in Docker, let's say replace org.apache with org.hello, then I see 3 events, but the message data is not correct: one of the events combines the message data of another event.
Please suggest.

You might do better with

 codec => multiline {
      auto_flush_interval => 2 
      pattern => "^\d{2}.\d{2}.\d{4}\s+\d{2}:\d{2}:\d{2}" 
      negate => true 
      what => previous 
 }
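
With negate => true, any line that does not start with a dd.MM.yyyy HH:mm:ss timestamp (the Exception and "at ..." lines) is appended to the previous event, and auto_flush_interval => 2 flushes the last buffered event after two seconds instead of holding it until another line arrives. As a rough sketch only, keeping the path, start_position and sincedb settings from your original config, the input block would look like:

input {
        file {
              path => "/usr/share/logstash/data/logs/**/*.log"
              start_position => "beginning"
              sincedb_path => "/dev/null"
              codec => multiline {
                        # a new event begins with a "dd.MM.yyyy HH:mm:ss" timestamp
                        pattern => "^\d{2}.\d{2}.\d{4}\s+\d{2}:\d{2}:\d{2}"
                        # lines NOT matching the timestamp are joined to the previous event
                        negate => true
                        what => "previous"
                        # emit the last buffered event after 2 seconds of no new lines
                        auto_flush_interval => 2
                      }
        }
}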

Thank you for the help!
Now the result is better after this change: it shows 2 events, but I still see "_grokparsefailure" on the second log entry.

DATESTAMP only allows a single hyphen or space between the date and time. 25.08.2024  10:14:03.157 has two spaces, not one. You could add

 pattern_definitions => { "MYDATESTAMP" => "%{DATE}(-| +)%{TIME}" }

to the grok options and reference that instead of DATESTAMP in your patterns.
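
For example (just a sketch, reusing the field names from your existing filter and changing only the timestamp pattern), the grok block would become:

filter {
  grok {
    # custom pattern that allows a hyphen or one or more spaces between date and time
    pattern_definitions => { "MYDATESTAMP" => "%{DATE}(-| +)%{TIME}" }
    match => {
      "message" => [
       "%{MYDATESTAMP:log_date} \[%{DATA:pod_id}\] \*%{LOGLEVEL:log_level}\* \[%{DATA:thread_name}\] %{JAVACLASS:logger} %{GREEDYDATA:message}(?:\n%{GREEDYDATA:stack_trace})?",
       "%{MYDATESTAMP:log_date} \*%{LOGLEVEL:loglevel}\* \[%{IP:client_ip} \[%{NUMBER:session_id}\] %{WORD:method} %{URIPATH:request_path}\ HTTP/%{NUMBER:http_version}\] %{JAVACLASS:logger} %{GREEDYDATA:message}"
       ]
    }
  }
}

(You may also want to use the same field name, e.g. log_level, in both patterns.)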