Need help with grokking complex logfile

Hello! If anyone could provide assistance with setting up the correct grok filter for a complex logfile like this:

======== Query 1 of 2 ========
/* [User: 239] events :: edit */ SELECT `Attribute`.`type`, `Attribute`.`category`, `Attribute`.`object_relation`, (CASE WHEN `Attribute`.`value2` = '' THEN `Attribute`.`value1` ELSE CONCAT(`Attribute`.`value1`, '|', `Attribute`.`value2`) END) AS `Attribute__value`, `Attribute`.`object_id` FROM `misp`.`attributes` AS `Attribute` WHERE `Attribute`.`deleted` = 0 AND `Attribute`.`object_id` IN ();
---- Occurred 3 times ----
# Time: 2025-02-18T04:59:14.453682Z | # Query_time: 593.916331  Lock_time: 0.888101 Rows_sent: 8446729  Rows_examined: 339805554
# Time: 2025-02-18T11:01:00.940948Z | # Query_time: 623.546214  Lock_time: 0.894550 Rows_sent: 8451339  Rows_examined: 340046530
# Time: 2025-02-18T12:54:00.932372Z | # Query_time: 605.223335  Lock_time: 0.884287 Rows_sent: 8452729  Rows_examined: 340108469

======== Query 2 of 2 ========
/* [User: 98] objects :: restSearch */ SELECT `Attribute`.`id`, `Attribute`.`event_id`, `Attribute`.`object_id`, `Attribute`.`object_relation`, `Attribute`.`category`, `Attribute`.`type`, `Attribute`.`value1`, `Attribute`.`value2`, `Attribute`.`to_ids`, `Attribute`.`uuid`, `Attribute`.`timestamp`, `Attribute`.`distribution`, `Attribute`.`sharing_group_id`, `Attribute`.`comment`, `Attribute`.`deleted`, `Attribute`.`disable_correlation`, `Attribute`.`first_seen`, `Attribute`.`last_seen`, (CASE WHEN `Attribute`.`value2` = '' THEN `Attribute`.`value1` ELSE CONCAT(`Attribute`.`value1`, '|', `Attribute`.`value2`) END) AS `Attribute__value` FROM `misp`.`attributes` AS `Attribute` WHERE `Attribute`.`deleted` = 0 AND `Attribute`.`object_id` IN ();
---- Occurred 2 times ----
# Time: 2025-02-18T11:12:03.184746Z | # Query_time: 639.512324  Lock_time: 0.320843 Rows_sent: 5719499  Rows_examined: 340050736
# Time: 2025-02-18T11:27:48.583866Z | # Query_time: 677.457924  Lock_time: 0.494661 Rows_sent: 8186543  Rows_examined: 340053313

that would be very much appreciated. I have tried variations of a .conf like this:

input {
  file {
    path => "path/file*"
    start_position => "beginning"
    sincedb_path => "path/sincedb"
    codec => multiline {
      pattern => "^======== Query"
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
  }
}

filter {
  grok {
    match => { "message" => "(?m)^%{GREEDYDATA:whole_log}$" }
  }
}

output {
  elasticsearch {
    hosts => ["host"]
    index => "sample"
    user => "username"
    password => "password"
    ssl => true
    cacert => "path/ca.crt"
  }
}

All I want is to have the logfile as a single message. Currently, Logstash is parsing it into one event per newline.

You don't need grok at all then. When I try with that logfile and that multiline codec, I get two events (one per query block), not one per line.

Try adding output { stdout { codec => rubydebug { } } } and show us what the events look like.
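If it turns out you do want structured fields rather than one big message, you could grok the header of each multiline block. This is only a sketch and untested against your data; the field names (query_num, user_id, scope, action) are my own choices, and the SQL text and the "# Time:" lines would stay unparsed in [message]:

```
filter {
  grok {
    # (?m) lets the pattern span the multiline event.
    # Matches e.g.: "======== Query 1 of 2 ========" followed by
    # "/* [User: 239] events :: edit */ SELECT ..."
    match => {
      "message" => "(?m)^======== Query %{INT:query_num:int} of %{INT:query_total:int} ========\s*/\* \[User: %{INT:user_id:int}\] %{DATA:scope} :: %{WORD:action} \*/"
    }
  }
}
```

The ":int" suffixes convert the captured strings to integers so they index as numbers in Elasticsearch instead of text.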