Logstash doesn't parse message into separate parts

I'm trying to parse my error.log file through Logstash into Elasticsearch, but the message shows up in Kibana as one whole field instead of being split into parts. I am not sure if my Logstash conf file is wrong or if the error is elsewhere.
Logstash .conf file

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => {
      "message" => "\[%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} %{YEAR:year}\] \[%{WORD:log_type}:%{LOGLEVEL:log_level}\] \[pid %{NUMBER:pid}\] (?:\[client %{IP:client_ip}:%{NUMBER:client_port}\] ){0,1}%{GREEDYDATA:message}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "password"
  }
}

Please provide examples of the messages you are trying to parse. Do the documents in Elasticsearch have a _grokparsefailure tag?

The messages are currently going into the filebeat-* index instead of through Logstash. I had previously connected Filebeat directly to Elasticsearch.
Message example

[Sun Nov 03 18:31:28.085085 2024] [security2:error] [pid 543501] [client 127.0.0.1:43240] ModSecurity: Warning. Invalid URL Encoding: Non-hexadecimal digits used at TX:0. [file "/usr/share/modsecurity-crs/rules/REQUEST-920-PROTOCOL-ENFORCEMENT.conf"] [line "427"] [id "920221"] [msg "URL Encoding Abuse Attack Attempt"] [data ""] [severity "CRITICAL"] [ver "OWASP_CRS/4.9.0-dev"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-protocol"] [tag "paranoia-level/1"] [tag "OWASP_CRS"] [tag "capec/1000/255/153/267/72"] [hostname "localhost"] [uri "/DVWA/vulnerabilities/sqli/"] [unique_id "ZydRALr8WrdfkM5o6WVj4AAAAAI"], referer: http://localhost/DVWA/vulnerabilities/sqli/

That sounds like your events are still going directly from Filebeat to Elasticsearch.

The grok works if the message goes through that pipeline. I would suggest adding overwrite => [ "message" ] to it. If you do not, then [message] will be an array, with one entry being the original log line and the other being whatever matches the trailing GREEDYDATA in the grok pattern. Having an array like that is unlikely to be useful.
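
For illustration, this is the grok filter from the original config with that option added; the pattern itself is unchanged:

    filter {
      grok {
        match => {
          "message" => "\[%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} %{YEAR:year}\] \[%{WORD:log_type}:%{LOGLEVEL:log_level}\] \[pid %{NUMBER:pid}\] (?:\[client %{IP:client_ip}:%{NUMBER:client_port}\] ){0,1}%{GREEDYDATA:message}"
        }
        # Replace the original [message] field with the GREEDYDATA capture
        # instead of keeping both values in a two-element array
        overwrite => [ "message" ]
      }
    }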

If you want to parse more of [message] out into separate fields, then I would suggest using a ruby filter

    ruby {
        code => '
            # Collect every [key "value"] pair in the message
            matches = event.get("message")&.scan(/ \[(\w+) "([^"]+)"\]/)
            # &.each avoids a _rubyexception tag if [message] is missing
            matches&.each { |k, v|
                new_key = "[stuff][#{k}]"
                if event.include?(new_key)
                    # Repeated keys (e.g. [tag ...]) accumulate into an array
                    a = Array(event.get(new_key))
                    a << v
                    event.set(new_key, a)
                else
                    event.set(new_key, v)
                end
            }
        '
    }

which will produce

      "stuff" => {
           "id" => "920221",
     "severity" => "CRITICAL",
          "msg" => "URL Encoding Abuse Attack Attempt",
          "tag" => [
        [0] "application-multi",
        [1] "language-multi",
        [2] "platform-multi",

etc.

Does this mean that there is an error in my Logstash config, and that is why the logs are bypassing Logstash?

No, Filebeat will send logs to either Logstash or Elasticsearch. If it is sending them to Elasticsearch, it is because its configuration tells it to do so. You may not be running the Filebeat configuration you think you are.
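
As a minimal sketch, the output section of filebeat.yml would need to point at the beats input of the pipeline above (assuming Logstash is listening on localhost:5044, as in that input):

    # filebeat.yml -- route events to Logstash rather than Elasticsearch.
    # Filebeat allows only one output to be enabled, so any
    # output.elasticsearch section must be removed or commented out.
    output.logstash:
      hosts: ["localhost:5044"]

Restart Filebeat after changing this; if new events still arrive in the filebeat-* index, the running instance is probably reading a different configuration file than the one you edited.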