Grok works in the debugger but not in Elastic

Hi, I'm running a pipeline with a grok filter and a date filter. The date filter works fine. But even though my grok patterns match the logs in the debugger, in Elastic I don't see them parsed; it's as if my grok filter is being ignored.

This is the pipeline:

input {
  beats {
    port => 19044
  }
}

filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp}", "%{LOGLEVEL:loglevel}", "%{HOSTNAME:server}", "%{WORD}", "%{WORD}", "%{GREEDYDATA:message}", "%{GREEDYDATA}",
        "\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{NOTSPACE:loglevel}\]\[%{DATA}\s*\]%{GREEDYDATA:message}",
        "%{DATE:date}_%{TIME:time}\s+\[%{USERNAME}\]\s+%{WORD:loglevel}\s+%{GREEDYDATA:message}",
        "%{MONTHNUM:monthnum}%{MONTHDAY:monthday} %{TIME:time} %{LOGLEVEL:loglevel}\s+\[%{DATA}\]\s+%{GREEDYDATA:message}",
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{NOTSPACE:loglevel}\s+\[%{DATA}\s*\]%{GREEDYDATA:message}",
        "\[%{TIMESTAMP_ISO8601:timestamp}\]\s+%{GREEDYDATA:message}",
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{USERNAME:loglevel}\s+\[%{USERNAME}]\s+%{GREEDYDATA:message}"
      ]
    }
  }

  date {
    match => ["timestamp", "YYYY-MM-dd'T'HH:mm:ss.SSSZ", "YYYY-MM-dd HH:mm:ss,SSS", "YYYY-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }

  # Remove the original `timestamp` field to keep the document clean
  mutate {
    remove_field => ["timestamp"]
  }

}

output {
  elasticsearch {
    hosts => ["https://xxxxxxxxxxxxxxxxxxxxx:9200"]
    index => "pippo-project"
    user => "xxxxxxxxxxxxx"
    password => "xxxxxxxxxxxxxxx"
    ssl => true
    ssl_certificate_verification => true
    cacert => "/etc/logstash/new-ingest-ca.crt"
  }
}

This is a log in Elastic that is not being parsed:

[2025-11-25 10:09:51] > Applying the new value [ 0 T ] to MTF#4525225/02#RATE_DIFF#ALLUB5#INMNINS

In Kibana my message is:

[2025-11-25 10:09:51] > Applying the new value [ 0 T ] to MTF#4525208/00#RATE_DIFF#SISMA2#OUTMNINS

And I don't get any _grokparsefailure.

Could someone help, please?

Is it because this is the first pattern that matches?
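A minimal sketch of the likely cause, assuming standard grok behaviour: patterns are tried in order and are not anchored, so the bare `%{TIMESTAMP_ISO8601:timestamp}` entry at the top of your list matches the timestamp inside `[2025-11-25 10:09:51]` anywhere in the line and wins before the more specific patterns are tried, which is also why you never see a _grokparsefailure. Anchoring with `^` avoids that (untested against your full log set; `log_message` is just a placeholder name to avoid clobbering the original `message` field):

```
filter {
  grok {
    match => {
      "message" => [
        # Anchored with ^ so a bare timestamp mid-line cannot satisfy the pattern
        "^\[%{TIMESTAMP_ISO8601:timestamp}\]\s+%{GREEDYDATA:log_message}"
      ]
    }
  }
}
```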

OK, now I understand. So how can I grok this log?

"2025-02-26 02:07:29,729", "INFO", "pippo.pluto.it", "esr", "uds", "Loaded 2212 IDs for PIPS", ""

I tried removing the whole first line from the grok patterns and the problem persists. @RainTown, do you have any suggestions?

Well, others here are better than me at grok patterns in Logstash, so…

By the way, the two specific logs you reference look quite different to me.

You should use the csv or the dissect filter; it's much easier in this case.
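For the comma-separated line above, a dissect sketch might look like this (the field names after `server` are guesses on my part, since the sample doesn't say what those columns mean):

```
filter {
  dissect {
    mapping => {
      # Literal quotes and ", " separators copied from the sample line;
      # the final %{} skips the trailing empty field
      "message" => '"%{timestamp}", "%{loglevel}", "%{server}", "%{user}", "%{component}", "%{log_message}", "%{}"'
    }
  }
}
```

The extracted `timestamp` ("2025-02-26 02:07:29,729") should then already be handled by the "YYYY-MM-dd HH:mm:ss,SSS" pattern in your existing date filter.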