Hi, I'm running a pipeline with a grok filter and a date filter. The date filter works fine, but even though my grok patterns match the logs, the documents in Elasticsearch are not parsed; it's as if the grok filter were being ignored.
This is the pipeline:
input {
  beats {
    port => 19044
  }
}
filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp}", "%{LOGLEVEL:loglevel}", "%{HOSTNAME:server}", "%{WORD}", "%{WORD}", "%{GREEDYDATA:message}", "%{GREEDYDATA}",
        "\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{NOTSPACE:loglevel}\]\[%{DATA}\s*\]%{GREEDYDATA:message}",
        "%{DATE:date}_%{TIME:time}\s+\[%{USERNAME}\]\s+%{WORD:loglevel}\s+%{GREEDYDATA:message}",
        "%{MONTHNUM:monthnum}%{MONTHDAY:monthday} %{TIME:time} %{LOGLEVEL:loglevel}\s+\[%{DATA}\]\s+%{GREEDYDATA:message}",
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{NOTSPACE:loglevel}\s+\[%{DATA}\s*\]%{GREEDYDATA:message}",
        "\[%{TIMESTAMP_ISO8601:timestamp}\]\s+%{GREEDYDATA:message}",
        "%{TIMESTAMP_ISO8601:timestamp}\s+%{USERNAME:loglevel}\s+\[%{USERNAME}]\s+%{GREEDYDATA:message}"
      ]
    }
  }
  date {
    match => ["timestamp", "YYYY-MM-dd'T'HH:mm:ss.SSSZ", "YYYY-MM-dd HH:mm:ss,SSS", "YYYY-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
  # Remove the original `timestamp` field to keep the document clean
  mutate {
    remove_field => ["timestamp"]
  }
}
output {
  elasticsearch {
    hosts => ["https://xxxxxxxxxxxxxxxxxxxxx:9200"]
    index => "pippo-project"
    user => "xxxxxxxxxxxxx"
    password => "xxxxxxxxxxxxxxx"
    ssl => true
    ssl_certificate_verification => true
    cacert => "/etc/logstash/new-ingest-ca.crt"
  }
}
This is a log in Elastic that is not being parsed:
[2025-11-25 10:09:51] > Applying the new value [ 0 T ] to MTF#4525225/02#RATE_DIFF#ALLUB5#INMNINS
In Kibana the message field is:
[2025-11-25 10:09:51] > Applying the new value [ 0 T ] to MTF#4525208/00#RATE_DIFF#SISMA2#OUTMNINS
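What I would expect instead, assuming the "\[%{TIMESTAMP_ISO8601:timestamp}\]\s+%{GREEDYDATA:message}" pattern from my list is the one that matches, is roughly these fields (just a sketch of the output I'm hoping for, not what I actually see):
timestamp => "2025-11-25 10:09:51"
message => "> Applying the new value [ 0 T ] to MTF#4525208/00#RATE_DIFF#SISMA2#OUTMNINS"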
And I don't get any _grokparsefailure tag.
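In case it helps with testing, this is a minimal sketch of a pipeline I could run locally to check the grok behaviour in isolation (stdin input and a rubydebug stdout output in place of beats and elasticsearch; the pattern list is shortened here to the one that should match the sample line):
input { stdin { } }
filter {
  grok {
    # same style of pattern as in the real pipeline, reduced to the one relevant for the sample log
    match => { "message" => [ "\[%{TIMESTAMP_ISO8601:timestamp}\]\s+%{GREEDYDATA:message}" ] }
  }
}
output { stdout { codec => rubydebug } }
Running it with something like bin/logstash -f test.conf and pasting the sample line should show directly what the grok block produces.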
Could someone help, please?