I am filtering my log messages with grok, but when I check Kibana, the new fields appear on the left side of the page with no values, and the events are also tagged with _grokparsefailure.
Here's an example of my log message:
[2022-09-28 18:11:25,144] {processor.py:641} INFO - Processing file /opt/airflow/dags/dag_filtered.py for tasks to queue
Here's my Logstash config file:
input {
  beats {
    port => 5044
    codec => "line"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}]%{DATA:class} %{SPACE}%{LOGLEVEL:loglevel} -%{GREEDYDATA:logMessage}" }
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp", "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["${IP}:9200"]
    index => "logss-%{+YYYY.MM.dd}"
  }
}
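In case it helps to reproduce this, the filter can be tested in isolation with a minimal pipeline like the sketch below: it is the same grok filter, with stdin/stdout swapped in for beats/elasticsearch and the rubydebug codec added for inspection (test.conf is just a placeholder name):

input {
  stdin { }
}

filter {
  # Same grok filter as in the real pipeline above.
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}]%{DATA:class} %{SPACE}%{LOGLEVEL:loglevel} -%{GREEDYDATA:logMessage}" }
    overwrite => [ "message" ]
  }
}

output {
  # Pretty-prints each event, including any _grokparsefailure tag.
  stdout { codec => rubydebug }
}

Pasting the sample line into bin/logstash -f test.conf then shows the extracted fields on stdout.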
And here's my Filebeat configuration:
filebeat.inputs:
- type: filestream
  id: my-filestream-id
  enabled: true
  paths:
    - /home/ubuntu/logs/**/*.log

filebeat.config.modules:
  path: /etc/filebeat/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

output.logstash:
  hosts: ["${ip}:5044"]

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - drop_fields:
      fields: ["agent", "cloud", "ecs", "host", "input", "tags", "log.offset"]
      ignore_missing: true
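To check what Filebeat actually ships to Logstash, the output can be temporarily switched to the console; this is only a debugging sketch (Filebeat allows a single output at a time, so output.logstash has to be disabled while it is enabled):

# Temporary debug output: prints every published event as pretty JSON.
# Only one output may be enabled at a time, so disable output.logstash first.
output.console:
  pretty: true

That makes it easy to see the exact message field that grok receives on the Logstash side.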
When I test my log message against this grok pattern in the Grok Debugger, it parses fine. So what am I missing?
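For reference, this is roughly what the debugger extracts from the sample line (reconstructed by hand from the pattern, so the exact whitespace in the values may differ):

{
  "timestamp": "2022-09-28 18:11:25,144",
  "class": " {processor.py:641}",
  "loglevel": "INFO",
  "logMessage": " Processing file /opt/airflow/dags/dag_filtered.py for tasks to queue"
}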