I have filtered my log messages using grok, but when I check Kibana, the new fields appear on the left side of the page yet they are empty. I am also getting the _grokparsefailure tag.
Here are two examples of my log messages:
[2022-09-28 18:11:25,144] {processor.py:641} INFO - Processing file /opt/airflow/dags/dag_filtered.py for tasks to queue
[2022-09-23 22:51:02,857] {logging_mixin.py:115} INFO - [2022-09-23 22:51:02,857] {dag.py:2379} INFO - Sync 1 DAGs
I am guessing that the error is because they have different patterns? I tried to fix this by adding multiple match patterns to my Logstash configuration (shared below), but I still get the _grokparsefailure tag.
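To be concrete, this is the kind of multi-pattern grok filter I mean (a sketch, not my exact configuration; field names like log_time and source_file are just illustrative):

filter {
  grok {
    # grok tries the patterns in order and stops at the first match
    # (break_on_match defaults to true), so the more specific nested
    # pattern has to come first
    match => {
      "message" => [
        "\[%{TIMESTAMP_ISO8601:log_time}\] \{%{DATA:source_file}:%{INT:line_no}\} %{LOGLEVEL:level} - \[%{TIMESTAMP_ISO8601:inner_time}\] \{%{DATA:inner_source}:%{INT:inner_line}\} %{LOGLEVEL:inner_level} - %{GREEDYDATA:log_message}",
        "\[%{TIMESTAMP_ISO8601:log_time}\] \{%{DATA:source_file}:%{INT:line_no}\} %{LOGLEVEL:level} - %{GREEDYDATA:log_message}"
      ]
    }
  }
}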
Add handling for parse failures so you know which line/data caused the error.
In ELK 8+ you will already have the "original" field; on older versions, make sure you do not overwrite the "message" field.
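If you are on an older version and want to keep the raw line explicitly, a minimal sketch (the target field name raw_message is an arbitrary choice) is to copy it before grok runs:

filter {
  # keep an untouched copy of the incoming line so that events
  # tagged _grokparsefailure still show exactly what arrived
  mutate { copy => { "message" => "raw_message" } }
}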
Avoid GREEDYDATA; use DATA, which is faster.
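One gotcha with DATA, shown with a hypothetical pattern: it is non-greedy, so at the end of a pattern it needs the $ anchor, otherwise it captures an empty string:

filter {
  grok {
    # DATA is .*? (lazy); without the trailing $ it would match
    # zero characters and log_message would come out empty
    match => { "message" => "%{LOGLEVEL:level} - %{DATA:log_message}$" }
  }
}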
For the failure handling, an output section along these lines keeps the failed events somewhere you can inspect them:

output {
  if "_grokparsefailure" in [tags] {
    # send failed events to their own index...
    elasticsearch {
      hosts => ["${IP}:9200"]
      index => "grokfailure-%{+YYYY.MM.dd}"
    }
    # ...or save them in a file instead
    file { path => "/path/grokfailure_%{+YYYY-MM-dd}.txt" }
  } else {
    elasticsearch {
      hosts => ["${IP}:9200"]
      index => "logss-%{+YYYY.MM.dd}"
    }
  }
}
If that is one of your patterns, it should always match and you should never get a _grokparsefailure tag. That suggests you are not running the configuration that you think you are.