Hello everyone. Here is my Logstash config:
input {
  file {
    path => ["/usr/local/airflow/logs/*/*/*/*.log"]
    codec => "line"
  }
}
filter {
  grok {
    match => { "path" => "/usr/local/airflow/logs/(?<dag_id>.*?)/(?<run_id>.*?)/(?<task_id>.*?)/(?<try_number>.*?).log$" }
  }
  mutate {
    add_field => {
      "log_id" => "%{[dag_id]}-%{[run_id]}-%{[task_id]}-%{[try_number]}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://xx.xx.x.xxx:9200"]
    index => "airflow-logs"
    user => 'elastic'
    password => 'default'
  }
}
Logstash log:
[2025-08-22T07:16:29,669][DEBUG][logstash.inputs.file ] Received line {:path=>"/usr/local/airflow/logs/dag_id=abonent_info_delivery_janissary/run_id=manual__2025-08-22T07:13:29.991136+00:00/task_id=abonent_1_3-pipeline.tsv_to_incremental/attempt=1.log", :text=>"PROCESS STARTED - CONVERT TSV-FILES FROM HDFS://DATA/ABONENT_INFO_DELIVERY_JANISSARY/TSV/\xD0\x9C\xD0\xA2\xD0\xA1_3/2025-08-22/1078317 TO HBASE AND SAVE TO HDFS://DATA/ABONENT_INFO_DELIVERY_JANISSARY/INCREMENTAL/\xD0\x9C\xD0\xA2\xD0\xA1_3/2025-08-22/1078317"}
But that's it! After this line, nothing is sent to Elasticsearch. Please help me.
I also tried the pattern in the Kibana Grok Debugger.
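In case it helps, this is what the pattern captures when run against the path from the debug line above. I checked it with an equivalent Python regex (Python uses `(?P<name>...)` for named groups where grok uses `(?<name>...)`; grok's Oniguruma engine should behave the same way for this pattern), so the captures below are from that Python check, not from grok itself:

```python
import re

# The grok pattern from the config, with (?<name>...) rewritten
# as Python's (?P<name>...) named-group syntax.
pattern = re.compile(
    r"/usr/local/airflow/logs/(?P<dag_id>.*?)/(?P<run_id>.*?)"
    r"/(?P<task_id>.*?)/(?P<try_number>.*?).log$"
)

# The actual path reported in the Logstash DEBUG line.
path = (
    "/usr/local/airflow/logs/dag_id=abonent_info_delivery_janissary"
    "/run_id=manual__2025-08-22T07:13:29.991136+00:00"
    "/task_id=abonent_1_3-pipeline.tsv_to_incremental/attempt=1.log"
)

m = pattern.search(path)
print(m.groupdict())
# → {'dag_id': 'dag_id=abonent_info_delivery_janissary',
#    'run_id': 'run_id=manual__2025-08-22T07:13:29.991136+00:00',
#    'task_id': 'task_id=abonent_1_3-pipeline.tsv_to_incremental',
#    'try_number': 'attempt=1'}
```

So the regex itself does match the path, but the `dag_id=`/`run_id=`/`task_id=`/`attempt=` prefixes end up inside the captured values (and `try_number` gets `attempt=1`, not `1`).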