I am able to get logs into Elasticsearch using the following prospector config in /etc/filebeat/conf.d/lwrp-prospector-airflow.yml:
---
filebeat:
  prospectors:
    - paths:
        - "/var/log/airflow/*.log"
      enabled: true
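(As far as I can tell, a Filebeat 6.x prospector can also point at an Elasticsearch ingest pipeline directly via a `pipeline` option. A minimal sketch of that alternative, assuming a pipeline with the hypothetical id `airflow-logs` has already been loaded into Elasticsearch; I'd prefer to use the module mechanism instead:)

```yaml
filebeat:
  prospectors:
    - paths:
        - "/var/log/airflow/*.log"
      enabled: true
      # "airflow-logs" is a hypothetical pipeline id; the pipeline must
      # already exist in Elasticsearch for this setting to take effect
      pipeline: airflow-logs
```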
But I want to chop up the log line, and I came up with a grok pattern to do so. So I copied an existing module (apache2) to a new folder and modified its contents, adding my grok pattern to /usr/share/filebeat/module/airflow/dags/ingest/default.json:
{
  "description": "Pipeline for parsing Airflow logs.",
  "processors": [{
    "grok": {
      "field": "message",
      "patterns": [
        "\\[%{TIMESTAMP_ISO8601:timestamp}\\] \\{\\{%{NOTSPACE:scriptname}:%{NUMBER:dag_id}}\\} %{LOGLEVEL:level} - %{GREEDYDATA:airflow_message}"
      ],
      "ignore_missing": true
    }
  }, {
    "remove": {
      "field": "message"
    }
  ......
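(As a sanity check, I believe the pipeline JSON can also be loaded into Elasticsearch by hand via the ingest API. A sketch, assuming a local Elasticsearch on port 9200 and the hypothetical pipeline id `airflow-logs`; the file would need to be valid, complete JSON — note the single backslashes in the grok pattern above would have to be escaped as `\\` for JSON:)

```shell
# Load the pipeline definition into Elasticsearch's ingest node by hand.
# "airflow-logs" is a hypothetical pipeline id; localhost:9200 assumes a
# local Elasticsearch instance.
curl -XPUT 'http://localhost:9200/_ingest/pipeline/airflow-logs' \
     -H 'Content-Type: application/json' \
     -d @/usr/share/filebeat/module/airflow/dags/ingest/default.json
```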
But now I'm stuck: how do I tell the prospector to use this ingest pipeline from my custom module?
Any help is appreciated.
Ubuntu: 16.04
Filebeat: 6.0.1