I'm working on ingesting DB2 logs from db2diag.log into Elastic. Here are the steps I have taken so far and how much of it is working:
- Index template: created `logs-db2.diag` with index pattern `logs-db2.diag-*`
- Fleet policy: added a Custom Logs integration to the existing policy for the DB2 servers
- Integration settings of interest:
  - Log path: `/home/db2inst1/.../db2diag.log`
  - Dataset name: `db2.diag`
  - Custom config: multiline matching on the date
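For reference, the multiline part of the custom configuration looks roughly like this (a sketch for the filestream-based Custom Logs input; the exact pattern is an assumption based on db2diag's `2023-01-15-10.30.45.123456`-style timestamps):

```yaml
parsers:
  - multiline:
      type: pattern
      # db2diag.log records begin with a timestamp like 2023-01-15-10.30.45.123456
      pattern: '^\d{4}-\d{2}-\d{2}-\d{2}\.\d{2}\.\d{2}'
      negate: true
      match: after
```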
- Ingest pipelines:
  - `logs-db2.diag-2.3.3` (does not appear to be invoked)
  - `logs-db2.diag@custom` (does not appear to be invoked)
- Mappings:
  - `logs-db2.diag@package`
- Pipeline `logs-db2.diag@custom`, processors:
  - `append`: adds a simple field for debug purposes
  - `pipeline`: calls `global@custom`
- Pipeline `global@custom`:
  - Painless script that adds `solution.stack` = `"stage"` based on the agent hostname
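For concreteness, here are sketches of the template and the @custom pipeline (names taken from above; the appended field name and other values are illustrative, not my exact settings):

```console
PUT _index_template/logs-db2.diag
{
  "index_patterns": ["logs-db2.diag-*"],
  "data_stream": {},
  "priority": 200,
  "template": {
    "settings": {},
    "mappings": {}
  }
}

PUT _ingest/pipeline/logs-db2.diag@custom
{
  "processors": [
    { "append": { "field": "debug.trace", "value": ["logs-db2.diag@custom ran"] } },
    { "pipeline": { "name": "global@custom" } }
  ]
}
```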
What's working:

I get documents all the way into `logs-db2.diag-default` and can see them in Discover. This tells me that:
- the Fleet agent is running
- the agent is ingesting db2diag.log
- multiline parsing works (multiline messages are correctly combined into single events)
- the index template is found and used
- the data stream was created
- `global@custom` works when invoked from other pipelines, such as the apache_tomcat integration
What's not working:

The pipelines referenced above (`logs-db2.diag-2.3.3` and `logs-db2.diag@custom`) do not appear to be invoked during ingest.
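As far as I understand, an ingest pipeline only runs if something references it, so one thing worth checking is whether a default or final pipeline is actually attached to the backing index (a sketch; `filter_path` just trims the output, and I'm not sure I'm reading the result correctly):

```console
GET logs-db2.diag-default/_settings?include_defaults=true&filter_path=**.index.default_pipeline,**.index.final_pipeline
```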
Additional testing

Through the console, I posted test documents containing a `message` field through the `logs-db2.diag@custom` pipeline into a scratch index using `POST my_test_index/_doc?pipeline=logs-db2.diag@custom`. This was successful.
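A related test that avoids creating a scratch index is the simulate API (the sample message below is a placeholder, not a real db2diag record):

```console
POST _ingest/pipeline/logs-db2.diag@custom/_simulate
{
  "docs": [
    { "_source": { "message": "placeholder multiline db2diag entry" } }
  ]
}
```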
I'm new to the Elastic Stack and this is a POC environment. In addition to answering this question, can somebody point me in the direction of good debugging tools for ingest pipelines, such as invocation counts?
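(While searching, I did come across the node ingest stats, which seem to report per-pipeline counts and failures; confirmation that this is the right tool would be welcome:)

```console
GET _nodes/stats/ingest?filter_path=nodes.*.ingest.pipelines
```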