I am trying to understand ES ingest pipelines and Filebeat.
I am running Elasticsearch, Kibana, and Filebeat, all version 7.11, in Docker containers.
If I do not add the pipeline, I get the log lines into ES/Kibana from Filebeat, but they are not formatted very well: everything is indexed into a single 'message' keyword. So I want to create a pipeline that breaks each log line down into separate fields using tab separators.
I have created the pipeline in Kibana Dev Tools:
PUT _ingest/pipeline/rcp_log_pipeline_tab
{
  "description" : "rcp log tab pattern",
  "processors" : [
    {
      "csv" : {
        "field" : "message",
        "target_fields" : [
          "timestamp",
          "relativeTime",
          "thread",
          "processName",
          "sourceName",
          "logType",
          "logMessage"
        ],
        "separator" : "\t"
      }
    },
    {
      "rename" : {
        "field" : "timestamp",
        "target_field" : "@timestamp"
      }
    },
    {
      "rename" : {
        "field" : "@timestamp",
        "target_field" : "index_timestamp"
      }
    }
  ]
}
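As a sanity check against a typo in the pipeline name, the stored definition can be fetched back in Dev Tools with:

GET _ingest/pipeline/rcp_log_pipeline_tab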
and I have simulated it with:
POST _ingest/pipeline/rcp_log_pipeline_tab/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2021-01-02T00:01:00.134-08:00\t1047176101054\t0x0017\tUS-W10L2.Axxion.ToolSpud.\tIoProvider\tBackground\tPerforming BankReadIOPoints"
      }
    }
  ]
}
And I get the results I am expecting. Perfect!
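For reference, the relevant part of the simulate response looks roughly like this (abridged; the original message field and the _ingest metadata are trimmed out):

{
  "docs" : [
    {
      "doc" : {
        "_source" : {
          "index_timestamp" : "2021-01-02T00:01:00.134-08:00",
          "relativeTime" : "1047176101054",
          "thread" : "0x0017",
          "processName" : "US-W10L2.Axxion.ToolSpud.",
          "sourceName" : "IoProvider",
          "logType" : "Background",
          "logMessage" : "Performing BankReadIOPoints"
        }
      }
    }
  ]
}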
So I updated the output section of my filebeat.yml to use this pipeline:
output.elasticsearch:
  hosts: ['${ELASTICSEARCH_HOST_PORT}']
  username: '${ELASTIC_USERNAME}'
  password: '${ELASTIC_PASSWORD}'
  pipeline: 'rcp_log_pipeline_tab'
Then I deleted the filebeat-* index in Kibana, removed the Filebeat registry so it would resend the log files, and restarted the Filebeat container. Now I get nothing from Filebeat: the index is created, but it is empty.
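One thing I can still run is Filebeat's own built-in config and output checks from inside the container (assuming the container is named filebeat; both subcommands are part of the filebeat binary):

docker exec -it filebeat filebeat test config
docker exec -it filebeat filebeat test output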
Everything I find scouring the web seems to indicate this is what I should be doing, so why is it not working?
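If it helps with diagnosing, the Filebeat container's own log can be pulled with something like this (container name assumed):

docker logs filebeat 2>&1 | grep -iE 'error|warn'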
Thanks for taking a look!