Hi community, I'm sharing my experience ingesting Docker logs with a Fleet-managed agent. My goal was to ingest Docker logs from a specific application, running on-premises on a specific system, into a custom data stream.
First I had to prepare the Docker environment and ensure the application sends its logs to stdout. This allows the Docker integration to pick the logs up natively.
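As an illustration, a minimal docker-compose service might look like the sketch below. The service name and image are assumptions for the example; the important part is that the process logs to stdout and the default json-file logging driver is kept, so the Docker integration can read the container logs.

```yaml
services:
  gitlab-runner:
    image: gitlab/gitlab-runner:latest   # assumed image for this example
    container_name: gitlab-runner
    restart: unless-stopped
    # Keep the default json-file logging driver so the
    # Elastic Agent Docker integration can read the logs.
    logging:
      driver: json-file
```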
Next, I wanted to store the logs in a specific data stream of my choosing, such as logs-gitlab_runner.log-default.
To store the logs in a custom data stream, it must be created beforehand via the API, as mentioned in the docs:
PUT _data_stream/logs-gitlab_runner.log-default
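This works without a custom index template because the name matches the built-in logs-*-* template. You can verify the data stream was created with:

```
GET _data_stream/logs-gitlab_runner.log-default
```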
With the new data stream in place, the Docker integration can be added to a Fleet agent policy. In the integration settings, make sure to expand Advanced Options and grant re-route permission to the new data stream: logs-gitlab_runner.log-default. Note that this does not automatically re-route the logs to the new data stream; it only allows the agent to write there.
I also wanted to collect logs from one specific container only, so update the condition to match that container name: ${docker.container.name} == 'gitlab-runner'
Another important part of the integration config is the processors section (YAML). My application writes its logs as JSON lines. All the stdout messages from Docker parsed by Filebeat go into the message field, and that is what we work with.
The decode_json_fields processor below parses the JSON from the application into a new root field named gitlab. The application also emits a time field, which the timestamp processor uses to set the event time.
- decode_json_fields:
    fields: ["message"]
    target: "gitlab"
    overwrite_keys: true
    process_array: false
- timestamp:
    field: gitlab.time
    layouts:
      - '2006-01-02T15:04:05Z'
      - '2006-01-02T15:04:05.999Z'
      - '2006-01-02T15:04:05.000000Z'
      - '2006-01-02T15:04:05.999999999Z'
    test:
      - '2026-03-20T15:57:27.850496654Z'
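To illustrate what these processors do, assume the application emits a JSON line like the one below on stdout (the field names other than time are made up for the example). Filebeat puts the whole line into message; decode_json_fields then produces gitlab.level, gitlab.msg, and gitlab.time, and the timestamp processor copies gitlab.time into the event timestamp.

```json
{"level":"info","msg":"job succeeded","time":"2026-03-20T15:57:27.850496654Z"}
```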
Next we must prepare the ingest pipeline. My requirement is to send the logs to a different data stream, and to do this a custom ingest pipeline for the Docker integration must be used to re-route the data. Create a new custom ingest pipeline for the Docker integration named logs-docker.container_logs@custom with the JSON config below:
[
  {
    "reroute": {
      "destination": "logs-gitlab_runner.log-default",
      "if": "ctx?.container?.name?.contains('gitlab-runner')",
      "tag": "reroute-gitlab-datastream"
    }
  }
]
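If you prefer to create the pipeline via the API instead of the Kibana UI, the equivalent request looks like this:

```
PUT _ingest/pipeline/logs-docker.container_logs@custom
{
  "processors": [
    {
      "reroute": {
        "destination": "logs-gitlab_runner.log-default",
        "if": "ctx?.container?.name?.contains('gitlab-runner')",
        "tag": "reroute-gitlab-datastream"
      }
    }
  ]
}
```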
Note that the fields data_stream.dataset and event.dataset are not updated automatically and will still point to docker.container_logs unless you fix them in the integration pipeline!
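One way to fix that (a sketch, using the same dataset name as the destination; adjust to your naming) is to add set processors before the reroute in the same custom pipeline:

```json
{
  "set": {
    "field": "data_stream.dataset",
    "value": "gitlab_runner.log",
    "if": "ctx?.container?.name?.contains('gitlab-runner')"
  }
},
{
  "set": {
    "field": "event.dataset",
    "value": "gitlab_runner.log",
    "if": "ctx?.container?.name?.contains('gitlab-runner')"
  }
}
```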
Hope you found this useful.