Hi,
I ran into an issue with different docker-compose versions and how their metadata gets parsed.
(Infrastructure is: Filebeat -> Logstash -> Elasticsearch)
First, the error message from Logstash:
failed to parse field [docker.container.labels.com.docker.compose.project] of type [text]
docker inspect output from one machine:
"Labels": {
    "com.docker.compose.config-hash": "c19ea02a165f8a57ab71375089cbe695f1954aa33391f691391eeb8d1eee644b",
    "com.docker.compose.container-number": "1",
    "com.docker.compose.oneoff": "False",
    "com.docker.compose.project": "import_execution",
    "com.docker.compose.service": "proxy",
    "com.docker.compose.version": "1.24.0",
    "maintainer": "company"
}
The same labels from another docker-compose version:
"Labels": {
    "com.docker.compose.config-hash": "7c6d5f1b83896e155f2605a38e09ab6d2cb4025f5603da0f40aa56fcb46964e4",
    "com.docker.compose.container-number": "1",
    "com.docker.compose.oneoff": "False",
    "com.docker.compose.project": "integration-environment",
    "com.docker.compose.project.config_files": "docker-compose.yml",
    "com.docker.compose.project.working_dir": "XYZ",
    "com.docker.compose.service": "proxy",
    "com.docker.compose.version": "1.26.2",
    "maintainer": "company"
}
The first index, created from the older logs, mapped com.docker.compose.project as a string. The second environment then tries to write com.docker.compose.project as an object, because the new com.docker.compose.project.config_files and com.docker.compose.project.working_dir labels turn "project" into a parent object, and a field can't be both text and object in the same mapping.
Is there a way to handle this problem? It could happen again with future docker-compose / Docker updates, and we don't want to reindex/refresh all log indices every time.
I tried to manage this via json.keys_under_root: true, but the fields always end up structured as an object in Kibana.
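One way to make the mapping robust against future label changes would be an index template that maps the whole labels object as a single flattened field (available since Elasticsearch 7.3, basic license). This is only a sketch; the template name and index pattern below are assumptions, adjust them to your setup:

PUT _template/docker-logs
{
  "index_patterns": ["filebeat-*"],
  "mappings": {
    "properties": {
      "docker": {
        "properties": {
          "container": {
            "properties": {
              "labels": {
                "type": "flattened"
              }
            }
          }
        }
      }
    }
  }
}

With flattened, the entire labels object is stored as one field, so new dotted label keys from future docker-compose versions can't cause text-vs-object mapping conflicts (at the cost of limited query features on those keys). It only takes effect for newly created indices.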
filebeat.yml:

filebeat.inputs:
- type: docker
  enabled: true
  containers.ids: '*'
  json.keys_under_root: true
  json.add_error_key: true
  multiline.pattern: '^[0-9]{4}'
  multiline.negate: true
  multiline.match: after

processors:
- add_docker_metadata: ~
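Alternatively, the add_docker_metadata processor supports de-dotting container labels, which turns com.docker.compose.project into com_docker_compose_project and sidesteps the text-vs-object conflict entirely. A sketch, assuming your Filebeat version supports the labels.dedot option (check the docs for your release):

processors:
- add_docker_metadata:
    labels.dedot: true

If you'd rather fix it downstream, the Logstash de_dot filter plugin can do the same renaming before the event reaches Elasticsearch, but either way existing indices keep their old mapping until they roll over.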