I'm using Filebeat 7.12.0 to ingest Docker logs with the container input. Log entries in Docker's json-file format look like this:
{"log":"{\"@timestamp\":\"2022-02-15T17:45:26.742Z\",\"log.level\":\"info\",\"message\":\"Auditor starting\",\"ecs\":{\"version\":\"1.6.0\"},\"log\":{\"logger\":\"cureatr.config.standalone\",\"origin\":{\"file\":{\"line\":237,\"name\":\"standalone.py\"},\"function\":\"process\"}},\"process\":{\"name\":\"MainProcess\",\"pid\":9,\"thread\":{\"id\":139770174854976,\"name\":\"MainThread\"}},\"service.name\":\"auditor\"}\n","stream":"stderr","time":"2022-02-15T17:45:26.743062038Z"}
Using this filebeat config:
- json.add_error_key: true
  json.expand_keys: true
  json.keys_under_root: true
  json.message_key: log
  json.overwrite_keys: true
  paths:
    - /site/docker/containers/*/*.log
  type: container
filebeat fails to parse the logs with this error:
2022-02-15T17:50:37.807Z ERROR [jsonhelper] jsontransform/jsonhelper.go:34 JSON: failed to expand fields: cannot expand "log.level": found conflicting key
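As far as I can tell, this is the collision: the decoded document contains both the dotted key "log.level" and a "log" object (excerpt from the sample above), and json.message_key: log also points at the outer "log" field, so expanding "log.level" into a nested "log" object runs into an existing "log" key:

    {
      "log.level": "info",
      "log": {
        "logger": "cureatr.config.standalone",
        "origin": {"file": {"line": 237, "name": "standalone.py"}, "function": "process"}
      }
    }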
If I comment out json.expand_keys: true, filebeat decodes the event but Elasticsearch refuses to index it:
2022-02-15T17:45:33.657Z WARN [elasticsearch] elasticsearch/client.go:408 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0x2c3a0580, ext:63780543926, loc:(*time.Location)(nil)}, Meta:null, Fields:{"agent":{"ephemeral_id":"f4f93b64-0f33-4d41-8cb8-a8130c20f7e6","hostname":"playtools1","id":"b6040b83-f25b-4729-9292-59cb4b7c73da","name":"playtools1","type":"filebeat","version":"7.12.0"},"ecs":{"version":"1.6.0"},"error":{"message":"Value of key 'log' is not a string","type":"json"},"host":{"name":"playtools1"},"input":{"type":"container"},"log":"","log.level":"info","message":"Auditor starting","process":{"name":"MainProcess","pid":9,"thread":{"id":139770174854976,"name":"MainThread"}},"service.name":"auditor","stream":"stderr"}, Private:file.State{Id:"native::534643-66306", PrevId:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000117a00), Source:"/site/docker/containers/800b50f4f25eda132bc56cfc62746cfa2dff8af47852e1508f84fd0785f12c7b/800b50f4f25eda132bc56cfc62746cfa2dff8af47852e1508f84fd0785f12c7b-json.log", Offset:986, Timestamp:time.Time{wall:0xc07b178f3b188019, ext:310110818695, loc:(*time.Location)(0x4163ac0)}, TTL:-1, Type:"container", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x82873, Device:0x10302}, IdentifierName:"native"}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [log] tried to parse field [log] as object, but found a concrete value"}
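Pulling the relevant fields out of that event for readability: the inner "log" object is apparently discarded because json.message_key: log expects a string, leaving an empty "log" alongside the dotted "log.level", and Elasticsearch then rejects the concrete value because the index maps "log" as an object:

    "error": {"message": "Value of key 'log' is not a string", "type": "json"},
    "log": "",
    "log.level": "info",
    "message": "Auditor starting",
    "service.name": "auditor"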
Is there any way to make filebeat parse these logs as JSON and expand the dotted keys?