I'm pulling my hair out, please help!! I have a very basic setup:
Filebeat (7.6.2) is sending Docker logs to Elasticsearch (7.6.2), and everything works exactly as expected EXCEPT that the actual Docker log messages are not being decoded into their individual pieces of information, such as the client IP.
A snippet of my config from filebeat.yml:
filebeat.inputs:
- type: container
  paths:
    - "/var/lib/docker/containers/*/*.log"
In Kibana I have the complete log line sitting in the message
field. An extract from the JSON document is here:
"stream": "stdout",
"message": "49.38.129.6:56661 - - [21/Apr/2020:14:33:23 +0000] \"POST /xxxxx HTTP/1.1\" 200 4271 \"-\" \"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)\" 1.2.3.4",
"log": {
"file": {
"path": "/var/lib/docker/containers/c4948ade03f1f76735cd3a8768ff25754935332f0f021b65118f72e583f1c426/c4948ade03f1f76735cd3a8768ff25754935332f0f021b65118f72e583f1c426-json.log"
},
"offset": 6293394045
},
Now, I have tried various combinations of the json.* options, including json.keys_under_root: true,
but when this is enabled I receive tons of error messages in the Filebeat logs, such as:
Error decoding JSON: json: cannot unmarshal number into Go value of type map[string]interface {}
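For reference, this is roughly the kind of config I had been experimenting with when those errors appeared (reconstructed from memory, so the exact combination of json.* options may not be the one that triggered it, and I'm not even sure these options are meant to be used with the container input at all):

filebeat.inputs:
- type: container
  paths:
    - "/var/lib/docker/containers/*/*.log"
  # options I tried in various combinations
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: log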
All I'm looking for is to decode the log message and enrich the data with GeoIP information so I can build some nice dashboards.
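In case it makes my goal clearer, my rough understanding is that I need an Elasticsearch ingest pipeline with a grok processor to pull the fields out of message and a geoip processor on the resulting IP, something like the sketch below. The pipeline name docker-access-logs is made up, the grok pattern is only my best guess at my log format, and the field names are what I think the ECS equivalents are:

PUT _ingest/pipeline/docker-access-logs
{
  "description": "My rough attempt at parsing access-log style messages from Docker containers",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{IPORHOST:source.ip}:%{POSINT:source.port} - %{DATA:user.name} \\[%{HTTPDATE:timestamp}\\] \"%{WORD:http.request.method} %{DATA:url.original} HTTP/%{NUMBER:http.version}\" %{NUMBER:http.response.status_code:int} %{NUMBER:http.response.body.bytes:int} \"%{DATA:http.request.referrer}\" \"%{DATA:user_agent.original}\" %{IPORHOST:client.ip}"
        ]
      }
    },
    {
      "geoip": {
        "field": "source.ip",
        "target_field": "source.geo"
      }
    }
  ]
}

and then point Filebeat at it with output.elasticsearch.pipeline: docker-access-logs. But I'm not sure whether that's the right approach at all, or whether the json.* options are supposed to do this for me.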
I'm obviously missing something incredibly simple but just can't see it.
Please help!!