The log file contains JSON log entries like this:
{"priority":"INFO","messageType":"HTTP_RESPONSE","message":{"type":"HTTP_RESPONSE","name":"Create User","status":"200","reason":"OK","response":"{id=1234, name=My Name}"}}
The root-level fields are dynamically indexed. How do I get the fields that are part of the "message" object indexed as well?
It seems that "message" is mapped as type "text" in the Elasticsearch index mapping (see below):
"message": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
},
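For comparison, here is roughly what I expected dynamic mapping to produce for an object value (a sketch based on two of the field names from the log entry above, not an actual mapping from my cluster):

```json
"message": {
  "properties": {
    "status": {
      "type": "text",
      "fields": {
        "keyword": { "type": "keyword", "ignore_above": 256 }
      }
    },
    "reason": {
      "type": "text",
      "fields": {
        "keyword": { "type": "keyword", "ignore_above": 256 }
      }
    }
  }
}
```

Instead, the actual mapping above is a single plain text field.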
Filebeat is set up like this:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\logs\*
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    index: test

output.logstash:
  hosts: ["localhost:5000"]
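In case it's relevant: I believe the same decoding could also be done with Filebeat's decode_json_fields processor instead of the json.* input options (a sketch, not what I'm actually running):

```yaml
processors:
  - decode_json_fields:
      fields: ["message"]   # the raw log line lands in "message" when json.* is not set
      target: ""            # decode into the event root
      overwrite_keys: true
```

Either way, the nested object seems to reach Logstash intact (see the published event below), so I don't think the decoding step itself is the problem.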
Logstash is set up like this:
input {
  beats {
    port => 5000
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => elastic
    password => changeme
    index => "%{[fields][index]}-%{+YYYY.MM.dd}"
  }
}
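To see exactly what Logstash forwards to Elasticsearch, I could also add a debug output next to the elasticsearch one (a sketch; rubydebug prints each event to stdout):

```
output {
  stdout { codec => rubydebug }
}
```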
Filebeat published this event to Logstash:
2018-06-29T13:09:36.968+0200 DEBUG [publish] pipeline/processor.go:291 Publish event: {
  "@timestamp": "2018-06-29T11:09:36.967Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.3.0"
  },
  "prospector": {
    "type": "log"
  },
  "beat": {
    "hostname": "myhost",
    "version": "6.3.0",
    "name": "myhost"
  },
  "message": {
    "reason": "OK",
    "status": "200",
    "type": "HTTP_RESPONSE",
    "name": "Create User",
    "response": "{id=1234, name=My Name}"
  },
  "priority": "INFO",
  "input": {
    "type": "log"
  },
  "host": {
    "name": "myhost"
  },
  "messageType": "HTTP_RESPONSE",
  "fields": {
    "index": "test"
  },
  "source": "C:\\logs\\test.log"
}