Hi all,
We have a new ELK 6.3.0 setup, running the latest 6.3.0 of Filebeat and Logstash.
We are getting a lot of errors when Logstash ships logs to Elasticsearch.
I am not sure why. We have tried loading the Filebeat template into ES via filebeat export template, and we have also tried deleting all the templates; in both cases the logs do not make it into ES.
It appears that ES is expecting host to be a plain string, but the events carry host as an object containing a name field (host.name).
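If I am reading the error right (this is our assumption, not confirmed anywhere in the docs we found), the existing mapping expects something like "host": "ip-10-75-127-242", while the 6.3 events carry it as an object:

```json
{
  "host": {
    "name": "ip-10-75-127-242"
  }
}
```

which would explain the "Can't get text on a START_OBJECT" part of the error.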
EXAMPLE ERROR:
[2018-06-20T09:59:40,784][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.06.20", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2751b7f7>], :response=>{"index"=>{"_index"=>"filebeat-2018.06.20", "_type"=>"doc", "_id"=>"VIzsHWQBHoHS_AvaRHrY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:303"}}}}}
EXAMPLE INPUT:
"@timestamp": "2018-06-20T22:14:23.976Z",
"@metadata": {
"beat": "filebeat",
"type": "doc",
"version": "6.3.0",
"pipeline": "filebeat-6.3.0-system-syslog-pipeline"
},
"beat": {
"name": "ip-10-75-127-242",
"hostname": "ip-10-75-127-242",
"version": "6.3.0"
},
"host": {
"name": "ip-10-75-127-242"
},
"message": "Jun 20 22:11:03 ip-10-75-127-242 kubelet[2155]: I0620 22:11:03.178104 2155 server.go:796] GET /metrics: (3.221285ms) 200 [[python-requests/2.19.1] 127.0.0.1:64936]",
"source": "/var/log/syslog",
"offset": 7358444,
"prospector": {
"type": "log"
},
"input": {
"type": "log"
},
"fileset": {
"module": "system",
"name": "syslog"
}
}
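For what it's worth, a Logstash filter along these lines might work around the conflict by collapsing the host object back to a plain string before the output stage. This is only a sketch based on our assumption that the clash is object-vs-string; the field names are taken from the example event above:

```conf
filter {
  # If the event carries the 6.3-style host object, replace it with the
  # plain hostname string so an existing string mapping can accept it.
  if [host][name] {
    mutate {
      replace => { "host" => "%{[host][name]}" }
    }
  }
}
```

Has anyone confirmed whether this is the right approach, or is there a supported way to make the Filebeat template and Logstash output agree on the host field?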