Hello,
I'm trying to run the ELK stack in the cloud on Kubernetes and ship logs and metrics to it with Beats. Everything is installed via Kubernetes, since everything is in the cloud, and ELK is used to monitor Kubernetes itself.
Currently I have successfully installed Elasticsearch, an ingest node, Kibana, and Metricbeat, and that works quite well. I'm now trying to add Logstash for further filtering, but first I want to run it with the simplest possible configuration to see how it works.
It works partially, but there are two errors that I can't seem to get around.
ELK version: 6.3.2
Errors in Logstash:
1.
[2018-08-28T13:17:08,195][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-6.3.2-2018.08.28", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x6ed25be7>], :response=>{"index"=>{"_index"=>"metricbeat-6.3.2-2018.08.28", "_type"=>"doc", "_id"=>"eI-ugGUB6pUszH98PL6n", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [kubernetes.labels.app]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:680"}}}}}
2.
[2018-08-29T07:20:02,007][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-6.3.2-2018.08.29", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x76960f17>], :response=>{"index"=>{"_index"=>"metricbeat-6.3.2-2018.08.29", "_type"=>"doc", "_id"=>"CSqNhGUBnQ8ZNZeJrGCb", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [kubernetes.labels.app] tried to parse field [app] as object, but found a concrete value"}}}}
Logstash.yaml:
apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-application
  namespace: elk-test
data:
  logstash.conf: |-
    input {
      beats {
        port => 5044
        ssl => true
        ssl_certificate_authorities => ["/etc/server.crt"]
        ssl_certificate => "/etc/server.crt"
        ssl_key => "/etc/server.key"
        ssl_verify_mode => "force_peer"
      }
    }
    output {
      elasticsearch {
        hosts => "elasticsearch:9200"
        user => "user"
        password => "password"
        sniffing => false
        manage_template => false
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }
    }
  logstash.yml: |-
    http.host: "0.0.0.0"
    path.config: /usr/share/logstash/pipeline
    xpack.monitoring.enabled: false
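For completeness, Metricbeat ships to this beats input over TLS with a matching output section along these lines (a sketch from my setup; the hostname and certificate paths are assumptions and may differ in other environments):

```yaml
# Metricbeat output section pointing at the Logstash beats input on port 5044.
# force_peer on the Logstash side means the client must present a certificate.
output.logstash:
  hosts: ["logstash:5044"]
  ssl.certificate_authorities: ["/etc/server.crt"]
  ssl.certificate: "/etc/server.crt"
  ssl.key: "/etc/server.key"
```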
Logstash already ships plenty of metrics to Elasticsearch, but I would like to resolve the errors above.
Error cause:
Metricbeat logs:
"pod":{"name":"connector-reports-nginx-0"}} meta:{"kubernetes":{"container":{"name":"nginx"},"labels":{"app":{"kubernetes":{"io/component":"nginx-server","io/name":"connector-reports"}}
meta:{"kubernetes":{"container":{"name":"kiosegri-test-master-svc"},"labels":{"app":"kiosegri-test-master-svc","name":"master","pod-template-hash":"337889564"},"namespace":"kiosegri-test"
As the logs show, some pods send kubernetes.labels.app as a plain string while others send it as an object (the app.kubernetes.io/... labels get split on the dots), so the two shapes conflict in the same index. Is there a way to mutate the field so that only the component value ends up in kubernetes.labels.app?
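To illustrate what I mean, I was thinking of something along these lines in the Logstash filter section (untested; the field paths are taken from the Metricbeat log excerpts above, and I'm assuming the io/component value is the string I want to keep):

```
filter {
  ruby {
    code => '
      labels = event.get("[kubernetes][labels]")
      if labels.is_a?(Hash) && labels["app"].is_a?(Hash)
        # labels.app was parsed as an object; replace it with the
        # component string so the field always has one type.
        component = labels["app"].dig("kubernetes", "io/component")
        event.set("[kubernetes][labels][app]", component) if component
      end
    '
  }
}
```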
Thanks a lot and if there's anything else needed I'd be happy to provide it.