Hello guys, I'm new here.
At first I just wanted to use Elasticsearch and Filebeat to centralize my servers' logs and integrate them into my Icinga web interface.
I installed Filebeat on the clients and configured them to send data directly to Elasticsearch on my server. That works fine, but I had some problems with the Icinga integration: I could only see the kernel logs, not all the system logs, even though I checked the data in Elasticsearch and it was all there.
To try to make my life easier I installed Kibana, and the dashboard showed all the data I wanted without problems.
Just to test, I installed Logstash on my server and reconfigured Filebeat on the client to send the data to Logstash, and voilà! The Icinga integration works perfectly and shows all the system logs, but now the Kibana dashboard doesn't show any data.
I tried making Logstash write to the Filebeat index, but the result is the same, and I'd really like to get this working properly.
So... long story short:
- If I use Filebeat directly with Elasticsearch, Kibana works perfectly.
- If I use Filebeat with Logstash, Icinga works perfectly but the Kibana dashboard doesn't.
In either case I can see all the data in Kibana's Discover panel.
My Logstash config:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}
#############################################################
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
#############################################################
output {
  elasticsearch {
    hosts => ["http://MY_ELASTICSEARCH_IP:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
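For reference, that index option is a sprintf expression that Logstash expands per event from the @metadata Filebeat attaches. A minimal Python sketch of what the resulting daily index name looks like (the beat version and date here are made-up example values, not from my setup):

```python
from datetime import datetime, timezone

# Hypothetical @metadata fields that Filebeat ships with each event.
metadata = {"beat": "filebeat", "version": "6.5.4"}

# Logstash expands "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
# into one index name per beat version and per day.
index = "{}-{}-{}".format(
    metadata["beat"],
    metadata["version"],
    datetime(2019, 1, 15, tzinfo=timezone.utc).strftime("%Y.%m.%d"),
)
print(index)  # filebeat-6.5.4-2019.01.15
```

So the events end up in daily filebeat-VERSION-DATE indices, which is what the filebeat-* index pattern should match.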
My Elasticsearch config:
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
bootstrap.memory_lock: true
network.host: MY_ELASTICSEARCH_IP
http.port: 9200
The Filebeat config on the client:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true
  reload.period: 10s

setup.template.settings:
  index.number_of_shards: 3

setup.dashboards.enabled: true

setup.kibana:
  host: "MY_KIBANA_IP:5601"

output.logstash:
  hosts: ["MY_LOGSTASH_IP:5044"]
  bulk_max_size: 1024
  index: filebeat
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash.crt"]
  ssl.certificate: "/etc/pki/tls/certs/filebeats.crt"
  ssl.key: "/etc/pki/tls/private/filebeats.key"

processors:
- add_host_metadata: ~
- add_cloud_metadata: ~
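One thing I'm not sure about: with output.logstash enabled, Filebeat's setup phase can't push its index template, dashboards, and index pattern through Logstash, so maybe they need to be loaded with a one-off setup run pointed straight at Elasticsearch and Kibana. Something like this (the hosts are just the placeholders from my configs above):

```shell
# One-off: load the Filebeat index template, Kibana dashboards, and index
# pattern directly, temporarily overriding the Logstash output on the CLI.
filebeat setup -e \
  -E output.logstash.enabled=false \
  -E 'output.elasticsearch.hosts=["MY_ELASTICSEARCH_IP:9200"]' \
  -E setup.kibana.host=MY_KIBANA_IP:5601
```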
Any idea how to make the Kibana dashboard work with the data coming from Logstash?
Thank you very much.
An update:
I deleted all the index patterns and all the Elasticsearch data, then ran Filebeat again (it populates the dashboards and creates the index patterns).
When I checked the index patterns I had two filebeat-* patterns with exactly the same name; in Discover only one of them had data, so I deleted the empty one, and now when I open the dashboards I get the message:
"Could not locate that index-pattern (id: filebeat-*), [click here to re-create it](#/management/kibana/index)"
So how is this possible? The Kibana dashboard is looking for an index pattern with exactly that name, but it's somehow a different one?
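If it helps anyone answering: my understanding (which may be wrong) is that a dashboard stores the saved-object id of its index pattern, not the title, so two patterns can both be titled filebeat-* while the dashboard only resolves one specific id. A toy sketch of that lookup, with a made-up auto-generated id:

```python
# Toy model: Kibana saved objects keyed by id, each with a title attribute.
# Two index patterns can share a title while having different ids.
index_patterns = {
    "filebeat-*": {"title": "filebeat-*"},  # id happens to equal the title
    "c1a2b3d4-e5f6-7890-abcd-ef1234567890": {"title": "filebeat-*"},  # hypothetical auto-generated id
}

def resolve(pattern_id):
    """A dashboard looks up its index pattern by id, never by title."""
    obj = index_patterns.get(pattern_id)
    return obj["title"] if obj else None

# Deleting the object whose id is "filebeat-*" breaks any dashboard that
# references that id, even though a same-titled pattern still exists.
del index_patterns["filebeat-*"]
print(resolve("filebeat-*"))  # None -> "Could not locate that index-pattern (id: filebeat-*)"
```

That would explain the error: the dashboard references the id filebeat-*, and the surviving pattern has the same title but a different id.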