I have 2 droplets configured in my DO project:
- ELK Stack
- Magento eCommerce platform on a LAMP Stack
Both droplets are up and running, with all services set up.
However, I'm having an issue with my Magento server, which is configured as the Filebeat client: it doesn't appear to be shipping its system logs over to the ELK stack.
I've followed the instructions from this KB article: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04
I'm stuck at the "Test Filebeat Installation" section: no matter what I try with the Filebeat configuration, the logs do not seem to be getting sent to the ELK stack. I've gone back through the configurations a few times to make sure everything is set up properly, but it's possible I'm overlooking something simple.
When I search for Filebeat logs on the ELK stack I get 0 hits (see the extract from my terminal below). Can anyone offer guidance or suggestions on where to troubleshoot?
curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty'
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 0,
    "successful" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 0,
    "max_score" : 0.0,
    "hits" : [ ]
  }
}
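In case it's useful, I can also list every index on the ELK droplet to check whether a filebeat-* index was ever created at all (as opposed to the index existing but being empty):

```shell
# Run on the ELK droplet: list all Elasticsearch indices with
# their document counts and sizes.
curl -XGET 'http://localhost:9200/_cat/indices?v'
```

Happy to post that output if it would help.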
Here are the various configs I have set up.
filebeat.yml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log
        #- c:\programdata\elasticsearch\logs\*
        # - /var/www/html/magento/var/log/syslog
        # - /var/www/html/magento/var/log/auth.log
        - /var/www/html/magento/var/log/connector.log
        - /var/www/html/magento/var/log/debug.log
        - /var/www/html/magento/var/log/exception.log
        - /var/www/html/magento/var/log/install.log
        - /var/www/html/magento/var/log/support_report.log
        - /var/www/html/magento/var/log/system.log
      input_type: log
      document_type: syslog
  registry_file: /var/lib/filebeat/registry

output:
  ### Logstash as output
  logstash:
    # The Logstash hosts
    hosts: ["209.97.151.135:5044"]
    bulk_max_size: 1024

    # Optional TLS. By default is off.
    tls:
      # List of root certificates for HTTPS server verifications
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]

shipper:

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB
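If it helps with troubleshooting, I can also run Filebeat in the foreground on the Magento droplet with debug output (this is the Filebeat 1.x package the tutorial installs; flags taken from its command-line help):

```shell
# Stop the background service first, then run Filebeat in the
# foreground (-e logs to stderr) with publish-level debugging
# so each event it tries to ship is printed.
sudo service filebeat stop
sudo filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"
```

I can paste the debug output here if that would help narrow things down.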
30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
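My understanding of the index setting above is that Logstash writes each day's events to a date-stamped index, which the filebeat-* wildcard in my curl query should match. As a sanity check on the pattern (just an illustration, not output from my server):

```shell
# The name Logstash would build from %{[@metadata][beat]}-%{+YYYY.MM.dd}
# for events shipped by Filebeat today.
idx="filebeat-$(date +%Y.%m.%d)"
echo "$idx"
```

So if any events had made it through, I'd expect at least one index of that shape to exist.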
10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
02-beats-input.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
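I haven't ruled out a TLS problem between the two droplets. From the Magento droplet I can test the handshake against the Beats input directly, using the same CA certificate I copied over for Filebeat:

```shell
# Connect to the Logstash Beats input from the Filebeat client and
# verify the certificate chain against the copied CA cert; a
# "Verify return code: 0 (ok)" line would indicate the TLS side is fine.
openssl s_client -connect 209.97.151.135:5044 \
  -CAfile /etc/pki/tls/certs/logstash-forwarder.crt </dev/null
```

I can share that output too if the TLS setup looks like a likely culprit.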
Please let me know if I can provide any additional information or details.
Thanks so much,