Hello, I'm new to the Elastic Stack and am currently trying to build a logging system with the pipeline Filebeat -> Logstash -> Elasticsearch -> Kibana.
I'm using a separate AWS EC2 instance for Logstash, Elasticsearch, and Kibana, while Filebeat runs on my web server instance; I want to ship logs from the web server instance to Logstash via Filebeat.
On the ELK instance I've installed Logstash, Elasticsearch, and Kibana, and I serve the Kibana web interface behind an nginx proxy. The problem I ran into is that Filebeat cannot send logs to Logstash.
My Logstash configuration file, /etc/logstash/conf.d/billing.conf, is pretty simple:
input {
  beats {
    port => "5044"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
}
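In case it matters, this is roughly how I check connectivity and the pipeline syntax (the nc check runs from the web server instance; the Logstash paths below are the standard RPM/DEB locations and may differ on other installs):
# From the web server instance: confirm the Beats port on the ELK instance is reachable
nc -vz 52.212.142.105 5044
# On the ELK instance: parse-check the pipeline without starting Logstash
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/billing.conf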
All required ports are open and listening on the ELK instance:
rpcbind 2743 rpc 8u IPv4 16613 0t0 TCP *:111 (LISTEN)
rpcbind 2743 rpc 11u IPv6 16616 0t0 TCP *:111 (LISTEN)
ruby 2775 healthd 6u IPv4 17290 0t0 TCP 127.0.0.1:22221 (LISTEN)
node 2777 kibana 18u IPv4 26074 0t0 TCP 127.0.0.1:5601 (LISTEN)
master 3242 root 13u IPv4 19900 0t0 TCP 127.0.0.1:25 (LISTEN)
java 3291 elasticsearch 253u IPv6 22037 0t0 TCP 127.0.0.1:9300 (LISTEN)
java 3291 elasticsearch 267u IPv6 22925 0t0 TCP 127.0.0.1:9200 (LISTEN)
sshd 3442 root 3u IPv4 20965 0t0 TCP *:22 (LISTEN)
sshd 3442 root 4u IPv6 20967 0t0 TCP *:22 (LISTEN)
java 4082 webapp 5u IPv6 25824 0t0 TCP *:5000 (LISTEN)
java 5999 logstash 95u IPv6 39027 0t0 TCP *:5044 (LISTEN)
java 5999 logstash 98u IPv6 39035 0t0 TCP 127.0.0.1:9600 (LISTEN)
nginx 6652 root 6u IPv4 44256 0t0 TCP *:80 (LISTEN)
nginx 6653 nginx 6u IPv4 44256 0t0 TCP *:80 (LISTEN)
nginx 6655 nginx 6u IPv4 44256 0t0 TCP *:80 (LISTEN)
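(That listing comes from something like the following on the ELK instance; the exact lsof flags may vary:)
sudo lsof -nP -iTCP -sTCP:LISTEN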
Security groups are configured to allow inbound connections on port 5044.
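For completeness, the inbound rule can be double-checked with the AWS CLI (sg-xxxxxxxx below is only a placeholder for the ELK instance's security group id):
# List the inbound rules of the ELK instance's security group and look for port 5044
aws ec2 describe-security-groups --group-ids sg-xxxxxxxx --query 'SecurityGroups[].IpPermissions'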
On my web server instance I have installed Filebeat and configured it like this:
name: "billing"
filebeat.modules:
- module: apache2
access:
enabled: true
error:
enabled: true
filebeat.inputs:
- type: log
enabled: true
tags: ["billing-requests"]
json.keys_under_root: true
fields_under_root: false
paths:
- /var/app/current/logs/requests/*.log
- type: log
enabled: true
tags: ["eb-activity"]
multiline.pattern: '^\['
multiline.negate: true
multiline.match: after
paths:
- /var/log/eb-activity.log
- type: log
enabled: true
tags: ["php-cli-errors"]
paths:
- /var/log/php_errors.log
output.logstash:
hosts: [52.212.142.105:5044"]
processors:
- add_host_metadata: ~`
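Filebeat's built-in test commands should reproduce the problem; this is roughly what I run on the web server instance (the config path assumes the default package layout):
# Validate the Filebeat configuration file
filebeat test config -c /etc/filebeat/filebeat.yml
# Try to connect to the configured Logstash output
filebeat test output -c /etc/filebeat/filebeat.yml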
This configuration matches every piece of documentation I could find; it all looks simple, and I've spent a lot of time trying different options, but nothing has worked.
In the Filebeat logs I can see the following errors:
2020-07-14T12:09:38.530Z ERROR [publisher_pipeline_output] pipeline/output.go:155 Failed to connect to backoff(async(tcp://52.212.142.105:5044")): dial tcp: address tcp/5044": unknown port
2020-07-14T12:09:38.530Z INFO [publisher_pipeline_output] pipeline/output.go:146 Attempting to reconnect to backoff(async(tcp://52.212.142.105:5044")) with 6 reconnect attempt(s)
2020-07-14T12:09:38.531Z INFO [publisher] pipeline/retry.go:221 retryer: send unwait signal to consumer
2020-07-14T12:09:38.531Z INFO [publisher] pipeline/retry.go:225 done
2020-07-14T12:09:39.808Z INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":70},"total":{"ticks":440,"time":{"ms":6},"value":440},"user":{"ticks":370,"time":{"ms":6}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":13},"info":{"ephemeral_id":"c7b3ab10-e062-4ff5-ba41-c14c0a79cc6d","uptime":{"ms":90047}},"memstats":{"gc_next":51647968,"memory_alloc":26992816,"memory_total":95533872},"runtime":{"goroutines":67}},"filebeat":{"harvester":{"files":{"b411598c-486c-472e-9712-3750b5997caf":{"size":663}},"open_files":5,"running":5}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":5,"events":{"active":4120,"retry":2048}}},"registrar":{"states":{"current":8}},"system":{"load":{"1":0.14,"15":0.03,"5":0.08,"norm":{"1":0.07,"15":0.015,"5":0.04}}}}}}
2020-07-14T12:10:09.808Z INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":70,"time":{"ms":1}},"total":{"ticks":440,"time":{"ms":4},"value":440},"user":{"ticks":370,"time":{"ms":3}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":13},"info":{"ephemeral_id":"c7b3ab10-e062-4ff5-ba41-c14c0a79cc6d","uptime":{"ms":120047}},"memstats":{"gc_next":51647968,"memory_alloc":27304320,"memory_total":95845376},"runtime":{"goroutines":67}},"filebeat":{"harvester":{"files":{"b411598c-486c-472e-9712-3750b5997caf":{"size":663}},"open_files":5,"running":5}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":5,"events":{"active":4120}}},"registrar":{"states":{"current":8}},"system":{"load":{"1":0.08,"15":0.03,"5":0.08,"norm":{"1":0.04,"15":0.015,"5":0.04}}}}}}