(both were closed)
I followed the process to solve it. According to netstat, my Logstash was attaching to tcp6, but after I set -Djava.net.preferIPv4Stack=true it now attaches properly:

```
netstat -plnt
...
tcp   0   0 127.0.0.1:9600   0.0.0.0:*   LISTEN   27215/java
tcp   0   0 0.0.0.0:5043     0.0.0.0:*   LISTEN   27912/java
```
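For readers hitting the same tcp6 binding, this is roughly how that flag can be passed (a sketch, assuming the Logstash 5.x tarball layout, where the LS_JAVA_OPTS environment variable is appended to the JVM options):

```shell
# Sketch: make the JVM prefer IPv4 before starting Logstash.
# LS_JAVA_OPTS is appended to the options read from config/jvm.options.
export LS_JAVA_OPTS="-Djava.net.preferIPv4Stack=true"

# then start Logstash as usual (not run here):
# bin/logstash -f first-pipeline.conf
```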
But I'm still getting the same error from Filebeat (and nothing in the Logstash logs). What else can I look into?
My config (same as the Getting Started guide):

first-pipeline.conf:

```
input {
    beats {
        port => "5043"
        host => "0.0.0.0"
    }
}
output {
    stdout { codec => rubydebug }
}
```
filebeat.yml:

```
filebeat.prospectors:
- input_type: log
  paths:
    - /home/centos/elkdata/logstash-tutorial.log
output.logstash:
  hosts: ["localhost:5043"]
```
Filebeat and Logstash are both version 5.3.0.

My environment:

```
Distributor ID: CentOS
Description:    CentOS Linux release 7.1.1503 (Core)
Release:        7.1.1503
Codename:       Core
```
Hi @ruflin, thanks for your help.
My setup is exactly as in the Getting Started guide (I'm a newbie with Filebeat); the only change was that localhost entry in first-pipeline.conf, because that was suggested in the first post I mentioned.
Without it the problem still appears. What else can I give you about my setup?
The connection has been established, but is being closed by Logstash. This is basically what the error message indicates.
Logstash 5.3.0 ships with logstash-input-beats 3.1.12. Upgrading Logstash to a new version (or just the plugin) might help. You can also try to increase the client_inactivity_timeout to a very very large number.
From the Changelog, I would recommend at least logstash-input-beats version 3.1.14.
Checking Logstash 5.3.2 release, it ships with plugin version 3.1.15.
The documentation says the number is in seconds, so I guess this is very, very large...
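Setting it in the beats input would look roughly like this (a sketch based on this thread's pipeline config; the 86400-second value, one day, is an arbitrary illustrative choice, not a recommendation):

```
input {
    beats {
        port => "5043"
        host => "0.0.0.0"
        client_inactivity_timeout => 86400
    }
}
```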
The start log is:

```
[centos@ip-10-11-80-98 logstash-5.3.2]$ bin/logstash -f first-pipeline.conf --config.reload.automatic
Sending Logstash's logs to /opt/logstash-5.3.2/logs which is now configured via log4j2.properties
[2017-05-10T12:25:58,946][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[2017-05-10T12:25:59,339][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
[2017-05-10T12:25:59,387][INFO ][logstash.pipeline        ] Pipeline main started
[2017-05-10T12:25:59,468][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
```
Can you try with Filebeat and Logstash debug enabled and share the complete logs?
The timestamps of the connection resets perfectly match Beats' exponential backoff when a connection is lost. But from the snippet you posted I cannot tell when Filebeat tried to connect. It looks like the connection is either closed immediately or never fully established.
Have you checked with netstat (I think CentOS uses ss) for any kind of active connection?
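A sketch of that check, assuming the beats port 5043 from the pipeline config above (the `|| true` just keeps the commands from returning a failure status when nothing matches):

```shell
# Listening socket on the beats port (should show Logstash's java process):
ss -tlnp 2>/dev/null | grep 5043 || true

# Any established connections on that port (Filebeat -> Logstash):
ss -tnp 2>/dev/null | grep 5043 || true
```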
Thanks so much for your help, it's working now.
Root cause: the interface between the chair and the screen!
Explanation: I went ahead and downloaded the latest version (5.4.0), and it worked immediately. Curious to know why, I diffed the config files against the previous version, and oops: the port-5043 config in filebeat.yml was pointing at Elasticsearch (port 5043) instead of at Logstash!
I did set it up as the guide says (and as posted here), but in 5.3.0 the permissions were wrong for the filebeat user
(issue: "Filebeat.yml must be owned by the beat user (uid=0) or root").
When I fixed that, I already had the file open in another Sublime Text window, so it got silently overwritten later, and I guess I somehow missed Sublime's notification about the refresh...
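For anyone else who ends up with the same silent overwrite: make sure only the Logstash output is enabled in filebeat.yml (a sketch of the correct vs. broken state; the hosts values follow this thread's setup, and the broken variant is my assumption about what the overwritten file contained):

```
# Correct (per the guide): ship events to Logstash on 5043.
output.logstash:
  hosts: ["localhost:5043"]

# Wrong (presumably the overwritten file): the Elasticsearch
# output enabled and pointed at the beats port.
#output.elasticsearch:
#  hosts: ["localhost:5043"]
```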
Now in 5.4.0 the permissions are OK straight out of the tar.gz, and I have the latest version, yay!
Apologies for the time wasted, and thanks a lot for your support!