Filebeat not able to send logs to a different cloud


(V Sai Ram) #1

Hi Team,

I have added a new node and installed Filebeat on it. It needs to send the logs from the node to a Logstash server installed in another cloud. I am not able to see the logs in the Kibana GUI.

Here is the straightforward filebeat.yml on the new node:

    filebeat:
      prospectors:
        -
          paths:
            - /ephemeral/logs/job-server/*.log
          input_type: log
          document_type: log
          scan_frequency: 10s
      registry_file: /var/lib/filebeat/registry
    output:
      logstash:
        hosts: ["10.157.6.241:5044"]
        bulk_max_size: 1024
    shipper:
    logging:
      files:
        rotateeverybytes: 10485760

With this filebeat.yml in place, here are the debug logs for reference:

2016/09/19 11:29:57.083112 client.go:100: DBG connect
2016/09/19 11:29:57.083925 client.go:146: DBG Try to publish 1024 events to logstash with window size 5
2016/09/19 11:29:57.086134 client.go:105: DBG close connection
2016/09/19 11:29:57.086208 client.go:124: DBG 0 events out of 1024 events sent to logstash. Continue sending ...
2016/09/19 11:29:57.086225 single.go:77: INFO Error publishing events (retrying): EOF
2016/09/19 11:29:57.086240 single.go:154: INFO send fail
2016/09/19 11:29:57.086249 single.go:161: INFO backoff retry: 2s
2016/09/19 11:29:59.086804 client.go:100: DBG connect
2016/09/19 11:29:59.087764 client.go:146: DBG Try to publish 1024 events to logstash with window size 2
2016/09/19 11:29:59.093366 client.go:105: DBG close connection
2016/09/19 11:29:59.093555 client.go:124: DBG 0 events out of 1024 events sent to logstash. Continue sending ...
2016/09/19 11:29:59.093905 single.go:77: INFO Error publishing events (retrying): EOF
2016/09/19 11:29:59.093916 single.go:154: INFO send fail
2016/09/19 11:29:59.093927 single.go:161: INFO backoff retry: 4s
2016/09/19 11:30:03.094124 client.go:100: DBG connect
2016/09/19 11:30:03.094933 client.go:146: DBG Try to publish 1024 events to logstash with window size 1
2016/09/19 11:30:03.097109 client.go:105: DBG close connection
2016/09/19 11:30:03.097158 client.go:124: DBG 0 events out of 1024 events sent to logstash. Continue sending ...
2016/09/19 11:30:03.097172 single.go:77: INFO Error publishing events (retrying): EOF
2016/09/19 11:30:03.097179 single.go:154: INFO send fail
2016/09/19 11:30:03.097187 single.go:161: INFO backoff retry: 8s
2016/09/19 11:30:05.257910 prospector.go:185: DBG Start next scan
2016/09/19 11:30:05.257963 prospector.go:261: DBG scan path /ephemeral/log/job-server/*.log
2016/09/19 11:30:05.258303 prospector.go:275: DBG Check file for harvesting: /ephemeral/log/job-server/spark-job-server.log

telnet 10.157.6.241 5044 is able to connect; telnet is working.

Let me know if anything else is required from my side. Please help me out here.


(Steffen Siering) #2

Can you add your Logstash input configs?

Please format configuration files + logs using the </> button.


(V Sai Ram) #3
[root@coord-1 conf.d]# cat beats-input.conf 
    input {
        beats {
            port => 5044
        }
    }
[root@coord-1 conf.d]#
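If the beats input ever has TLS enabled (for example via another file in the same conf.d directory), a plain-TCP Filebeat connection will fail. As a debugging sketch, not the poster's actual config, the input can be started with SSL explicitly disabled while testing:

```
input {
  beats {
    port => 5044
    # explicitly disable TLS while testing plain-TCP delivery
    ssl  => false
  }
}
```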

(Steffen Siering) #4

Have you tested telnet from the same machine as filebeat?

Do you have a more complete filebeat log?

Is filebeat ever sending any events? That is, does the EOF occur only from time to time?

Does the filebeat config shown in your post match your actual config file? The indentation in the post looked completely off.

Anything in the Logstash logs about connections being closed or the circuit breaker?
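For reference, a more complete filebeat log can be captured by enabling file logging at debug level in filebeat.yml. A sketch for the filebeat 1.x config shown above; the log path is an assumption, any writable directory works:

```yaml
logging:
  level: debug               # verbose output, including publish/connect messages
  to_files: true
  files:
    path: /var/log/filebeat  # assumed location
    name: filebeat.log
    rotateeverybytes: 10485760
```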


(V Sai Ram) #5
  1. Yes, I have tested from the same machine as filebeat:
    [root@spark-analytics-1 ~]# telnet 10.157.6.241 5044
    Trying 10.157.6.241...
    Connected to 10.157.6.241.
    Escape character is '^]'.

  2. Logs are given below:
    2016/09/21 06:16:45.822610 client.go:146: DBG Try to publish 1024 events to logstash with window size 2
    2016/09/21 06:16:54.954581 prospector.go:185: DBG Start next scan
    2016/09/21 06:16:54.954641 prospector.go:261: DBG scan path /ephemeral/log/job-server/*.log
    2016/09/21 06:16:54.954958 prospector.go:275: DBG Check file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:16:54.954991 registrar.go:175: DBG Same file as before found. Fetch the state.
    2016/09/21 06:16:54.955022 prospector.go:418: DBG Update existing file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:16:54.955080 prospector.go:465: DBG Not harvesting, file didn't change: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:04.955308 prospector.go:185: DBG Start next scan
    2016/09/21 06:17:04.955384 prospector.go:261: DBG scan path /ephemeral/log/job-server/*.log
    2016/09/21 06:17:04.955740 prospector.go:275: DBG Check file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:04.955774 registrar.go:175: DBG Same file as before found. Fetch the state.
    2016/09/21 06:17:04.955801 prospector.go:418: DBG Update existing file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:04.955818 prospector.go:465: DBG Not harvesting, file didn't change: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:14.956026 prospector.go:185: DBG Start next scan
    2016/09/21 06:17:14.956108 prospector.go:261: DBG scan path /ephemeral/log/job-server/*.log
    2016/09/21 06:17:14.956439 prospector.go:275: DBG Check file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:14.956467 registrar.go:175: DBG Same file as before found. Fetch the state.
    2016/09/21 06:17:14.956481 prospector.go:418: DBG Update existing file for harvesting: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:14.956491 prospector.go:465: DBG Not harvesting, file didn't change: /ephemeral/log/job-server/spark-job-server.log
    2016/09/21 06:17:15.824393 client.go:105: DBG close connection
    2016/09/21 06:17:15.824508 client.go:124: DBG 0 events out of 1024 events sent to logstash. Continue sending ...
    2016/09/21 06:17:15.824558 single.go:77: INFO Error publishing events (retrying): read tcp 10.254.73.54:35622->10.157.6.241:5044: i/o timeout
    2016/09/21 06:17:15.824576 single.go:154: INFO send fail
    2016/09/21 06:17:15.824586 single.go:161: INFO backoff retry: 1s
    2016/09/21 06:17:16.824912 client.go:100: DBG connect
    2016/09/21 06:17:16.825800 client.go:146: DBG Try to publish 1024 events to logstash with window size 1


(V Sai Ram) #6

Continuing with the other questions:

  1. The filebeat config is correct, I guess, since I am able to start the filebeat service.

  2. Logstash logs (/var/log/logstash on the Logstash server): I am not seeing any logs there.


(ruflin) #7

Could you run Logstash with the --debug flag to get more verbose logging? https://www.elastic.co/guide/en/logstash/current/command-line-flags.html


(V Sai Ram) #8
[root@coord-1 ~]# systemctl status logstash.service 
● logstash.service - LSB: Starts Logstash as a daemon.
   Loaded: loaded (/etc/rc.d/init.d/logstash)
   Active: active (running) since Wed 2016-09-21 10:38:37 EEST; 6s ago
     Docs: man:systemd-sysv-generator(8)
  Process: 8392 ExecStop=/etc/rc.d/init.d/logstash stop (code=exited, status=0/SUCCESS)
  Process: 8414 ExecStart=/etc/rc.d/init.d/logstash start (code=exited, status=0/SUCCESS)
   CGroup: /system.slice/logstash.service
           └─8421 /bin/java -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Djava.awt.headless=true -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Dj...

I am running Logstash as a service. How do I pass options such as --debug? I am not seeing any /usr/bin/logstash executable.


(ruflin) #9

Could you check under /usr/share/logstash/bin/ ?


(V Sai Ram) #10
[root@coord-1 bin]# ./logstash -f /etc/logstash/ --debug
Reading config file {:file=>"logstash/agent.rb", :level=>:debug, :line=>"370", :method=>"local_config"}
Reading config file {:file=>"logstash/agent.rb", :level=>:debug, :line=>"370", :method=>"local_config"}
Reading config file {:file=>"logstash/agent.rb", :level=>:debug, :line=>"370", :method=>"local_config"}
Reading config file {:file=>"logstash/agent.rb", :level=>:debug, :line=>"370", :method=>"local_config"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"beats", :path=>"logstash/inputs/beats", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
Plugin not defined in namespace, checking for plugin file {:type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@port = 5044 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@ssl = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@ssl_certificate = "/etc/pki/tls/certs/logstash-forwarder.crt" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@ssl_key = "/etc/pki/tls/private/logstash-forwarder.key" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@add_field = {} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@host = "0.0.0.0" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@ssl_certificate_authorities = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@ssl_verify_mode = "none" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@congestion_threshold = 5 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Inputs::Beats/@target_field_for_codec = "message" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
Plugin not defined in namespace, checking for plugin file {:type=>"filter", :name=>"grok", :path=>"logstash/filters/grok", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
config LogStash::Filters::Grok/@match = {"message"=>"%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\\[%{POSINT:syslog_pid}\\])?: %{GREEDYDATA:syslog_message}"} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@add_field = {"received_at"=>"%{@timestamp}", "received_from"=>"%{host}"} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@add_tag = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@remove_tag = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@remove_field = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@periodic_flush = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@patterns_dir = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@patterns_files_glob = "*" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@break_on_match = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@named_captures_only = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
config LogStash::Filters::Grok/@keep_empty_captures = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}

(ruflin) #11

Do you have the beats input plugin installed in Logstash? Which Logstash version are you using?
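Also worth checking: the --debug output above shows `ssl => true` with a certificate and key configured on the beats input, while the filebeat.yml posted earlier has no TLS section at all. A mismatch like that can produce exactly these EOF / i/o timeout errors. If SSL is to stay enabled on Logstash, the Filebeat output (1.x syntax) would need something like the following sketch; the CA path is taken from the Logstash debug output and may differ on the Filebeat node:

```yaml
output:
  logstash:
    hosts: ["10.157.6.241:5044"]
    tls:
      # CA that signed the Logstash certificate; copy it to the Filebeat node
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
```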


(Jerson Luiz de Paula Júnior) #12

I have the same error:

Elasticsearch 5.0.0-beta1
Logstash 5.0.0-beta1
Filebeat 5.0.0-beta1

2016-09-23T21:26:18Z ERR Connecting error publishing events (retrying): read tcp 192.168.1.102:42498->192.168.1.51:5044: i/o timeout


(Steffen Siering) #13

@vsairam can you provide the Logstash logs from after the error happened? These are basically startup logs.

@jersonjunior can you please start another discussion? Just from seeing a timeout error I cannot tell whether your problem is really the same or a different one. We will update this topic with the solution if it turns out to be the same problem and we manage to solve it in one or the other discussion.


(V Sai Ram) #14

Hi, I moved the new instance node into the same cluster to get it working for now. But I will definitely retry the setup and collect the logs you have requested. Please give me a couple of days; I shall get back to you with the logs.

Thanks a lot.


(system) #15

This topic was automatically closed after 21 days. New replies are no longer allowed.