Hi everyone,
I'm running Logstash on AWS ECS (Docker) and using Filebeat 5.6.1 to ship logs to Logstash, which outputs them to Graylog. No data reaches Graylog, and I'm getting the following in the Filebeat and Logstash logs.
The Filebeat log looks like this:
2017-09-25T17:01:07+03:00 INFO No non-zero metrics in the last 30s
2017-09-25T17:01:14+03:00 ERR Failed to publish events caused by: EOF
2017-09-25T17:01:14+03:00 INFO Error publishing events (retrying): EOF
2017-09-25T17:01:37+03:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=294 libbeat.logstash.published_but_not_acked_events=1918
2017-09-25T17:02:07+03:00 INFO No non-zero metrics in the last 30s
2017-09-25T17:02:14+03:00 ERR Failed to publish events caused by: EOF
2017-09-25T17:02:14+03:00 INFO Error publishing events (retrying): EOF
2017-09-25T17:02:37+03:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=294 libbeat.logstash.published_but_not_acked_events=1918
2017-09-25T17:03:07+03:00 INFO No non-zero metrics in the last 30s
2017-09-25T17:03:14+03:00 ERR Failed to publish events caused by: EOF
2017-09-25T17:03:14+03:00 INFO Error publishing events (retrying): EOF
2017-09-25T17:03:37+03:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=294 libbeat.logstash.published_but_not_acked_events=1918
2017-09-25T17:04:07+03:00 INFO No non-zero metrics in the last 30s
2017-09-25T17:04:14+03:00 ERR Failed to publish events caused by: EOF
2017-09-25T17:04:14+03:00 INFO Error publishing events (retrying): EOF
2017-09-25T17:04:37+03:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=294 libbeat.logstash.published_but_not_acked_events=1918
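If I read this right, the EOF together with published_but_not_acked_events=1918 means the connection is being closed before Logstash acknowledges the batch, so Filebeat keeps retrying the same 1918 events. Before anything else I want to rule out basic connectivity, so I plan to test the port from the Filebeat host (assuming nc is installed there):
nc -vz mylogstash-lb.us-west-2.elb.amazonaws.com 5044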
The Logstash log:
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
13:50:53.756 [main] INFO logstash.modules.scaffold - Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
13:50:53.759 [main] INFO logstash.modules.scaffold - Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
13:50:53.782 [main] INFO logstash.modules.scaffold - Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/x-pack-5.6.1-java/modules/arcsight/configuration"}
13:50:53.796 [main] INFO logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
13:50:53.797 [main] INFO logstash.setting.writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
13:50:53.842 [LogStash::Runner] INFO logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"45f3944a-414a-44a5-8be8-b163916316d5", :path=>"/usr/share/logstash/data/uuid"}
13:50:54.161 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
13:50:54.699 [[main]-pipeline-manager] INFO logstash.inputs.beats - Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
13:50:54.743 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
13:50:54.772 [[main]<beats] INFO org.logstash.beats.Server - Starting server on port: 5044
13:50:54.821 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
13:50:57.004 [Ruby-0-Thread-15: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/puma-2.16.0-java/lib/puma/thread_pool.rb:61] DEBUG logstash.agent - API HTTP Request {:status=>200, :request_method=>"PUT", :path_info=>"/_node/logging", :query_string=>"pretty", :http_version=>"HTTP/1.1", :http_accept=>"*/*"}
13:50:58.875 [pool-2-thread-1] DEBUG logstash.instrument.periodicpoller.cgroup - Error, cannot retrieve cgroups information {:exception=>"Errno::ENOENT", :message=>"No such file or directory - /sys/fs/cgroup/cpuacct/docker/8a0c0d352bdf7602dd960b1bb936cc2532db0c4155442f5df158473de02b413c/cpuacct.usage"}
13:50:59.756 [Ruby-0-Thread-8: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:561] DEBUG logstash.pipeline - Pushing flush onto pipeline
13:51:03.880 [pool-2-thread-2] DEBUG logstash.instrument.periodicpoller.cgroup - Error, cannot retrieve cgroups information {:exception=>"Errno::ENOENT", :message=>"No such file or directory - /sys/fs/cgroup/cpuacct/docker/8a0c0d352bdf7602dd960b1bb936cc2532db0c4155442f5df158473de02b413c/cpuacct.usage"}
13:51:04.757 [Ruby-0-Thread-8: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:561] DEBUG logstash.pipeline - Pushing flush onto pipeline
13:51:08.883 [pool-2-thread-1] DEBUG logstash.instrument.periodicpoller.cgroup - Error, cannot retrieve cgroups information {:exception=>"Errno::ENOENT", :message=>"No such file or directory - /sys/fs/cgroup/cpuacct/docker/8a0c0d352bdf7602dd960b1bb936cc2532db0c4155442f5df158473de02b413c/cpuacct.usage"}
13:51:09.757 [Ruby-0-Thread-8: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:561] DEBUG logstash.pipeline - Pushing flush onto pipeline
Pipeline file logstash.conf:
input {
  beats {
    port => 5044
  }
}
output {
  gelf {
    host => "graylog.mydomain.com"
  }
}
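To see whether events reach Logstash at all, I'm thinking of temporarily adding a stdout output next to gelf, and also raising the beats input's client_inactivity_timeout in case the ELB (default 60s idle timeout) or Logstash is closing idle connections. A sketch; the 300s value is my guess, everything else is unchanged from above:
input {
  beats {
    port => 5044
    # guess: keep idle Beats connections open longer than the
    # ELB's default 60s idle timeout
    client_inactivity_timeout => 300
  }
}
output {
  # temporary, for debugging: dump every event to the container log
  stdout { codec => rubydebug }
  gelf {
    host => "graylog.mydomain.com"
  }
}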
Logstash config file logstash.yml:
http.host: "0.0.0.0"
path.config: /usr/share/logstash/pipeline
xpack.monitoring.enabled: false
Filebeat config file filebeat.yml:
filebeat:
  prospectors:
    -
      paths:
        - /home/tomcat/WEB-INF/logs/tomcat.log
      input_type: log
output:
  logstash:
    hosts: ["mylogstash-lb.us-west-2.elb.amazonaws.com:5044"]
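I also plan to turn up Filebeat's own logging to get more detail than the bare EOF (a sketch of the 5.x logging section, appended to filebeat.yml):
logging:
  level: debug
  selectors: ["logstash"]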
Dockerfile:
FROM docker.elastic.co/logstash/logstash:5.6.1
RUN rm -f /usr/share/logstash/pipeline/logstash.conf
ADD pipeline/ /usr/share/logstash/pipeline/
ADD config/ /usr/share/logstash/config/
RUN /usr/share/logstash/bin/logstash-plugin install logstash-input-beats
RUN /usr/share/logstash/bin/logstash-plugin install logstash-output-gelf
RUN /usr/share/logstash/bin/logstash-plugin install logstash-filter-grok
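To double-check that the plugins actually made it into the image, I list them from the built image like this (a sketch; my-logstash-image is a placeholder for my image tag):
docker run --rm my-logstash-image \
  /usr/share/logstash/bin/logstash-plugin list | grep -E 'beats|gelf'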
Thanks for your help.