After reinstalling all components, the logs are now written into ES by Logstash, but the error message still appears. I'm not sure whether it is safe to ignore.
----------------- EDIT -----------------------
Hi,
I am running version 5.0 of Filebeat, Logstash, and Elasticsearch on a single server. When I configure the Filebeat output to go directly to ES, it works, but after I change the output to Logstash, it stops working.
The OS is Ubuntu 14.04.3 LTS. Logs and configs are as follows:
/var/log/logstash/logstash-plain.log
[2016-11-01T14:53:02,685][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-11-01T14:53:02,812][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-11-01T14:53:03,025][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://192.168.199.9:9200"]}}
[2016-11-01T14:53:03,026][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["192.168.199.9:9200"]}
[2016-11-01T14:53:03,028][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-11-01T14:53:03,039][INFO ][logstash.pipeline ] Pipeline main started
[2016-11-01T14:53:03,067][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
/var/log/filebeat/filebeat
2016-11-01T14:56:31+08:00 INFO Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2016-11-01T14:56:31+08:00 INFO Setup Beat: filebeat; Version: 5.0.0
2016-11-01T14:56:31+08:00 INFO Max Retries set to: 3
2016-11-01T14:56:31+08:00 INFO Activated logstash as output plugin.
2016-11-01T14:56:31+08:00 INFO Publisher name: fd-dev
2016-11-01T14:56:31+08:00 INFO Flush Interval set to: 1s
2016-11-01T14:56:31+08:00 INFO Max Bulk Size set to: 2048
2016-11-01T14:56:31+08:00 INFO filebeat start running.
2016-11-01T14:56:31+08:00 INFO Registry file set to: /var/lib/filebeat/registry
2016-11-01T14:56:31+08:00 INFO Loading registrar data from /var/lib/filebeat/registry
2016-11-01T14:56:31+08:00 INFO States Loaded from registrar: 5
2016-11-01T14:56:31+08:00 INFO Loading Prospectors: 1
2016-11-01T14:56:31+08:00 INFO Start sending events to output
2016-11-01T14:56:31+08:00 INFO Load previous states from registry into memory
2016-11-01T14:56:31+08:00 INFO Starting Registrar
2016-11-01T14:56:31+08:00 INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016-11-01T14:56:31+08:00 INFO Previous states loaded: 5
2016-11-01T14:56:31+08:00 INFO Loading Prospectors completed. Number of prospectors: 1
2016-11-01T14:56:31+08:00 INFO All prospectors are initialised and running with 5 states to persist
2016-11-01T14:56:31+08:00 INFO Starting prospector of type: log
2016-11-01T14:56:31+08:00 INFO Harvester started for file: /fd/gameservice2/logs/gs.log
2016-11-01T14:57:01+08:00 INFO Non-zero metrics in the last 30s: libbeat.publisher.published_events=208 filebeat.harvester.running=1 publish.events=214 libbeat.logstash.published_and_acked_events=208 registrar.writes=3 libbeat.logstash.publish.read_bytes=48 libbeat.logstash.publish.write_bytes=16003 filebeat.harvester.open_files=1 registrar.states.update=214 libbeat.logstash.call_count.PublishEvents=3 registar.states.current=5 filebeat.harvester.started=1
2016-11-01T14:57:31+08:00 INFO Non-zero metrics in the last 30s: libbeat.publisher.published_events=229 libbeat.logstash.publish.read_bytes=24 registrar.writes=3 publish.events=229 registrar.states.update=229 libbeat.logstash.call_count.PublishEvents=3 libbeat.logstash.publish.write_bytes=9747 libbeat.logstash.published_and_acked_events=229
2016-11-01T14:57:51+08:00 ERR Failed to publish events caused by: EOF
2016-11-01T14:57:51+08:00 INFO Error publishing events (retrying): EOF
/etc/filebeat/filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - /fd/gameservice2/logs/gs.log*
output.logstash:
  hosts: ["localhost:5044"]
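Since the Filebeat log later shows "Failed to publish events caused by: EOF", a first sanity check is whether anything is accepting TCP connections on the configured beats port at all. Below is a minimal Python sketch of such a check (my own illustration, not part of either tool; the host/port values just mirror the filebeat.yml above):

```python
import socket

def beats_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds.

    Note: an EOF in the Filebeat log usually means the connection was
    accepted and then closed by the peer, so this only rules out the
    simplest failure mode (nothing listening on the port).
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the beats input configured in filebeat.yml.
# beats_port_open("localhost", 5044)
```

If this returns False, Logstash's beats input isn't listening; if it returns True and the EOF persists, the problem is more likely inside the pipeline or a version mismatch between the beats input plugin and Filebeat.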
/etc/logstash/conf.d/beats.conf
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => "192.168.199.9:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
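For anyone checking the index setting above: the sprintf reference `%{[@metadata][beat]}` resolves to the beat name Filebeat attaches to each event, and `%{+YYYY.MM.dd}` is a Joda-style date pattern applied to the event timestamp. A rough Python sketch of how that index name would resolve (the metadata values here are assumed for illustration):

```python
from datetime import datetime, timezone

# Hypothetical event metadata, as Filebeat would attach it.
metadata = {"beat": "filebeat", "type": "log"}

# Logstash's %{+YYYY.MM.dd} corresponds to strftime's %Y.%m.%d.
timestamp = datetime(2016, 11, 1, tzinfo=timezone.utc)
index = "{}-{}".format(metadata["beat"], timestamp.strftime("%Y.%m.%d"))
print(index)  # filebeat-2016.11.01
```

So events arriving on 2016-11-01 would land in an index named filebeat-2016.11.01, matching the default Filebeat index pattern in Kibana.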
Did I miss anything? Thanks in advance.