Hi team,
Please help urgently, as we are doing a production deployment this week. I am getting errors in the Filebeat logs, and data is not being read and sent to Logstash. My data flow is filebeat -> logstash -> elasticsearch. The error is:
ERR Failed to publish events caused by: write tcp [::1]:60383->[::1]:5044: wsasend: An established connection was aborted by the software in your host machine.
Here is my filebeat conf -
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - D:/Debashree/TechOffice/Software/Logs/test*.logs
  document_type: testing
  fields:
    server: localhost
  ignore_older: 10m
  harvester_buffer_size: 16384
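The output section of filebeat.yml is not pasted here; it just points Filebeat at the local Logstash beats input, roughly the sketch below (the host string is my assumption; the port matches the 5044 seen in the errors):

output.logstash:
  # local Logstash beats input - host assumed, port taken from the error above
  hosts: ["localhost:5044"]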
Here is my Logstash conf file -
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "filebeattest"
  }
}
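To rule out the beats input not actually listening on 5044, I check from the Windows host with a standard netstat lookup (nothing Filebeat-specific here):

netstat -ano | findstr :5044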
So when multiple log files are updated, this is what I see -
2017/06/15 19:33:16.834650 sync.go:85: ERR Failed to publish events caused by: read tcp 127.0.0.1:61443->127.0.0.1:5044: wsarecv: An established connection was aborted by the software in your host machine.
2017/06/15 19:33:16.835653 single.go:91: INFO Error publishing events (retrying): read tcp 127.0.0.1:61443->127.0.0.1:5044: wsarecv: An established connection was aborted by the software in your host machine.
2017/06/15 19:33:36.565050 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=2 libbeat.logstash.publish.read_bytes=6 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=490 libbeat.logstash.published_and_acked_events=1 libbeat.logstash.published_but_not_acked_events=1 libbeat.publisher.published_events=1 publish.events=1 registrar.states.update=1 registrar.writes=1
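For reference, to see whether any events actually reach Elasticsearch despite these errors (the metrics above show one event acked and one not acked), I check the document count of the index directly - standard Elasticsearch _count API, host/port taken from the Logstash output above:

curl -XGET "http://127.0.0.1:9200/filebeattest/_count?pretty"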
What am I missing here?