I have configured the following pipeline on a Windows 10 machine running on VMware:
Filebeat -> Logstash -> Elasticsearch
All of these run on the same machine. The problem is that Logstash does not send an ACK back to Filebeat when it receives an event, even though it does process the event. As a result, Filebeat eventually times out waiting for the ACK and resends the event.
2020-02-10T17:56:44.406-0500 ERROR logstash/async.go:256 Failed to publish events caused by: read tcp [::1]:4435->[::1]:5044: i/o timeout
Logstash processes the resent event again but still does not send an ACK. This cycle repeats and duplicate events pile up in Elasticsearch. No errors appear in the Logstash log file either.
This loop occurs with only a single event, so bandwidth should not be an issue.
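The only timeout-related settings I can think of are the timeout on Filebeat's Logstash output and the client_inactivity_timeout on the Logstash beats input. Something like the sketch below is what I am considering trying next (the values are guesses on my part, not defaults I have verified, and I have not tested this yet):

# filebeat.yml - wait longer for Logstash to ACK before declaring an i/o timeout
output.logstash:
  hosts: ["w10perftest018:5044"]
  timeout: 120

# logstash.conf - keep idle Beats connections open longer on the Logstash side
input {
  beats {
    port => 5044
    client_inactivity_timeout => 300
  }
}

Even so, I would expect the defaults to be more than enough for a single small event, so I am not sure this addresses the root cause.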
Here are my filebeat.yml and logstash.conf files. Any assistance you could provide would be greatly appreciated.
filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - c:\Logs\WebFocus\*.txt
#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 1

#============================== Kibana =====================================
# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:
  # Kibana Host
  host: "w10perftest018:5601"
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["w10perftest018:5044"]
#================================ Processors =====================================
# Configure processors to enhance or manipulate events generated by the beat.
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - drop_fields:
      fields: ["beat.name"]
logstash.conf
input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    rename => { "host" => "server" }
  }
  grok {
    match => { "message" => "%{IP:source.ip} - - \[%{HTTPDATE:timestamp}\] \"%{PROG:http.request.method} %{PATH:url.original} HTTP/%{NUMBER:http.version}\" %{INT:http.response.status_code} %{INT:http.response.body.bytes} %{INT:event.duration}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss ZZ" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "perfdata"
  }
  stdout { codec => rubydebug }
}
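In the meantime, to keep the duplicates out of Elasticsearch while I figure out the ACK problem, I am thinking of giving each event a deterministic document ID derived from a fingerprint of the message, roughly like this (untested; using the message field as the fingerprint source is just my guess at something unique enough):

filter {
  # Hash the raw log line so a resent event overwrites the earlier copy
  # instead of creating a new document
  fingerprint {
    source => "message"
    method => "SHA256"
    target => "[@metadata][fingerprint]"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "perfdata"
    document_id => "%{[@metadata][fingerprint]}"
  }
}

That would only mask the symptom, though, so I would still like to understand why the ACK never arrives.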