Metricbeat and Filebeat 7.17.3 are unable to send data to Logstash

Hi Team,

I have a 3-node ELK cluster (7.17.3). I have installed Metricbeat and Filebeat on a new server, and the Logstash port (5044) is open. I have checked with telnet, and the connection looks fine.

However, the Beats are unable to send data to Logstash. This is the info I got from the logs:

INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://1xxx...:5044))
2023-10-30T09:44:37.607Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2023-10-30T09:44:37.607Z INFO [publisher] pipeline/retry.go:223 done
2023-10-30T09:44:37.610Z INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(async(tcp://xxxxx5044)) established
2023-10-30T09:44:37.614Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2023-10-30T09:44:37.614Z INFO [publisher] pipeline/retry.go:223 done
2023-10-30T09:44:37.614Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 1xxxx3:60612->1xxxx:5044: wsarecv: An existing connection was forcibly closed by the remote host.

Is there any issue with my Logstash? Do I need to check anything on my Logstash server?

Can someone look into this issue?

Are you using an HTTPS connection?

No, we are not using HTTPS. Currently we are trying to connect via HTTP.

Can you show the relevant part of your filebeat.yml and the input section of your Logstash pipeline?
If there is an active firewall on the Logstash server, open TCP port 5044.

# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - D:\xxx\Scripts\xxx\rxxx_log.txt

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  include_lines: ['.*xxxxxx stopped(.*)']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #prospector.scanner.exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
fields:
   env: pre-production
   Application: pass2
   Data: filebeat

# ================================= Dashboards =================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

# =============================== Elastic Cloud ================================

# These settings simplify using Filebeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:

# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["xxxxxx:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  
  # Certificate for SSL client authentication
  
  # Client Certificate Key

Have you checked the firewall?
Also, most likely the file registry already contains a record that these files have been read. You can delete the registry or change filebeat.registry.path:
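If the registry is the suspect, it can be pointed at a fresh location so previously read files are picked up again. A minimal sketch (the directory below is an example, not the poster's actual path):

```yaml
# filebeat.yml -- move the registry so Filebeat forgets which files it has read.
# The path here is illustrative; adjust it for your install layout.
filebeat.registry.path: D:\filebeat-data\registry-new
```

Alternatively, stop Filebeat and delete the existing registry directory before restarting.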

Please use the </> icon to format your text.

For some reason, neither the Metricbeat nor the Filebeat data is getting sent to the Logstash server.

This is the error I get in the Metricbeat log files:

ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 1xxxx:5044: wsarecv: An existing connection was forcibly closed by the remote host.
2023-11-14T13:22:58.655Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 1xxx->1xx:5044: wsarecv: An existing connection was forcibly closed by the remote host.
2023-11-14T13:22:58.655Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2023-11-14T13:22:58.655Z INFO [publisher] pipeline/retry.go:223 done
2023-11-14T13:22:58.655Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2023-11-14T13:22:58.655Z INFO [publisher] pipeline/retry.go:223 done
2023-11-14T13:22:58.655Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: client is not connected
2023-11-14T13:22:58.655Z INFO [publisher] pipeline/retry.go:159 Drop batch
2023-11-14T13:23:00.578Z ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: client is not connected

Telnet to Logstash port 5044 is working fine.

Does anyone have any thoughts about this Filebeat issue?

Please share your logstash.conf, the Logstash pipeline file.
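For comparison, a minimal plain-TCP Beats pipeline on the Logstash side usually looks like this (a sketch matching the no-TLS setup described above; the hosts value and index pattern are illustrative, not taken from the poster's config):

```conf
# logstash.conf -- minimal Beats input on 5044, no SSL
input {
  beats {
    port => 5044
    ssl  => false
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # adjust to your cluster
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
```

If the Logstash side has `ssl => true` while the Beats are sending plain TCP, Logstash will drop the connection, which matches the "forcibly closed by the remote host" error.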

That input is not enabled: your filestream input has `enabled: false` in the filebeat.yml you shared, so Filebeat never reads the configured path.
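The fix is to flip the flag in the input section shown above (path copied from the posted config, still redacted):

```yaml
# filebeat.yml -- the filestream input must be enabled, or it is ignored
filebeat.inputs:
- type: filestream
  enabled: true          # was false in the posted config
  paths:
    - D:\xxx\Scripts\xxx\rxxx_log.txt
```

After changing this, restart the Filebeat service so the new configuration is loaded.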


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.