Logstash not accepting Filebeat data - possible Java issue?

Hey guys. I'm a newbie testing out ELK. I am running Filebeat 5.6 on one server and attempting to forward some JSON files to Logstash (also 5.6). On the Filebeat server, I see the following errors:

    2017/10/03 16:04:19.881406 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36786->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:19.881497 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36786->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:20.887844 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36788->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:20.888037 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36788->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:22.897668 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36790->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:22.897797 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36790->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:23.906004 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36792->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:23.906103 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36792->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:24.909910 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36794->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:24.910004 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36794->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:26.915158 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36796->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:26.915238 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36796->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:27.918650 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36798->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:27.918745 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36798->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:29.922387 sync.go:85: ERR Failed to publish events caused by: read tcp filebeat_server_ip:36800->logstash_server_ip:5044: read: connection reset by peer
    2017/10/03 16:04:29.922579 single.go:91: INFO Error publishing events (retrying): read tcp filebeat_server_ip:36800->logstash_server_ip:5044: read: connection reset by peer

and Logstash is logging the following exceptions:

    [2017-10-03T16:04:19,880][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36786
    [2017-10-03T16:04:20,887][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36788
    [2017-10-03T16:04:22,897][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36790
    [2017-10-03T16:04:23,905][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36792
    [2017-10-03T16:04:24,909][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36794
    [2017-10-03T16:04:26,914][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36796
    [2017-10-03T16:04:27,918][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36798
    [2017-10-03T16:04:29,921][INFO ][org.logstash.beats.BeatsHandler] Exception: java.util.HashMap cannot be cast to java.lang.String, from: /filebeat_server_ip:36800

I am also running Filebeat on two other servers (version 1.3.1), and those JSON files are making it to Logstash and Elasticsearch just fine.

This has been stumping me for days! Could it have something to do with the JSON parsing? I will include other info in my replies, since I seem to have gone over the character limit.

I also ran netstat on the Logstash server, and the problem Filebeat server never shows up as an established connection on port 5044 (only the two working servers do). I can both ping and telnet from the problem server to the Logstash server.

    user@elk:$ sudo netstat -anp | grep 5044
    tcp6       0      0 :::5044                 :::*                    LISTEN      990/java
    tcp6       0      0 logstash_server_ip:5044       working_filebeat_server_1:58250     ESTABLISHED 990/java
    tcp6       0      0 logstash_server_ip:5044       working_filebeat_server_2:59156     ESTABLISHED 990/java
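
For completeness, the connectivity checks from the problem Filebeat server looked roughly like this (reproduced from memory; the telnet target is the beats port from the configs below):

    user@problem_filebeat:$ ping logstash_server_ip         # replies fine
    user@problem_filebeat:$ telnet logstash_server_ip 5044  # connects, so the port itself is reachable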

filebeat.yml

filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /home/ubuntu/azure_json/*.json
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ["^[[:alpha:]]{3}"]

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ["ERR", "^WARN"]

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: [".gz$"]

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
  # Note: After is the equivalent to previous and before is the equivalent to next in Logstash
  #multiline.match: after


#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["logstash_server_ip:5044"]

  # Optional SSL. By default it is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
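
One thing I'm wondering about (haven't tried it yet) is whether I should decode the JSON on the Filebeat side instead, using the prospector-level json options that Filebeat 5.x supports. The json.* settings below are my guess at what to add, not something from my current config:

- input_type: log
  paths:
    - /home/ubuntu/azure_json/*.json
  # Decode each line as JSON in Filebeat itself
  json.keys_under_root: true   # put the decoded fields at the top level of the event
  json.add_error_key: true     # add an error field when a line fails to decode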

02-beats-input.conf

input {
  beats {
    port => 5044
    codec => json
    ssl => false
    client_inactivity_timeout => 90000
  }
}


filter {
  json {
    source => "message"
  }
}
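
The other variant I'm considering (also untested) is dropping codec => json from the beats input and letting the json filter do all the parsing, since I'm not sure having both is right:

input {
  beats {
    port => 5044
    ssl => false
    client_inactivity_timeout => 90000
  }
}

filter {
  # Parse the raw JSON line that Filebeat ships in the "message" field
  json {
    source => "message"
  }
}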
