Remote Filebeat can't send logs to Logstash

Hello.

I'm having trouble connecting a Filebeat on another server to my ELK server.
A Filebeat running locally on the ELK server works fine and Kibana shows all the logs I need, but the remote Filebeat constantly logs errors.

Here is the error from the remote Filebeat log:

 ERROR   pipeline/output.go:100  Failed to connect to backoff(async(tcp://<ELKIP>:5044)): lookup <ELKIP> on [::1]:53: read udp [::1]:51277->[::1]:53: read: connection refused
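For context, the `lookup <ELKIP> on [::1]:53: ... connection refused` part looks like a DNS resolution failure rather than a Logstash problem: the remote host's local resolver on `[::1]:53` is refusing queries, so Filebeat never even gets to the TCP connection. If `<ELKIP>` is actually a hostname, one workaround sketch (hostname and IP below are placeholders, not from this setup) is to pin the name in `/etc/hosts` on the remote server, or to put the raw IP directly in `filebeat.yml`:

```
# /etc/hosts on the remote server -- placeholder values
10.0.0.5    elk.example.local
```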

Here is my Logstash config (the long grok lines are cut off at my terminal width, hence the trailing `$`):

input {
  beats {
    port => 5044
    ssl => false
  }
}
filter {
  if [fileset][module] == "system" {
    if [fileset][name] == "auth" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]$
        pattern_definitions => {
          "GREEDYMULTILINE" => "(.|\n)*"
        }
        remove_field => "message"
      }
      date {
        match => [ "[system][auth][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
      geoip {
        source => "[system][auth][ssh][ip]"
        target => "[system][auth][ssh][geoip]"
      }
    }
    else if [fileset][name] == "syslog" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system$
        pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
        remove_field => "message"
      }
      date {
        match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

And here are the two Filebeat output configs.

The one local to the ELK server:

output.logstash:
  hosts: ["localhost:5044"]

The remote one:

output.logstash:
  hosts: ["<ELKIP>:5044"]

Everything else is unmodified, and the local Filebeat works.

I can also ping the ELK server from the remote server and telnet to port 5044.

@Wellguys Are there any errors in the Logstash logs?

No errors from Logstash. From its perspective, there is just one Filebeat sending it data.

I started all over again and found the problem: I had to disable SSL on the Logstash beats input, because the remote Filebeat could not connect without the correct certificate.
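For anyone hitting the same thing: instead of disabling SSL, the alternative is to give both sides matching certificates. A rough sketch (paths below are placeholders, assuming the standard beats-input SSL settings):

```
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/logstash/certs/logstash.crt"  # placeholder path
    ssl_key => "/etc/logstash/certs/logstash.key"          # placeholder path
  }
}
```

And on the remote Filebeat, point `ssl.certificate_authorities` at the CA (or self-signed cert) that signed the Logstash certificate:

```
output.logstash:
  hosts: ["<ELKIP>:5044"]
  # placeholder path; must be the CA/cert trusted for the Logstash endpoint
  ssl.certificate_authorities: ["/etc/filebeat/certs/logstash.crt"]
```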

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.