Cannot configure SSL connection between Filebeat and Logstash


(Sulaiman Khan) #1

I'm having trouble configuring the SSL connection between the Filebeat client (Windows 2012 R2) and the Logstash server (RHEL 7). Filebeat and Logstash communicate just fine when I set ssl to 'false' in the beats config file. However, when I set the ssl value to 'true', the connection fails. I generated the SSL certificate using the steps here: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7
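For reference, the certificate step in that tutorial boils down to a single openssl call. Here is a hedged, self-contained sketch; the CN and output file names are placeholders (the tutorial writes the key and cert under /etc/pki/tls/ and keys the cert to the Logstash server's FQDN or IP):

```shell
# Generate a self-signed cert/key pair, roughly as the tutorial does.
# CN=logstash.example.local is a stand-in for the real server name.
openssl req -x509 -days 3650 -batch -nodes -newkey rsa:2048 \
  -subj "/CN=logstash.example.local" \
  -keyout logstash-forwarder.key \
  -out logstash-forwarder.crt

# The subject must match the name/IP Filebeat connects to:
openssl x509 -in logstash-forwarder.crt -noout -subject
```

If the cert was generated for an IP address instead of a hostname, that IP has to appear as a subjectAltName, which the tutorial handles via /etc/pki/tls/openssl.cnf.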

There were no instructions on how to configure SSL between Windows and Linux, so what I did was copy the cert onto my Windows server, specifically into the Filebeat folder.

I'm receiving the following errors:

Microsoft Windows [Version 6.3.9600]
(c) 2013 Microsoft Corporation. All rights reserved.

C:\Windows\system32>cd c:/

c:>cd "Program Files"

c:\Program Files>cd Filebeat

c:\Program Files\Filebeat>filebeat.exe -c filebeat.yml -e -v
2016/06/16 17:46:00.059581 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/06/16 17:46:00.060582 logstash.go:106: INFO Max Retries set to: 3
2016/06/16 17:46:00.061584 outputs.go:126: INFO Activated logstash as output plugin.
2016/06/16 17:46:00.061584 publish.go:288: INFO Publisher name: VAVT-PMO-SBX27
2016/06/16 17:46:00.065580 async.go:78: INFO Flush Interval set to: 1s
2016/06/16 17:46:00.066578 async.go:84: INFO Max Bulk Size set to: 1024
2016/06/16 17:46:00.066578 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/06/16 17:46:00.066578 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/06/16 17:46:00.067578 registrar.go:68: INFO Registry file set to: C:\ProgramData\filebeat\registry
2016/06/16 17:46:00.067578 registrar.go:80: INFO Loading registrar data from C:\ProgramData\filebeat\registry
2016/06/16 17:46:00.067578 prospector.go:133: INFO Set ignore_older duration to 0
2016/06/16 17:46:00.068582 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/06/16 17:46:00.068582 prospector.go:133: INFO Set scan_frequency duration to 10s
2016/06/16 17:46:00.069580 prospector.go:90: INFO Invalid input type set:
2016/06/16 17:46:00.069580 prospector.go:93: INFO Input type set to: log
2016/06/16 17:46:00.070580 prospector.go:133: INFO Set backoff duration to 1s
2016/06/16 17:46:00.070580 prospector.go:133: INFO Set max_backoff duration to 10s
2016/06/16 17:46:00.071579 prospector.go:113: INFO force_close_file is disabled
2016/06/16 17:46:00.071579 prospector.go:143: INFO Starting prospector of type: log
2016/06/16 17:46:00.069580 spooler.go:77: INFO Starting spooler: spool_size: 10000; idle_timeout: 10s
2016/06/16 17:46:00.073581 log.go:113: INFO Harvester started for file: C:\Program Files\IISlogs\u_ex150212 - Copy - Copy.log
2016/06/16 17:46:08.090102 single.go:152: INFO send fail
2016/06/16 17:46:08.091091 single.go:159: INFO backoff retry: 4s
2016/06/16 17:46:12.094347 single.go:76: INFO Error publishing events (retrying): EOF
2016/06/16 17:46:12.095343 single.go:152: INFO send fail
2016/06/16 17:46:12.095343 single.go:159: INFO backoff retry: 8s
2016/06/16 17:46:20.101859 single.go:76: INFO Error publishing events (retrying): EOF

2016/06/16 17:46:22.617019 registrar.go:162: INFO Registry file updated. 8 states written.

02-beat config file:

input {
  beats {
    port => 5044
    type => "logs"
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

10-syslog-filter config:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

30-elasticsearch config file:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

filebeat.yml config file:

filebeat:
  prospectors:
    -
      paths:
        - "/Program Files/IISlogs/*"
      input_type: log
      document_type: syslog

  registry_file: "C:/ProgramData/filebeat/registry"

output:
  logstash:
    enabled: true
    hosts:
      - "xx.xxx.xx.xxx:5044"
    bulk_max_size: 1024
    tls:
      certificate_authorities: ["/Program Files/Filebeat/logstash-forwarder.crt"]

shipper:

logging:
  level: info
  to_files: true
  to_syslog: false

  files:
    path: C:/Filebeat
    name: filebeat.log
    keepfiles: 5

(Andrew Kroh) #2

Your configuration seems to be OK from what I can tell.

Have you tried validating the Logstash server certificate independently of Filebeat, as described here? https://www.elastic.co/guide/en/beats/filebeat/current/configuring-tls-logstash.html#testing-tls-logstash
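If a machine with openssl is handy, a quick offline sanity check (my suggestion here, not from the linked page) is that a self-signed certificate should verify cleanly against itself when used as the CA file:

```shell
# Throwaway self-signed cert so this snippet is self-contained;
# in practice you would run the verify step on logstash-forwarder.crt.
openssl req -x509 -days 1 -batch -nodes -newkey rsa:2048 \
  -subj "/CN=test" -keyout test.key -out test.crt

# A self-signed cert must verify against itself as the CA; if the real
# cert file fails this, it was damaged (e.g. corrupted in transfer).
openssl verify -CAfile test.crt test.crt
```

Against the live server, the linked check amounts to roughly `openssl s_client -connect <logstash-host>:5044 -CAfile logstash-forwarder.crt` and looking for a verify return code of 0 in the output.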

Also, from what I recall, some of the logging around connection errors has been improved in Filebeat 5.x, so you might want to try running the 5.x Filebeat binary while you work out what the problem is.

You might want to try adding the insecure: true option to the tls config options in Filebeat. If that works, then there is an issue with your certificates.
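For Filebeat 1.x that would look something like the sketch below; note that insecure: true disables certificate verification entirely, so it is a diagnostic step only (and in Filebeat 5.x the tls section was renamed to ssl):

```yaml
# Diagnostic only -- disables cert verification (Filebeat 1.x sketch):
output:
  logstash:
    hosts:
      - "xx.xxx.xx.xxx:5044"
    tls:
      certificate_authorities: ["/Program Files/Filebeat/logstash-forwarder.crt"]
      insecure: true
```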


(Sulaiman Khan) #3

I got my SSL cert to work. I'm now having a problem with Logstash not recognizing my grok config file. The logs are being sent from Filebeat to Logstash just fine; however, in Kibana, the logs aren't being filtered according to the grok patterns.
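One generic way to isolate a grok problem like this (a debugging sketch, not something prescribed in this thread) is to feed one sample line through the same filter using a stdin input and a rubydebug stdout output, then check whether the syslog_* fields appear:

```conf
# test-pipeline.conf -- run with: bin/logstash -f test-pipeline.conf
# The stdin input tags events with type "syslog" so the type
# conditional from 10-syslog-filter matches.
input { stdin { type => "syslog" } }
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}
output { stdout { codec => rubydebug } }
```

If the fields show up here but not in Kibana, one suspect worth checking is whether the type on the events Filebeat ships (document_type in filebeat.yml) actually matches the "syslog" conditional.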


(system) #4

This topic was automatically closed after 21 days. New replies are no longer allowed.