Syslogging a Unifi Stack with Elastic Stack

So I wanted to start by stating that I am very new to Elastic Stack and have only been in IT for one year, so my understanding of how it works is very basic. I have completed the basic setup and operation of Elastic Stack on a Windows Server 2016 machine.

For reference, here is the guide I used; I followed it all the way through to Step 23.

My goal is to have Elastic Stack listening for logs from our UniFi Security Gateway XG-8. There are settings in UniFi to set the IP and port of a syslog server; the IP is pretty straightforward, but I'm not really sure what port I should send the logs to so that Logstash catches them.

Do I need to set this up with one of the Beats?

Any help with this is much appreciated since my research time has been dramatically reduced because of recent events.

Hi,

The easiest way to do it would be to use Filebeat as a secondary log shipper.

Using the following example -

[UniFi SG] ---> [Syslog Server > unifi.log > Filebeat] ----> [Logstash Server]

Some products may not be able to send logs directly to Logstash, so the best solution here is to first configure a basic syslog server to receive your logs, and then forward them to Logstash using Filebeat.
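For example, a minimal filebeat.yml on the syslog server side might look something like the sketch below (the log path and the Logstash address are placeholders you would swap for your own):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    # wherever your syslog server writes the UniFi messages (placeholder path)
    - /var/log/unifi.log

output.logstash:
  # the Logstash server from the diagram above (placeholder address)
  hosts: ["logstash.example.local:5044"]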

I can help with documentation if you have any questions.

This looks like a good place to start. I will look into getting Filebeat all set up.

Part of the issue I'm running into is that once I've pointed it at the right location, how will I know that the log data is actually being collected? It seems that the indices are looking for terms inside the log data.

If I am unfamiliar with UniFi's log terminology, will Elasticsearch be able to show me common terms that it's receiving?

Filebeat has a syslog input, so you can send the logs directly to that: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-syslog.html
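A bare-bones version of that input, assuming you want Filebeat itself to listen for the UDP syslog traffic from the gateway (the port here is only an example), would be along these lines:

filebeat.inputs:
- type: syslog
  protocol.udp:
    # address and port Filebeat listens on for incoming syslog messages
    host: "0.0.0.0:9000"

You would then point the UniFi syslog settings at the machine running Filebeat on that port.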

If you want to query Elasticsearch from Kibana for specific terms, you'll have to parse the log message so that Elasticsearch can build a document with fields and data.

Take a look at this documentation: https://www.elastic.co/guide/en/logstash/current/getting-started-with-logstash.html
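As a rough sketch (assuming the USG sends standard syslog-formatted lines), a grok filter along these lines in your Logstash pipeline would break the message into fields you can then search and aggregate on in Kibana:

filter {
  grok {
    # split a standard syslog line into timestamp, host, program and message fields
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
  }
  date {
    # use the timestamp from the log line instead of the time Logstash received it
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}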

That seems to be the way to do it, and I set up the input as follows in the filebeat.yml file.

    - type: syslog
      protocol.udp:
        host: "localhost:9000"

I also commented out the Elasticsearch output and uncommented the Logstash output with the host and mapped port. I have the UniFi Controller pointing to that host on port 9000, but I'm not seeing a Filebeat index pattern in Kibana.
I know it must be a configuration issue, because when I pointed the controller at the same host on a different port for Syslog Watcher it was pulling logs just fine.

Any guidance? Does the syslog input require more variables like max_message_size?

Hi, can you provide the full configuration for Filebeat and Logstash?

Here is Logstash's logstash.conf file:

input {
  beats {
    port => 5044
    type => "syslog"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+yyyy.ww}"
    document_type => "%{[@metadata][type]}"
  }
}

Here is the filebeat.yml file:

filebeat.inputs:
- type: syslog
  protocol.udp:
    host: "localhost:9000"

    paths:
    - C:\ProgramData\filebeat\logs\*.log

filebeat.config.modules:
  path: C:\ProgramData\Elastic\Beats\filebeat-7.6.1-windows-x86_64\modules.d\*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:
  host: "localhost:5601"

output.logstash:
  hosts: ["localhost:5044"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

Anything you don't see is commented out or not configured.

Let me know if you need any further details.

Here is the new filebeat.yml that I set up. It's not picking up the logs from the UniFi Controller.

filebeat.inputs:
  - type: syslog
    enabled: true

protocol.udp:
  host: "localhost:514"

filebeat.config.modules:
  path: C:\ProgramData\Elastic\Beats\filebeat\modules.d\*.yml

reload.enabled: true
reload.period: 10s

setup.template.settings:
  index.number_of_shards: 1

output.logstash:
  hosts: "localhost:5044"

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.