Filebeat with sophos module

I am very new to ELK.
I have managed to install Elasticsearch, Kibana, and Filebeat on an Ubuntu server, enable the Sophos module, and receive syslog messages from the appliance via an rsyslog server. However, the data is not making it into Elasticsearch.
Can someone help me with this? Below are my Sophos module config and filebeat.yml.
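Roughly, the commands I ran to set things up were (a sketch; exact package and repo setup omitted):

```shell
# Enable the Sophos module so its config under modules.d/ is picked up
sudo filebeat modules enable sophos

# Load the index template, ingest pipelines and dashboards
# (assumes Elasticsearch and Kibana are reachable as configured in filebeat.yml)
sudo filebeat setup

sudo systemctl restart filebeat
```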


```yaml
- module: sophos
  xg:
    enabled: true

    # Set which input to use between tcp, udp (default) or file.
    var.input: file
    var.paths: ["/var/log/hostname/*.log"]

    # The interface to listen to syslog traffic. Defaults to
    # localhost. Set to 0.0.0.0 to bind to all available interfaces.
    var.syslog_host: localhost

    # The port to listen for syslog traffic. Defaults to 9004.
    var.syslog_port: 514

    # firewall default hostname
    var.default_host_name: firewall.localgroup.local

    # known firewalls
    var.known_devices:
      - serial_number: "SL no of FW appliance"
        hostname: "host name of FW appliance"
```
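In case it helps, this is how I sanity-check the module config (paths assume the default deb package install):

```shell
# Confirm the sophos module shows up as enabled and the YAML parses
sudo filebeat modules list
sudo filebeat test config -e

# Load the Sophos ingest pipelines into Elasticsearch
sudo filebeat setup --pipelines --modules sophos
```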



```yaml
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\Elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines
  # that are matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines
  # that are matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the
  # files that are matching any regular expression from the list. By default,
  # no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options
```

```yaml
# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s
```

```yaml
# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false
```

```yaml
# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:
```

```yaml
# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either http (default) or https.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"
```

```yaml
# ------------------------------ Logstash Output -------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
```
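To check whether events are actually reaching Elasticsearch, I have been running the following (host and port are my local defaults):

```shell
# Verify Filebeat can reach the configured Elasticsearch output
sudo filebeat test output

# List Filebeat indices and their document counts
curl -s 'http://localhost:9200/_cat/indices/filebeat-*?v'

# Peek at the most recent event, if any
curl -s 'http://localhost:9200/filebeat-*/_search?size=1&sort=@timestamp:desc&pretty'
```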
