Hi all,
I am new to ELK and working on building up a test environment at home to play around with and get used to.
My current setup on an Ubuntu 16.04 LTS VM:
- Elasticsearch (configured and running)
- Logstash (not running)
- Fluentd (configured and running)
- Kibana (configured and running)
- Sophos XG Home firewall with syslog output to UDP 5140 on the IP of my ELK VM
My goal for this project is to get the syslog data from my firewall, as well as OSQuery data from my laptop, into a searchable format in Kibana. For now, I'm working on the syslog data from my Sophos XG Home firewall.
I have finally figured out how to get the syslog data from my firewall to show up in Elasticsearch/Kibana, but none of the fields (such as src_ip, dst_ip, url, etc.) are searchable. As far as I understand it, this is because I need to create an index template for the syslog data, or define a mapping for it somehow.
In Fluentd, the only way I have managed to get the data to show up in Elasticsearch so far is with a wildcard match that forwards all logs (not just specific ones, like those I tag as "syslog"):
# get logs from syslog
<source>
  @type syslog
  port 5140
  tag syslog
</source>

<match *.**>
  @type elasticsearch
  logstash_format true
  host localhost
  port 9200
  index_name fluentd
  type_name fluentd
  <buffer>
    flush_interval 10s  # for testing
  </buffer>
</match>
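As a side note, from what I've read in the Fluentd docs, the syslog input appends the facility and severity to the tag (e.g. syslog.local0.info), which would explain why matching on plain "syslog" never worked for me. If that's right, a narrower match would look something like this (untested on my setup):

# match only events from the syslog source; in_syslog tags events as
# <tag>.<facility>.<severity>, e.g. syslog.local0.info
<match syslog.**>
  @type elasticsearch
  logstash_format true
  host localhost
  port 9200
  <buffer>
    flush_interval 10s  # for testing
  </buffer>
</match>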
That said, my primary question here is: "What is the proper way to index/map this data from syslog so that it is searchable with field names and such?"
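To make the question concrete: is an index template along these lines the right direction? This is only a rough sketch; the field names are guessed from my Sophos logs, the "fluentd" mapping type matches my type_name setting, and I'm assuming the default logstash-* index names since logstash_format is true (I believe older Elasticsearch versions use "template" instead of "index_patterns"):

PUT _template/sophos-syslog
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "fluentd": {
      "properties": {
        "src_ip": { "type": "ip" },
        "dst_ip": { "type": "ip" },
        "url":    { "type": "keyword" }
      }
    }
  }
}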
Attached is a screenshot of what it looks like in Kibana.
Thanks in advance! I've been searching for info on this and don't fully understand everything I'm reading, so here's hoping someone can provide clear, easy-to-follow guidance.
David