Filter Logs from Firewall

Hello everybody!

I'm new to ELK, so I have some doubts about a few things I'm trying to do.
I'm sending logs from my firewall to an Rsyslog server. I was shipping them with Filebeat, but when I visualize the data in Kibana I only see one field, called "system.syslog.message", with all the information in it, because the message is unstructured.
The thing is, I need to extract fields from that message to build better visualizations in Kibana. The log lines are all different, so I need to pull out the fields I'm interested in for each case.
For example, a log line could look like this:
Mar 14 18:01:38 date=2019-03-14 time=18:01:38 devname=ELK devid=1234 logid=001 type=event subtype=vpn level=information vd=root logdesc="SSL VPN tunnel down" action="tunnel-down" tunneltype="ssl-web" tunnelid=1234 remip= user="david-beckam" group="Football" dst_host="N/A" reason="N/A" duration=218 sentbyte=0 rcvdbyte=0 msg="SSL tunnel shutdown"

So, from that message I need to extract some fields to prepare a monitoring dashboard:
fields like action, logdesc, remip, user, group, duration, time and date. With those fields I could build a dashboard and check whether a user was connected, for how long, and so on.

I think the way to do it is to use Logstash as a filter, but I don't know exactly what to do or how.
Should I use grok or dissect? Could someone give me an example of how to do it right?
Also, can I send different kinds of logs to Elasticsearch? I mean, from the same log file I need to extract different fields depending on the structure of each line, because every line can be different.

Thank you very much to all of you!

Best Regards :slight_smile:

I'm trying the following configuration for the example I gave above, but it doesn't work...

input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

filter {
  dissect {
    mapping => {
      "message" => "%{month} %{day} %{time} %{iprouter} date=%{fecha} time=%{hora} devname=%{devname} devid=%{devid} logid=%{logid} type=%{type} subtype=%{subtype} level=%{level} vd=%{vd} logdesc=%{logdesc} action=%{action} tunneltype=%{tunneltype} tunnelid=%{tunnelid} remip=%{remip} user=%{user} group=%{group} dst_host=%{dst_host} reason=%{reason} duration=%{duration} sentbyte=%{sentbyte} rcvdbyte=%{rcvdbyte} msg=%{mensaje}"
    }
  }
}

output {
  elasticsearch { hosts => ["ip:9200"] }
}

You can use dissect to parse the leading syslog timestamp and IP, and a kv filter to parse the rest:

dissect { mapping => { "message" => "%{[@metadata][ts]} %{+[@metadata][ts]} %{+[@metadata][ts]} %{ip} %{[@metadata][restOfLine]}" } }
kv { source => "[@metadata][restOfLine]" }
date { match => [ "[@metadata][ts]", "MMM dd HH:mm:ss" ] }
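
Putting that together, a minimal full-pipeline sketch based on the original post's input and output sections (the file path, the "ip:9200" host placeholder, and the %{ip} field name are taken from the configs above; the second date pattern is an assumption to cover syslog's space-padded single-digit days, e.g. "Mar  4"):

```text
input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

filter {
  # Split off the syslog timestamp (three tokens) and the router IP;
  # everything after that is key=value pairs.
  dissect {
    mapping => { "message" => "%{[@metadata][ts]} %{+[@metadata][ts]} %{+[@metadata][ts]} %{ip} %{[@metadata][restOfLine]}" }
  }
  # kv splits on whitespace and '=', and handles double-quoted values,
  # so one filter covers log lines with different sets of fields.
  kv { source => "[@metadata][restOfLine]" }
  # Parse the timestamp; two patterns because syslog pads
  # single-digit days with an extra space.
  date { match => [ "[@metadata][ts]", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ] }
}

output {
  elasticsearch { hosts => ["ip:9200"] }
}
```

With the sample line above, kv should produce fields such as action, user, group and duration directly, ready to use in Kibana visualizations, regardless of which fields a given line contains.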

Sorry for the delay.

I'm going to try it :slight_smile: Thank you!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.