Hello everybody!
I'm new to ELK, so I have some doubts about a few things I'm trying to do.
I'm sending logs from my firewall to an rsyslog server and shipping them with Filebeat, but when I visualize the data in Kibana I only see a single field called "system.syslog.message" with all the information in it, because the message is unstructured.
The thing is, I need to extract fields from that message to build better visualizations in Kibana. Each log line is different, so I need to pull out the fields I'm interested in for each case.
For example, a log line could look like this:
Mar 14 18:01:38 192.168.1.6 date=2019-03-14 time=18:01:38 devname=ELK devid=1234 logid=001 type=event subtype=vpn level=information vd=root logdesc="SSL VPN tunnel down" action="tunnel-down" tunneltype="ssl-web" tunnelid=1234 remip=10.10.10.11 user="david-beckam" group="Football" dst_host="N/A" reason="N/A" duration=218 sentbyte=0 rcvdbyte=0 msg="SSL tunnel shutdown"
So, from that message I need to extract some fields to build a monitoring dashboard: fields like action, logdesc, remip, user, group, duration, time and date. With those fields I could check whether a user was connected, for how long, and so on.
I think the way to do it is with a Logstash filter, but I don't know exactly what to do or how.
Should I use grok or dissect for this? Could someone give me an example of how to do it right? Below is a rough sketch of what I've pieced together from the docs so far, in case it helps show what I mean.
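Since everything after the syslog header is key=value pairs, my (untested) idea was to use grok to strip the header and let the kv filter handle the rest. I'm assuming the raw line arrives in the message field when Filebeat ships straight to Logstash; if it lands in system.syslog.message in your setup, the source would need to change. The fw_message and fw_timestamp names are just ones I made up:

filter {
  # Strip the syslog header (timestamp + source host) added by rsyslog.
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{IPORHOST:syslog_host} %{GREEDYDATA:fw_message}" }
  }
  # The rest is key=value pairs; kv splits them into individual fields
  # (action, logdesc, remip, user, group, duration, ...).
  kv {
    source      => "fw_message"
    field_split => " "
    value_split => "="
    trim_value  => "\""   # just in case the surrounding quotes are kept
  }
  # Build @timestamp from the device's own date and time fields.
  mutate {
    add_field => { "fw_timestamp" => "%{date} %{time}" }
  }
  date {
    match        => [ "fw_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    remove_field => [ "fw_timestamp" ]
  }
}

Does that look like the right direction, or would dissect be a better fit here?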
Also, can I send different kinds of logs to Elasticsearch? I mean, from the same log file I need to extract different fields depending on the structure of each line, because every line in the file can be different. My guess at how that could work is sketched below.
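As far as I understand, kv only creates fields for the keys that actually appear in a given line, so different line structures would simply produce different field sets, and I could then branch on the fields the firewall sets itself. Something like this, again untested and with the tag name invented by me:

filter {
  # Runs after the kv filter above, once the key=value fields exist.
  if [type] == "event" and [subtype] == "vpn" {
    mutate {
      convert => { "duration" => "integer" }   # so Kibana can aggregate on it
      add_tag => [ "vpn_event" ]
    }
  }
  # ...other "if [type] == ..." branches for the other kinds of lines.
}

Is that the usual way to handle mixed log lines, or is there a better pattern?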
Thank you very much to all of you!
Best Regards