I am new to ELK and this is my first project with it.
I currently have the following setup:
1 server (CentOS 7) that has Zeek installed and logs all traffic.
It logs the current day in /opt/zeek/spool/logger,
and then the file gets put in /opt/zeek/logs/&lt;date&gt;/&lt;filename&gt;.log.gz
(for example /opt/zeek/logs/2019-10-21/http.03:00:00-04:00:00.log.gz).
This means that for the current day the logs are written as plain .log files, and I can ship these easily with Beats to Elasticsearch.
And it works, yay! Even though the .log files get read as if they are one big string... it's a beginning!
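From reading the Filebeat docs, there seems to be a Zeek module that parses these logs into proper fields instead of one big string. This is roughly what I was thinking of trying in modules.d/zeek.yml (an untested sketch: the filesets and var.paths syntax are taken from the docs, the paths are my spool directory, and I believe the module expects Zeek to write JSON logs rather than the default TSV, which would be another change on my side):

```yaml
# modules.d/zeek.yml — enable with: filebeat modules enable zeek
- module: zeek
  connection:
    enabled: true
    var.paths: ["/opt/zeek/spool/logger/conn.log"]
  http:
    enabled: true
    var.paths: ["/opt/zeek/spool/logger/http.log"]
```

Is this the right direction for the live .log files, or is that better done in Logstash too?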
On 1 other server (also CentOS 7) I have installed a complete ELK stack from the .rpm packages,
including X-Pack.
I'd like to get the .log.gz files working as well,
because I'd like to search through my log files as if they were one big database which constantly gets bigger.
So my question would be: how should I set this up?
Do I need to let Beats send the .log.gz files to Logstash, have Logstash handle the .gz part, and then have Logstash parse the .log content to make it understandable for Elasticsearch (and Kibana)?
Because if I import raw .log files like I'm currently doing, they just get read as if they are
one big string.
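To make the question more concrete, this is the kind of Logstash pipeline I had in mind for the archived files. It is just an untested sketch: the file input's "read" mode is documented to decompress gzipped files, the file_completed_action settings are there because the read-mode default would delete the archives after reading, and the #-header drop plus tab-separated columns are my guesses for Zeek's default TSV format (the column list is only a partial example for http.log):

```
input {
  file {
    path => "/opt/zeek/logs/*/*.log.gz"
    mode => "read"    # read mode handles gzipped files
    # keep the archives instead of the default "delete" after reading
    file_completed_action => "log"
    file_completed_log_path => "/var/log/logstash/zeek-completed.log"
    sincedb_path => "/var/lib/logstash/sincedb-zeek"
  }
}

filter {
  # Zeek's TSV logs start with #-prefixed header/metadata lines
  if [message] =~ /^#/ {
    drop { }
  }
  csv {
    separator => "	"    # literal tab character
    # partial column list for http.log only; every Zeek log type has its own fields
    columns => ["ts", "uid", "id.orig_h", "id.orig_p", "id.resp_h", "id.resp_p"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "zeek-%{+YYYY.MM.dd}"
  }
}
```

Would something like this work, or is there a better-supported way to handle the .gz archives?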
Big question, big story; I hope someone can clarify this for me!
Thanks in advance.