Hi all,
I'm currently researching Logstash (along with Filebeat) as a possible solution for a problem we're seeing.
We currently have a log server acting as a central manager: a number of different Linux and Windows servers send their logs to it, where Wazuh aggregates them before Rsyslog forwards them on to a remote QRadar server.
The problem we're seeing is that Wazuh is prepending its own server details to the front of each log line, so QRadar sees everything as coming from the log server rather than the original host. I've been told that Logstash may help me resolve this.
I've managed to configure Filebeat and Logstash to work together to parse an extract of one of these logs; at the moment the result is just being written out to another log file (I still need to work out how to format it).
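For context, my Logstash pipeline is roughly along these lines at the moment (the port and path are placeholders rather than my real values):

```
# pipeline.conf - minimal sketch of the current setup; port and path are placeholders
input {
  beats {
    port => 5044
  }
}

output {
  file {
    path => "/var/log/logstash/parsed-output.log"
  }
}
```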
So my questions are: as I'm pulling from a log file, do I need to set the Filebeat input type to filestream?
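From what I've read, filestream has replaced the older log input type, so I've been experimenting with something like this in filebeat.yml (the path is just a placeholder, and I'm not sure it's the right approach):

```yaml
# filebeat.yml - filestream input sketch; the path below is a placeholder
filebeat.inputs:
  - type: filestream
    id: wazuh-aggregated-logs      # filestream inputs need a unique id
    paths:
      - /var/ossec/logs/archives/archives.log

output.logstash:
  hosts: ["localhost:5044"]
```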
Do I need to use grok in my pipeline.conf filter to strip out the Wazuh prefix, and if so, can anyone point me to some good tutorials for understanding it?
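For reference, this is the sort of grok filter I've been sketching out, assuming the Wazuh prefix is just a hostname in front of the original syslog message (the pattern and field names are guesses on my part, not something I've confirmed against the real logs):

```
# pipeline.conf filter section - sketch only; assumes "<wazuh-host> <original message>" layout
filter {
  grok {
    match => {
      "message" => "%{SYSLOGHOST:wazuh_host} %{GREEDYDATA:original_message}"
    }
  }
  mutate {
    # drop the Wazuh prefix so downstream systems see the original event
    replace      => { "message" => "%{original_message}" }
    remove_field => ["original_message"]
  }
}
```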
Thank you everyone in advance,
Ombit