Shipping Linux system logs with Filebeat and Logstash

Hello,

I am setting up a log monitoring architecture; for this I am using OpenSearch and Kibana to collect all the data. The data is sent by Filebeat, which retrieves the logs from Wazuh (with several agents).

I then added Logstash in order to have an additional filtering layer.
However, when I view the data in OpenSearch, it is very poorly parsed.
For example, when an alert appears in the logs, this is what I get in OpenSearch:

The alert was created by an agent running on a workstation, following an SSH connection attempt.

Here is the content of the Logstash configuration file. I tried adding filters, but it didn't change anything:

input {
    beats {
        port => 5044
        tags => ["filebeat"]
    }
}


filter {
        grok {
                match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
        }
        date {
                match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
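        # I think I copied this convert block from the Apache access-log example;
        # the syslog grok above never creates "response" or "bytes", so it probably does nothing here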
        mutate {
                convert => {
                        "response" => "integer"
                        "bytes" => "integer"
                }
        }
}

output {
        if "filebeat" in [tags] {
                opensearch {
                        hosts => ["https://localhost:9200"]
                        index => "wazuh-%{+YYYY.MM.dd}"
                        user => "admin"
                        password => "admin"
                        ssl => true
                        ssl_certificate_verification => false
                }
        }
        stdout { codec => rubydebug }
}
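
From what I understand, the grok pattern can be tested in isolation with a minimal pipeline like the one below (just a sketch, I have not fully verified it): paste a sample syslog line on stdin, and if the event comes out with a _grokparsefailure tag, the pattern never matched.

input {
    stdin { }
}

filter {
    grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
}

output {
    # print the full parsed event so missing fields are easy to spot
    stdout { codec => rubydebug }
}

For example, a made-up line such as "Oct  5 14:12:26 pc-01 sshd[1234]: Failed password for root from 10.0.0.5 port 22 ssh2" should come out with syslog_program set to "sshd"; if it does, the problem is elsewhere in the pipeline.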


Does anyone have any idea how to fix this?
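
One more thought: as far as I know, Wazuh writes its alerts in JSON (alerts.json), so if that is what Filebeat is shipping, a syslog grok pattern would never match them. Would a json filter along these lines be the better approach? This is only a sketch, and I am assuming the alert's timestamp field is called "timestamp":

filter {
    # parse the JSON alert that arrives in the "message" field
    json {
        source => "message"
        tag_on_failure => ["_jsonparsefailure"]
    }
    # Wazuh alert timestamps are ISO8601-style
    date {
        match => [ "timestamp", "ISO8601" ]
    }
}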
