Logstash forwarding MSFT Server logs into Azure

Hoping someone here has tried to do this same thing.

We are using Logstash as a forwarder of Linux & Windows logs into Microsoft Azure Sentinel.

We segregate (tag) the logs as "windows" and "linux". Linux is fine: a simple rsyslog configuration sends to Logstash, which then sends to Azure.
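For reference, the rsyslog side is just a forwarding rule along these lines (the host and port are placeholders):

    # /etc/rsyslog.d/50-logstash.conf -- forward everything; @@ = TCP, @ = UDP
    *.* @@logstash.example.com:5514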

For the Windows servers, we use nxlog to send the System and Security event logs to Logstash, which sends them on to Azure as JSON. It all works.
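Roughly, the nxlog side looks like this (host and port are placeholders; im_msvistalog reads the Windows event log and to_json() serializes each record before it goes to Logstash):

    <Extension json>
        Module  xm_json
    </Extension>

    <Input eventlog>
        Module  im_msvistalog
    </Input>

    <Output logstash>
        Module  om_tcp
        Host    logstash.example.com
        Port    5514
        Exec    to_json();
    </Output>

    <Route r>
        Path    eventlog => logstash
    </Route>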

However, Azure Sentinel has a limit of 500 columns per "log type", and the Windows servers routinely blow past that, at which point Azure starts dropping their logs on the floor.

Has anyone done any kind of filtering on the Logstash side to get rid of all the "junk" fields from the Windows servers that nobody needs?

Thanks for any ideas, pointers, help.

Jim

If you are dealing with top-level fields then you may be able to do it with a prune filter, using either blacklist_names or whitelist_names.
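For example, a whitelist-based prune might look like this (the field names are just placeholders, and note that the entries are regular expressions, hence the anchors):

    prune {
        # keep only the named fields and drop everything else
        whitelist_names => [ "^@timestamp$", "^message$", "^host$" ]
    }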

Another option would be to create a file that lists all the fields you want to keep, then load it in a ruby filter (shown right after the list) and remove everything else:

    @timestamp
    @version
    host
    message
    path
    field1
    field3
    ruby {
        # Runs once at pipeline startup: read the keep-list into a hash
        # for fast lookups.
        init => '
            @fields = {}
            File.foreach( "/home/user/fields.txt" ) do |line|
                @fields[line.chomp] = true
            end
        '
        # Runs per event: remove any top-level field not in the keep-list.
        code => '
            event.to_hash.each { |k, v|
                unless @fields[k]
                    event.remove(k)
                end
            }
        '
    }
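Reading the file in init means it is loaded once, at pipeline startup, rather than re-read for every event.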

If nested fields are involved then it gets much more complicated.
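As a rough illustration of why: event.to_hash.each only iterates the top-level keys, so selectively keeping nested fields would mean walking the hash recursively and removing by full field-reference path, e.g. (the [event_data][param1] path is hypothetical):

    ruby {
        code => '
            # remove one nested field by its full path (hypothetical field)
            event.remove("[event_data][param1]") if event.include?("[event_data][param1]")
        '
    }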
