Guidance on filtering in the logstash.conf file

Hi, I've spent a couple of days looking at filtering and probably just can't grasp the basic nuances of what I want to do to make it work.

Basic set-up: I have Auditbeat running on a Linux server, sending over to Logstash on another Linux server that has ELK set up on it. We want to do the filtering on the ELK server rather than at the Auditbeat end.

Auditbeat is set up using the audit module with a number of auditd rules. These are happily being received by ELK and displayed in Kibana. The problem is that 90+% of what is being received is noise, and I want to use Logstash to filter out the noise before it gets to ES.

I've tried a number of filters that just seemed to be ignored and still sent everything through. Good examples of filters on the internet are few and far between. What I need is an example of a working one that I can then build on and expand; it is just getting that first one that is eluding me.

The message contains audit.kernel.actor.attrs.uid, and I want to filter out any events that have nagios as the ID.

Any help gratefully received.

(I've not put in any examples of what I've tried, as I have no idea if any are in any way close to being right!)

Hi @nlh

You can write your own custom grok pattern to extract the pattern you are interested in, like the Nagios
ID. This will ensure the noise is removed before it gets passed to ES.

filter {
  grok {
    match => { "message" => "%{NID:YourPattern}" }
  }
}
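One caveat: NID is not one of grok's built-in patterns, so a definition has to be supplied for it. The grok filter's pattern_definitions option lets you define it inline; the regex below is just a placeholder assumption, not something taken from your data:

filter {
  grok {
    # Define a custom pattern named NID inline; replace the regex
    # with whatever actually matches your ID values.
    pattern_definitions => { "NID" => "[a-z_][a-z0-9_-]*" }
    match => { "message" => "%{NID:YourPattern}" }
  }
}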

Thanks @Makra

Tried similar, for example:

filter {
  grok {
    match => { "message" => "%{audit.kernel.actor.attrs.uid:nagios}"
      drop { }
    }
  }
}

But get the following error on starting logstash:

[ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, => at line 13, column 12 (byte 321) after filter {\n grok {\n match => { "message" => "%{audit.kernel.actor.attrs.uid:nagios}"\n drop "}

Line 13 is the drop.
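For what it's worth, that parse error is because drop is a filter plugin in its own right and cannot be nested inside grok's settings; plugins are siblings inside the filter block, with drop usually guarded by a conditional. A minimal sketch of the shape (the pattern and field name here are placeholders, not your real ones):

filter {
  grok {
    match => { "message" => "%{WORD:uid}" }
  }
  # drop is its own filter, wrapped in a conditional,
  # never nested inside another plugin's settings.
  if [uid] == "nagios" {
    drop { }
  }
}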

With the following:

filter {
  drop {
    grok {
      match => { "message" => "%{audit.kernel.actor.attrs.uid:nagios}" }
    }
  }
}

Get the following error:

[ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, => at line 12, column 10 (byte 179) after filter {\n drop {\n grok "}

Line 12 is the grok.

So my guess is that I'm missing (and apologies) some basic understanding of how these work. Once I can get a working version, I'm sure (OK, sincerely hope!) the rest will fall into place.

Just trying to exclude messages received by Logstash so they are not sent to Elasticsearch.

Even when I have created ones that do not error, the messages are still getting through to Elasticsearch.
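One standard way to see exactly what Logstash is receiving (and therefore what field names a conditional can test against) is to temporarily add a stdout output with the rubydebug codec, for example:

output {
  # Temporary debugging aid: print each event's full structure
  # to the console so field names can be checked before
  # writing conditionals against them.
  stdout { codec => rubydebug }
  # ... the existing elasticsearch output stays as-is ...
}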

Wondering if I have just twigged something in relation to grok. I have a predefined template in Elasticsearch, and it is probably this that enables me to see the following:

t @timestamp                                    February 26th 2018, 13:24:04.728
t @version                                      1
t _id                                           AWHSSN4xQjO8IU9hMV6_
t _index                                        auditbeat-6.1.2-2018.02.26
# _score                                        1
t _type                                         logs
t audit.kernel.actor.attrs.auid                 nagios
t audit.kernel.actor.attrs.egid                 nagios
t audit.kernel.actor.attrs.euid                 nagios
t audit.kernel.actor.attrs.fsgid                nagios
t audit.kernel.actor.attrs.fsuid                nagios
t audit.kernel.actor.attrs.gid                  nagios
t audit.kernel.actor.attrs.sgid                 nagios
t audit.kernel.actor.attrs.suid                 nagios
t audit.kernel.actor.attrs.uid                  nagios
t audit.kernel.actor.primary                    nagios
t audit.kernel.actor.secondary                  nagios
t audit.kernel.category                         audit-rule
t audit.kernel.data.a0                          3
t audit.kernel.data.a1                          7ffc116c68c0
t audit.kernel.data.a2                          4000
t audit.kernel.data.a3                          8
t audit.kernel.data.arch                        x86_64
t audit.kernel.data.comm                        sshd
t audit.kernel.data.exe                         /usr/sbin/sshd
t audit.kernel.data.exit                        104
t audit.kernel.data.items                       0
t audit.kernel.data.pid                         23310
t audit.kernel.data.ppid                        23308
t audit.kernel.data.syscall                     read
t audit.kernel.data.tty                         (none)
t audit.kernel.how                              /usr/sbin/sshd
t audit.kernel.key                              b64_call
t audit.kernel.record_type                      syscall
t audit.kernel.result                           success
# audit.kernel.sequence                         25,226,674
t audit.kernel.session                          16846
t beat.hostname                                 soptct61-01.abc.com
t beat.name                                     ptc-desk
t beat.version                                  6.1.2
t host                                          soptct61-01.abc.com
t metricset.module                              audit
t metricset.name                                kernel
t tags                                          beats_input_raw_event

I'm guessing that in Logstash I have to use grok to give me this mapping from the message, and then I can use that to test on the contents of the fields?

e.g. something like

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{TEXT:version} %{TEXT:id} %{TEXT:index} %{TEXT:score} %{TEXT:type} %{TEXT:audit.kernel.actor.attrs.auid} %{TEXT:audit.kernel.actor.attrs.egid} and so on for each of the items in the message" }
}

Once I have done that, I should be able to directly interrogate items in the message?

I'm guessing I have the TEXT syntax incorrect, but I'm just using it as an example.

Perhaps something like this?

TIMESTAMP_ISO8601:timestamp                        February 26th 2018, 13:24:04.728
WORD:version                                       1
WORD:_id                                           AWHSSN4xQjO8IU9hMV6_
WORD:_index                                        auditbeat-6.1.2-2018.02.26
WORD:_score                                        1
WORD:_type                                         logs
WORD:audit.kernel.actor.attrs.auid                 nagios
WORD:audit.kernel.actor.attrs.egid                 nagios
WORD:audit.kernel.actor.attrs.euid                 nagios
WORD:audit.kernel.actor.attrs.fsgid                nagios
WORD:audit.kernel.actor.attrs.fsuid                nagios
WORD:audit.kernel.actor.attrs.gid                  nagios
WORD:audit.kernel.actor.attrs.sgid                 nagios
WORD:audit.kernel.actor.attrs.suid                 nagios
WORD:audit.kernel.actor.attrs.uid                  nagios
WORD:audit.kernel.actor.primary                    nagios
WORD:audit.kernel.actor.secondary                  nagios
WORD:audit.kernel.category                         audit-rule
WORD:audit.kernel.data.a0                          3
WORD:audit.kernel.data.a1                          7ffc116c68c0
WORD:audit.kernel.data.a2                          4000
WORD:audit.kernel.data.a3                          8
WORD:audit.kernel.data.arch                        x86_64
WORD:audit.kernel.data.comm                        sshd
WORD:audit.kernel.data.exe                         /usr/sbin/sshd
WORD:audit.kernel.data.exit                        104
WORD:audit.kernel.data.items                       0
NUMBER:audit.kernel.data.pid                       23310
NUMBER:audit.kernel.data.ppid                      23308
WORD:audit.kernel.data.syscall                     read
WORD:audit.kernel.data.tty                         (none)
PATH:audit.kernel.how                              /usr/sbin/sshd
WORD:audit.kernel.key                              b64_call
WORD:audit.kernel.record_type                      syscall
WORD:audit.kernel.result                           success
WORD:audit.kernel.sequence                         25,226,674
WORD:audit.kernel.session                          16846
WORD:beat.hostname                                 soptct61-01.abc.com
WORD:beat.name                                     ptc-desk
WORD:beat.version                                  6.1.2
WORD:host                                          soptct61-01.abc.com
WORD:metricset.module                              audit
WORD:metricset.name                                kernel
WORD:tags                                          beats_input_raw_event

So something like this (but it is not working):

filter {
  grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:version} %{WORD:_id} %{WORD:_index} %{WORD:_score} %{WORD:_type} %{WORD:audit.kernel.actor.attrs.auid} %{WORD:audit.kernel.actor.attrs.egid} %{WORD:audit.kernel.actor.attrs.euid} %{WORD:audit.kernel.actor.attrs.fsgid} %{WORD:audit.kernel.actor.attrs.fsuid} %{WORD:audit.kernel.actor.attrs.gid} %{WORD:audit.kernel.actor.attrs.sgid} %{WORD:audit.kernel.actor.attrs.suid} %{WORD:audit.kernel.actor.attrs.uid} %{WORD:audit.kernel.actor.primary} %{WORD:audit.kernel.actor.secondary} %{WORD:audit.kernel.category} %{WORD:audit.kernel.data.a0} %{WORD:audit.kernel.data.a1} %{WORD:audit.kernel.data.a2} %{WORD:audit.kernel.data.a3} %{WORD:audit.kernel.data.arch} %{WORD:audit.kernel.data.comm} %{WORD:audit.kernel.data.exe} %{WORD:audit.kernel.data.exit} %{WORD:audit.kernel.data.items} %{NUMBER:audit.kernel.data.pid} %{NUMBER:audit.kernel.data.ppid} %{WORD:audit.kernel.data.syscall} %{WORD:audit.kernel.data.tty} %{PATH:audit.kernel.how} %{WORD:audit.kernel.key} %{WORD:audit.kernel.record_type} %{WORD:audit.kernel.result} %{WORD:audit.kernel.sequence} %{WORD:audit.kernel.session} %{WORD:beat.hostname} %{WORD:beat.name} %{WORD:beat.version} %{WORD:host} %{WORD:metricset.module} %{WORD:metricset.name} %{WORD:tags} " }
  }
  if [audit.kernel.actor.attrs.uid] == "nagios" {
    drop { }
  }
}
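A likely reason that conditional never matches: events arriving via the Beats input are already structured, so no grok on message is needed, and nested fields are referenced in Logstash conditionals with one bracket per level rather than a single dotted name. A sketch under that assumption, using the field shown in Kibana above:

filter {
  # [audit][kernel][actor][attrs][uid] is the Logstash field
  # reference for the field Kibana displays as
  # audit.kernel.actor.attrs.uid
  if [audit][kernel][actor][attrs][uid] == "nagios" {
    drop { }
  }
}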

Sheesh, it is easy once you know!

filter {
  if [user][name_map][auid] == "nagios" {
    drop { }
  }
}
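For anyone extending this: conditionals can be combined with or, so several of the user fields can be tested at once. The field names below follow the working example above and are an assumption for other setups:

filter {
  # Drop the event if either the audit auid or uid maps to nagios
  if [user][name_map][auid] == "nagios" or [user][name_map][uid] == "nagios" {
    drop { }
  }
}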

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.