Filtering Issues with Logstash


(Zeal Vora) #1

Hi

I am new to the ELK stack and have set up a basic ELK stack. I am using Fluentd to forward logs to Logstash.

Here is my basic configuration file for Logstash:

input {
  syslog {
    type => "syslog"
    port => 5141
  }
}

output {
  stdout { }
  elasticsearch { }
}

Currently, as I am using the syslog plugin in the input section, the syslog input automatically parses the data and I get nicely formatted data in Kibana.

But if I send audit logs, since they are not in syslog format, the syslog input does not parse them properly.

What if I want to parse the audit logs coming from remote servers with the grok plugin? The logs arrive from the remote servers on port 5141.

If I add grok in the filter section with an appropriate pattern, will grok leave the syslog messages alone and parse only the audit logs that come in via the syslog input block?


(Magnus Bäck) #2

To apply different filters to different kinds of messages, use conditional blocks.

filter {
  if ... {
    grok {
      ...
    }
  }
}

Since all messages arrive on the same port via the same input, I'm not sure what condition you should use to choose which filter(s) to apply. With examples of both kinds of messages I'd be in a better position to help.
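As a rough sketch, assuming the audit messages are standard Linux auditd lines (which start with something like `type=SYSCALL msg=audit(...)`), you could key the conditional off the start of the message. The field names `audit_type`, `audit_epoch`, and `audit_message` below are just illustrative, and the pattern is an assumption about your log format:

```
filter {
  # Only run grok on messages that look like Linux audit records;
  # everything else (regular syslog) passes through untouched.
  if [message] =~ /^type=\w+ msg=audit/ {
    grok {
      match => {
        "message" => "type=%{WORD:audit_type} msg=audit\(%{NUMBER:audit_epoch}:%{NUMBER:audit_counter}\): %{GREEDYDATA:audit_message}"
      }
    }
  }
}
```

Adjust the condition and pattern to match what your audit logs actually look like.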


(Zeal Vora) #3

Thanks. What would be the ideal way to do it instead of sending all the data to the same port?

Should I use multiple ports, where one port receives syslog messages and another port receives the other kind of messages? Not sure.


(Magnus Bäck) #4

There is no perfect and ideal way. It depends on your data and what makes sense in your situation. You could definitely use two different ports, but whether that's easier than writing a regular expression to distinguish between the different kinds of messages is up to you.
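For the two-port approach, a sketch might look like the following. The second port (5142), the use of a `tcp` input for the audit stream, and the `audit` type label are all assumptions; point Fluentd's audit output at whatever port you pick, and replace the grok pattern with one matching your actual audit format:

```
input {
  syslog {
    type => "syslog"
    port => 5141
  }
  tcp {
    type => "audit"   # hypothetical label so filters can tell the streams apart
    port => 5142      # hypothetical second port dedicated to audit logs
  }
}

filter {
  if [type] == "audit" {
    grok {
      # Placeholder pattern; replace with one that matches your audit lines.
      match => { "message" => "%{GREEDYDATA:audit_raw}" }
    }
  }
}
```

Because each input tags its events with a different `type`, the conditional in the filter section is trivial and no regular expression on the message content is needed.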
