I am new to the ELK stack and have set up a basic ELK stack. I am using fluentd to forward logs to Logstash.
Here is my basic configuration file for Logstash:
input {
  syslog {
    type => "syslog"
    port => 5141
  }
}
output {
  stdout { }
  elasticsearch { }
}
Currently, as I am using the syslog plugin in the input section, syslog messages are automatically parsed and I get nicely formatted data in Kibana.
But if I send audit.log entries, since they are not syslog-formatted, the syslog input does not parse them properly.
What if I want to parse the audit.log entries coming from remote servers with the grok plugin? The logs arrive from the remote servers on port 5141.
If I add grok in the filter section with an appropriate pattern, will grok leave the syslog messages alone and parse only the audit.log entries coming in via the syslog input block?
To apply different filters to different kinds of messages, use conditional blocks.
filter {
  if ... {
    grok {
      ...
    }
  }
}
Since all messages arrive on the same port and through the same input, I'm not sure what condition you should use when choosing which filter(s) to apply. With examples of both kinds of messages I'd be in a better position to help.
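For example, if the audit logs come from Linux auditd, each line starts with a "type=... msg=audit(...)" preamble, so a conditional on that prefix could route only those events through grok. This is just a sketch; the regular expression and the grok pattern are assumptions you'd need to adapt to your actual messages:

filter {
  # Hypothetical condition: match lines that look like auditd output,
  # e.g. type=SYSCALL msg=audit(1364481363.243:24287): ...
  if [message] =~ /^type=\w+ msg=audit/ {
    grok {
      match => {
        "message" => "type=%{WORD:audit_type} msg=audit\(%{NUMBER:audit_epoch}:%{NUMBER:audit_counter}\): %{GREEDYDATA:audit_details}"
      }
    }
  }
}

Everything that doesn't match falls through the conditional untouched, so normal syslog messages keep the parsing the syslog input already did.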
There is no perfect, ideal way; it depends on your data and what makes sense in your situation. You could certainly use two different ports instead, if that's easier than writing a regular expression to distinguish between the different kinds of messages.
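If you go the two-port route, it could look something like this (the tcp input, port 5142, and the grok pattern are assumptions; you'd point fluentd's audit.log output at the second port):

input {
  syslog {
    type => "syslog"
    port => 5141
  }
  # Assumed second listener dedicated to the audit logs (port number is arbitrary)
  tcp {
    type => "audit"
    port => 5142
  }
}
filter {
  # Events from the second input carry type "audit", so the condition is trivial
  if [type] == "audit" {
    grok {
      match => { "message" => "type=%{WORD:audit_type} msg=audit\(%{NUMBER:audit_epoch}:%{NUMBER:audit_counter}\): %{GREEDYDATA:audit_details}" }
    }
  }
}

That trades the regular expression for a second listener, which is often easier to reason about.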