General guidance needed, multiple log files

Hi,

I'm new to Logstash and need some guidance on input from multiple log files.

I would like to send all of my logs (e.g. messages, secure, httpd ...) to ELK.

What I have found is that if I send them through rsyslog, they end up as one collective stream in Kibana/Elasticsearch.

How do I separate them so they are recognised as messages, secure, httpd, and so on? At the moment they are all tagged as type: syslog.

Do I need to create multiple config files with different filters on input? But then do I need to send every single file to a different port?

What's the best practice in this case?

How do I separate them so they are recognised as messages, secure, httpd, and so on? At the moment they are all tagged as type: syslog.

You can differentiate different types of messages either by setting different types (they don't all have to be syslog) or by setting another field that indicates what type of message it is. The input plugins can add arbitrary fields via the add_field parameter.
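For example, a minimal sketch with the file input (the paths and field names here are just placeholders for illustration):

```
input {
  file {
    path => "/var/log/secure"
    type => "secure"
  }
  file {
    path => "/var/log/httpd/access_log"
    # or keep a common type and distinguish with a custom field instead
    type => "syslog"
    add_field => { "log_source" => "httpd" }
  }
}
```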

Do I need to create multiple config files with different filters on input?

Keep in mind that all configuration files are concatenated by Logstash (in alphabetical order) to effectively form a single large file. Separating configuration stanzas into different files only has meaning to you as a sysadmin. If you want different filters to apply to different types of messages you need to use conditionals; a filter placed in one file will otherwise still affect all messages passing through Logstash.
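A rough sketch of such a conditional, assuming the httpd type from above (the grok pattern is just an example):

```
filter {
  if [type] == "httpd" {
    grok {
      # only httpd events are parsed with this pattern;
      # everything else passes through this filter untouched
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}
```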

But then do I need to send every single file to a different port?

Are you talking about when you send to Logstash (from the syslog daemon or similar) or when Logstash sends the message elsewhere?

One approach that we use is to use a message queuing system instead of syslog. That way if there are network interruptions, the logs are buffered for a time.

We use Kafka and create a separate 'topic' for each type of server (web, database, mail). Then in Logstash we have 3 configs that pull from the respective topics.
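Roughly along these lines, per topic (note the option names vary between versions of the kafka input plugin; older versions use topic_id and zk_connect instead of topics and bootstrap_servers, and the broker address below is a placeholder):

```
input {
  kafka {
    # one config per topic: "web" here, "database" and "mail" in the others
    topics => ["web"]
    bootstrap_servers => "kafka1.example.com:9092"
  }
}
```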

You could alternatively put the Logstash agent on each of your nodes and have Logstash 'tag' the data before it sends it out to syslog, redis, kafka etc. There is a great puppet module and chef cookbook that make this pretty easy to do.
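Something along these lines, assuming a redis broker (the host name and key are placeholders):

```
# hypothetical edge-node config: tag locally, ship to a central broker
input {
  file {
    path => "/var/log/httpd/access_log"
    tags => ["httpd"]
  }
}
output {
  redis {
    host => "broker.example.com"   # placeholder for your redis host
    data_type => "list"
    key => "logstash"
  }
}
```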

Thank you @magnusbaeck and @spuder for your advice.

I started experimenting with logstash-forwarder (aka lumberjack) rather than rsyslog, as it was easier for me to tag the log files, although I guess the same can somehow be achieved with rsyslog.
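In case it helps anyone else, this is roughly what I mean: a minimal logstash-forwarder config where each file gets its own type via the fields option (the server name and certificate path are placeholders; JSON doesn't allow comments, so adjust to your own setup):

```
{
  "network": {
    "servers": [ "logstash.example.com:5043" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    { "paths": [ "/var/log/secure" ], "fields": { "type": "secure" } },
    { "paths": [ "/var/log/httpd/*.log" ], "fields": { "type": "httpd" } }
  ]
}
```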