Pipe syslog into Redis

Hello,

I am just starting out with ELK and have a question that I can't seem to find an answer to. This is my first post, so I hope this is the right spot to ask. I've looked at The Logstash Book by James Turnbull and searched both here and on Google, but I still can't find an answer to my question: how do I get syslog messages into Redis?

The Logstash Book explains how to add a syslog input to the Logstash central.conf file, but from my understanding that traffic doesn't go through Redis. Is this correct? Wouldn't you want syslog messages to go through a broker instead of directly to the Logstash Indexer? If not, what's the best-practice method for handling syslog messages?

One of my goals with this project is to get our Cisco devices, or any other device that has to use syslog, to ship messages to the Logstash Indexer. Maybe Redis is not the route to go, but most of my research suggests there should be some sort of broker (Redis, RabbitMQ, Kafka) between the shipper and the Indexer.

Any help or suggestions on this topic will be much appreciated. Thanks for your time.

A Logstash pipeline consists of one or more inputs, one or more outputs, and optionally one or more filters. You have one input (the syslog one), so now you need to add a redis output to get the behavior you want.
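For example, a minimal sketch of a shipper config; the Redis host and key names here are just placeholders to adjust for your environment:

```
# shipper.conf -- receive syslog messages and push them onto a Redis list
input {
  syslog {
    port => 5514                        # unprivileged port; the default 514 needs root
  }
}

output {
  redis {
    host      => "redis.example.com"    # placeholder: your Redis host
    data_type => "list"                 # push events onto a Redis list
    key       => "logstash"             # list name the indexer will read from
  }
}
```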

Magnus, thanks for the reply.

Just so I am clear: are you suggesting I configure the Logstash Indexer config file to input syslog and then output to Redis?
Wouldn't that defeat the purpose of having Redis? I could be wrong, as I am just starting out, but I would think the desired behavior would be to have all shipper messages buffered through Redis before getting to the Logstash Indexer.

I'm suggesting that you have one Logstash instance to receive syslog messages and send them to Redis, and another instance to read from Redis and do whatever it is that you want to do with your log messages.

You could do all of this with a single instance, but since a Logstash instance only has one pipeline, a multi-purpose configuration tends to get rather convoluted.
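The second instance's config might look something like this sketch; it assumes the shipper above pushes to a list named logstash, and a recent Logstash where the elasticsearch output takes a hosts array:

```
# indexer.conf -- pull events off Redis, filter them, and index them
input {
  redis {
    host      => "redis.example.com"    # same Redis host the shipper writes to
    data_type => "list"
    key       => "logstash"             # must match the shipper's key
  }
}

# filters would go here, e.g. grok patterns for your Cisco message formats

output {
  elasticsearch {
    hosts => ["localhost:9200"]         # placeholder: your Elasticsearch address
  }
}
```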


Sounds like you just want to target your Cisco devices at a syslog-compatible receiver.

You could use the syslog input: https://www.elastic.co/guide/en/logstash/master/plugins-inputs-syslog.html and then have it ship directly to Elasticsearch.
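A single-instance sketch of that, with a placeholder port and Elasticsearch address:

```
# one pipeline: syslog straight to Elasticsearch, no broker
input {
  syslog {
    port => 5514                   # point your Cisco devices here
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]    # placeholder: your Elasticsearch address
  }
}
```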

You could also configure a more general central syslogd server, have it log everything to disk, and then run Logstash with a file input to read the messages from there.
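That variant might look like this; the log path is a placeholder, and SYSLOGLINE is one of the stock grok patterns for parsing raw syslog lines:

```
# read the files your central syslogd writes to disk
input {
  file {
    path           => "/var/log/network/*.log"   # placeholder: wherever syslogd writes
    start_position => "beginning"                # pick up existing content on first run
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGLINE}" }    # parse the raw syslog line
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```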

The redis, rabbitmq, kafka stuff really comes into play when you have a large enough volume of logs that you want to move them all through a central service and have dedicated "indexers", aka Logstash processes that write to Elasticsearch. Usually in that setup you have Logstash read from various inputs, a messaging service (rabbitmq, kafka, redis, etc.) or even just straight inputs like TCP or syslog, and then write to Elasticsearch.

You would decouple the cisco syslog -> syslogd -> file -> logstash steps.

You could do something as simple as: cisco syslog -> Logstash server with the syslog input enabled -> Logstash elasticsearch output -> Elasticsearch. Or: cisco syslog -> syslogd service that writes to a file -> Logstash file input -> Logstash elasticsearch output -> Elasticsearch.

Or, even more robust: cisco syslog -> syslogd service -> Logstash file input -> Logstash kafka output -> Kafka broker -> Logstash kafka input -> Logstash elasticsearch output -> Elasticsearch
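Sketched as two configs for the Kafka variant; the broker address and topic name are placeholders, and exact option names vary by kafka plugin version (these match recent releases):

```
# shipper.conf -- tail the syslogd files and hand everything to Kafka
input {
  file {
    path => "/var/log/network/*.log"       # placeholder: wherever syslogd writes
  }
}
output {
  kafka {
    bootstrap_servers => "kafka1:9092"     # placeholder: your Kafka broker
    topic_id          => "syslog"          # placeholder: topic name
  }
}

# indexer.conf -- separate instance that drains the topic into Elasticsearch
input {
  kafka {
    bootstrap_servers => "kafka1:9092"
    topics            => ["syslog"]        # must match the shipper's topic
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```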

etc.

Hope this helps flesh out the idea more.

Thanks for the suggestions, Magnus and Joe! I'll probably play around with it this week and see what works best.