Sending logs from LSF to Redis, then to Logstash

I am able to send Windows Server 2012 logs to Logstash over lumberjack with the configs below, but I now want to ship from Windows Server 2012 to Redis first, and then on to Logstash.

lumberjack config:

{
  "network": {
    "servers": [ "logstash-server:5000" ],
    "timeout": 15,
    "ssl ca": "path"
  },
  "files": [
    {
      "paths": [
        "C:\\ProgramData\\Solarwinds\\Logs\\SEUM\\SEUM.jobs.log",
        "C:\\ProgramData\\Solarwinds\\Logs\\orion\\orionWeb.log",
        "C:\\ProgramData\\Solarwinds\\Logs\\Orion\\Core.BusinessLayer.log"
      ],
      "fields": { "type": "syslog" }
    }
  ]
}

Logstash Server config

input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
output {
  elasticsearch {
    host => "elasticsearch-ip"
    cluster => "elasticsearch_TEST"
  }
  stdout { codec => rubydebug }
}

LSF only supports the Lumberjack protocol.

What other options do I have if I want to use redis as a broker?

I'm not aware of any shippers, apart from Logstash itself, that run on Windows and support Redis.

Okay so use a logstash client (configured as a shipper) and send to redis?

How about sending windows server 2012 logs through the following flow?

lumberjack > Logstash (collector) > Redis > Logstash (indexer) > Elasticsearch

That will work.
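Concretely, the collector's output and the indexer's input just need to agree on the Redis host, data_type, and key. A minimal sketch, assuming a list key named "logstash" and placeholder hostnames:

```text
# Collector side: receive lumberjack, push events onto a Redis list
output {
  redis {
    host => "redis-host"
    data_type => "list"
    key => "logstash"
  }
}

# Indexer side: pop events off the same list
input {
  redis {
    host => "redis-host"
    data_type => "list"
    key => "logstash"
  }
}
```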

I am trying to setup the following flow to send logs to Elasticsearch.

lumberjack > Logstash (collector) > Redis > Logstash (indexer) > Elasticsearch

However, I am having some trouble, configuration-wise, with the Lumberjack (Windows Server 2012) > Logstash (shipper) > Redis > Logstash (indexer) leg. What am I missing?

So far I have

lumberjack (Win server 2012)
{
  "network": {
    "servers": [ "logstash-server-name:5000" ],
    "timeout": 15,
    "ssl ca": "file path"
  },
  "files": [
    {
      "paths": [ "C:\\File path" ]
    }
  ]
}

Logstash (shipper/collector)
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "File path"
    ssl_key => "file path"
  }
}
output {
  stdout { }
  redis {
    host => "redis server"
    data_type => "list"
    key => "logstash"
  }
}

Logstash (indexer)
input {
  redis {
    host => "redis host"
    type => "redis"
    data_type => "list"
    key => "logstash"
  }
}
output {
  stdout { }
  elasticsearch {
    cluster => "elasticsearch"
  }
}

What's the problem? Are the messages not arriving in Redis? Or are they arriving in Redis but not being picked up by Logstash on the other end?
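One quick way to check is to inspect the list directly with redis-cli; if the shipper is working you should see the "logstash" key growing (the key name and a placeholder host are taken from the configs above):

```shell
# Number of events queued in the list (0 can mean nothing is arriving,
# or that the indexer is draining it as fast as it fills)
redis-cli -h redis-host llen logstash

# Peek at the most recently queued event without removing it
redis-cli -h redis-host lrange logstash -1 -1
```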

Logstash (shipper/collector) is not starting, which I believe may be a config issue...

What I see in logstash.log:

{:timestamp=>"2015-10-30T12:18:28.779000-0600", :message=>"The error reported is: \n uninitialized constant Concurrent::Delay::Executor"}
{:timestamp=>"2015-10-30T12:43:26.362000-0600", :message=>"The error reported is: \n uninitialized constant Concurrent::Delay::Executor"}
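To rule out a plain syntax problem, Logstash can be asked to validate the config file without starting; in the 1.x/2.x versions current at the time this was the --configtest flag (install and config paths below are placeholders):

```shell
# Prints "Configuration OK" if the shipper config parses cleanly
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/shipper.conf
```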

Okay, so apparently I needed to update my lumberjack. Now my Logstash collector is running, but no logs are passing...

Not quite sure where the issue is at. Ideas?

Okay, everything is sending now.