Logstash Shipping All Logs Except One

Hello,

I have a working Logstash configuration that ships syslog and Apache logs to a Graylog2 server - we love it, works great. Now we need to scrape an application's log file and ship it to the Graylog server too. I updated the Logstash configuration file and the syslog and Apache logs continue to flow, but the new logs only show up on the Graylog server under strange circumstances.

At first I had a problem with the grok filter for the new log. Entries tagged with _grokparsefailure would show up on the Graylog server, but only if I ran Logstash in the foreground ( /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf -l /tmp/mylog.out ). In daemon mode it shipped syslog and Apache, but not myapp. Worked in the foreground, didn't work in the background.

Then I fixed the grok filter problem. Now, when I run Logstash in debug mode, I no longer see the _grokparsefailure tags - but the myapp logs still aren't reaching the server. Syslog and Apache are still flowing to the Graylog server, but I get only silence from myapp.
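For what it's worth, here's the throwaway config I've been using to exercise the myapp grok pattern by itself: stdin in, rubydebug out, so I can paste sample lines into a foreground Logstash and see the parsed event immediately. Note the escaped square brackets around the UUID - without the backslashes, grok reads [ ] as a regex character class, which I suspect was behind my original _grokparsefailure.

input {
  stdin {
    type => "myapp"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:time} \[%{UUID:uuid}\] %{GREEDYDATA:message}" }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

I save that as /tmp/myapp-test.conf (the name is arbitrary), run /opt/logstash/bin/logstash -f /tmp/myapp-test.conf, and paste in a line from /home/myapp/myapp/logs/current.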

At this point I'm stuck. myapp is generating the logs correctly on the local file system. Apache and syslog continue to get pushed to the Graylog server so it isn't an infrastructure issue. I believe it's a bug or, more likely, a misconfiguration by a newbie (me).

Any ideas on where to go next? Thanks in advance for your time!

tl;dr Added a new log file to Logstash. It pushes the log to the server only when Logstash is running in the foreground and the grok filter misses and tags events with _grokparsefailure. With the grok filter fixed, the logs are not sent to the server regardless of whether Logstash runs in the foreground or background. Existing logs continue to flow regardless of configuration.

My config file is below. Please note I'm running Logstash 1.4.5, and upgrading would be quite an uphill battle.

input {
  syslog {
    type => "syslog"
    port => 5514
  }

  file {
    type => "apache-error"
    path => "/var/log/apache2/apache_error.log"
    add_field => [ "facility_label", "apache/error" ]
  }
  file {
    type => "mongos"
    path => "/var/log/mongos/mongos.log"
    add_field => [ "facility_label", "mongos" ]
  }
  file {
    type => "myapp"
    path => "/home/myapp/myapp/logs/current"
    add_field => [ "facility_label", "myapp" ]
  }
}

filter {
  if [type] == "apache-access" {
    grok {
      pattern => "%{COMBINEDAPACHELOG}"
    }
  } else if [type] == "mongodb" {
    grok {
      # the angle brackets in the named captures got stripped when I pasted;
      # the third capture name (millis) is my best reconstruction
      match => { "message" => "nscanned:(?<nscanned>[0-9]+).?nreturned:(?<nreturned>[0-9]+).?(?<millis>[0-9]+)ms" }
    }
  } else if [type] == "myapp" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:time} \[%{UUID:uuid}\] %{GREEDYDATA:message}" }
    }
  }

  mutate {
    replace => [ "@source_host", "myapp-server-01" ]
  }
}

output {
  gelf {
    host => "12.34.56.78"
    port => "12201"
  }
}
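One thing I plan to try next: temporarily add a plain file output next to gelf, so I can tell whether myapp events are making it through the filter stage at all. If they land in the local debug file but never reach Graylog, the problem is on the gelf side; if the file stays empty, the file input or the filter is eating them. (The path below is just a scratch location I picked.)

output {
  gelf {
    host => "12.34.56.78"
    port => "12201"
  }
  # temporary, debugging only
  file {
    path => "/tmp/logstash-debug.out"
  }
}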

Any ideas on this one? To my newbie eyes it looks correct, but the logs just aren't flowing.

One final bump. The Mongo logs are flowing too - it's just the damn 'myapp' logs that aren't getting pushed to the server.
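While writing this up, one more thing occurred to me: the file input keeps its read position in a sincedb file under the home directory of whatever user Logstash runs as, so my foreground (root) runs and the daemon would each track positions separately - and with the default start_position of "end", a fresh run only picks up lines written after it starts. I'm going to try pinning the sincedb and reading from the top (the sincedb_path below is just a location I chose):

file {
  type => "myapp"
  path => "/home/myapp/myapp/logs/current"
  add_field => [ "facility_label", "myapp" ]
  # read from the start the first time this file is seen
  start_position => "beginning"
  # share one position database between foreground and daemon runs
  sincedb_path => "/var/lib/logstash/sincedb-myapp"
}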

Thanks in advance.