I'm using Monolog with PHP, with beaver forwarding the logs to a Redis queue, and Redis feeding the queued logs to Logstash. I've monitored every step up to Logstash and everything looks normal. But once Logstash gets its loggy little hands on each error log, it creates one beautiful entry for Elasticsearch — and one completely worthless entry, in which every value is null except for the timestamp, which looks like it's being generated on insert rather than passed along. Has anyone seen this before? It's not the biggest deal, since I can query with Kibana to see only the logs that actually have data; I just don't understand why or how these trash entries are being generated alongside the successful ones.
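For context, the Logstash side of this pipeline looks roughly like the sketch below. The Redis key name and the use of the `json` codec are assumptions on my part, not copied from my actual config:

```
input {
  redis {
    host      => "127.0.0.1"
    data_type => "list"       # beaver pushes onto a Redis list by default
    key       => "logstash"   # assumed list/queue name
    codec     => json         # parse each queued entry as JSON
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
}
```

If the codec didn't match what beaver actually puts on the queue, I'd expect parse failures, but I'm not sure that alone would explain a second, all-null event per log line.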
mtopolski (Mtopolski) #1
warkolm (Mark Walkom) #2
Providing your config would help