I'm wondering how others are managing evolving Grok patterns / Logstash configurations.
Are there any best practices or approaches to make sure that, after making changes to a Logstash configuration, messages that were parsed properly before are still parsed properly?
In a "perfect" world I imagine having some kind of JUnit test where I can define inputs and expected outputs for a given Logstash configuration, and then run it in Eclipse and/or Jenkins to see if something broke.
I recommend a tool I've written, Logstash Filter Verifier. It lets you define inputs that it runs through (possibly a subset of) your Logstash filters, and it compares the result with what you've defined as the expected output.
What's currently lacking is Logstash 5 support, but I hope to address that within the next month. Depending on the kinds of filters you have, you may be able to use Logstash 2.4 with it until Logstash 5 support is available.
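To give a rough idea of the workflow, a test case is a small file pairing input lines with the event fields you expect after filtering. The example below is only an illustrative sketch (the field names, the log line, and the exact file layout are my assumptions for this post, not copied from the tool's documentation), but it shows the shape of the idea: given a grok filter that parses a simple log line, the test case declares the raw input and the parsed fields the filter should produce.

```
# filters/example.conf -- hypothetical filter under test
filter {
  grok {
    match => { "message" => "%{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
```

```json
{
  "input": [
    "ERROR something went wrong"
  ],
  "expected": [
    {
      "message": "ERROR something went wrong",
      "level": "ERROR",
      "msg": "something went wrong"
    }
  ]
}
```

The verifier runs each input line through the filter configuration and fails if the resulting event doesn't match the expected document, which is exactly the regression safety net you'd wire into Jenkins: any change to the grok pattern that alters previously working parses shows up as a test failure.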