ELK gathered data/rules documentation

Hello,

I have been working with Logstash for some time now. I have reached the point where I have a couple of event sources, I am integrating the next ones, and I am training operators to work with the data.

I was wondering if there is any kind of tool/framework that would help with integrating new sources and generating the documentation. The number of log sources to integrate makes it worth investing some (vacation) time.

So I'm facing these problems:

  • New event integration/parsing/processing
    -- Testing new event integrations (grok filters, multiline blocks, other Logstash configuration rules)
    -- Avoiding regressions when adding new sources (repeatable tests)
  • Documenting the events
    -- Avoiding _all searches in favor of informed use of event fields
    -- Working with events that happen rarely
    -- Communicating what is integrated and what the events look like

So far I have found only https://grokdebug.herokuapp.com/ for the first problem, and it covers grok only. The unit tests for each plugin are wonderful, but I'm talking about integration-level tests.
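For repeatable, integration-level checks that don't require spinning up Logstash, one option is to approximate the grok pattern as a plain regex with named groups and assert on the extracted fields. This is only a minimal sketch, not an official Logstash tool; the pattern, sample log line, and field names below are invented for illustration:

```python
import re

# Hypothetical example: a grok pattern such as
#   %{IP:client} %{WORD:method} %{URIPATH:path} %{NUMBER:duration}
# roughly corresponds to a regex with named capture groups.
PATTERN = re.compile(
    r"(?P<client>\d{1,3}(?:\.\d{1,3}){3}) "
    r"(?P<method>\w+) "
    r"(?P<path>/\S*) "
    r"(?P<duration>\d+(?:\.\d+)?)"
)

def parse_event(line):
    """Return the extracted fields for one log line, or None if it doesn't match."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

# Repeatable regression tests: sample lines live next to the rules,
# so integrating a new source re-runs every existing expectation.
def test_known_good_line():
    fields = parse_event("192.168.1.10 GET /index.html 42")
    assert fields == {
        "client": "192.168.1.10",
        "method": "GET",
        "path": "/index.html",
        "duration": "42",
    }

def test_garbage_line_does_not_match():
    assert parse_event("not a valid event") is None

test_known_good_line()
test_garbage_line_does_not_match()
```

The same idea scales to a directory of `(sample line, expected fields)` fixtures run under any test runner, which gives the regression safety net from the list above without hand-checking each source.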

The documentation can be hand-crafted, but we all know how hard it is to keep it in sync with the rules when the two are kept separately. A Doxygen-style approach, familiar to PHP/Java/C programmers, would be great here.
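To get that Doxygen-style "docs live next to the rules" effect, one could scan the filter configs for structured comments and render them as markdown. The sketch below assumes an invented annotation convention (`#@source`, `#@field`) that is not part of Logstash; it is just one possible shape for such a generator:

```python
import re

# Hypothetical annotation convention inside a Logstash filter file:
#   #@source apache-access
#   #@field client  Source IP of the request
# The generator extracts these comments and renders a markdown field
# table, so the docs cannot drift far from the rules they sit beside.
SOURCE_RE = re.compile(r"^\s*#@source\s+(\S+)\s*$")
FIELD_RE = re.compile(r"^\s*#@field\s+(\S+)\s+(.*)$")

def generate_docs(config_text):
    """Render markdown documentation from annotated Logstash config text."""
    lines = ["# Integrated event sources", ""]
    for raw in config_text.splitlines():
        if (m := SOURCE_RE.match(raw)):
            lines += ["## " + m.group(1), "",
                      "| Field | Description |", "| --- | --- |"]
        elif (m := FIELD_RE.match(raw)):
            lines.append("| %s | %s |" % m.groups())
    return "\n".join(lines)

sample = """\
filter {
  #@source apache-access
  #@field client Source IP of the request
  #@field method HTTP verb
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
}
"""
print(generate_docs(sample))
```

Running the generator in CI alongside the regression tests would also surface rare events: their fields stay documented even when no recent example exists in the index.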

So I've taken some vacation time to think about this and build a simple POC of what I would expect. Here is the result:


What is done:

  • A unit-test POC (event sources, processing, unit tests on the results)
  • Some documentation with the first auto-generated sections

Any feedback on the approach, the implementation, and your own experiences with this is much appreciated. My web search skills also sometimes fail me, so maybe you already know of a solution to this problem?

Regards,

Very neat!

I've passed it onto the Logstash team :slight_smile: