Just wondering if anyone has run into the sort of scenario I'm about to describe, and how you handled it. The scenario specifically relates to using Filebeat to get Chef logs off servers and into Elasticsearch. There are a few different goals/outcomes I'd want to achieve with this, such as:
- Having a focused set of data when fatal converge errors happen (so I'm capturing the errors and their associated information)
- Having another set of focused data around deprecation errors/warnings (so I'm capturing when people in the organisation are using cookbook functionality that isn't appropriate)
- Capturing the full log for general analysis over a limited time (not kept long term)
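For context, this is roughly the kind of Filebeat input I have in mind for picking up the Chef client log. It's only a sketch: the log path and the multiline pattern are assumptions on my part (Chef client log entries typically start with a bracketed timestamp, and stack traces span multiple lines), not something I've settled on.

```yaml
# Hypothetical filebeat.yml input for the Chef client log.
# The path and multiline pattern are assumptions, adjust for your setup.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/chef/client.log
    multiline:
      # Treat any line NOT starting with a [timestamp] as a continuation
      # of the previous event, so stack traces stay in one event.
      pattern: '^\[\d{4}-\d{2}-\d{2}'
      negate: true
      match: after
```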
Each outcome is based on the same Chef log file. The question of approach is where the splitting into these three views happens. From my experience with Elasticsearch and other components so far, some options that come to mind are:
- Throw it all into a single index and rely on Kibana's tooling (saved searches, filters, etc.) to achieve the three outcomes
- Have an individual index for each outcome (and deconstruct the log in either Filebeat or Logstash before passing events on to the relevant index)
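For the second option, the Logstash side might look something like the sketch below: tag events by matching on the message, then route tagged events to per-outcome indices while everything also lands in a short-retention full-log index. The match strings, field names, and index names are purely illustrative assumptions, not a worked-out design.

```
# Hypothetical Logstash pipeline sketch for the per-outcome-index option.
filter {
  if "FATAL" in [message] {
    mutate { add_tag => ["chef_fatal"] }
  }
  if "DEPRECATION" in [message] {
    mutate { add_tag => ["chef_deprecation"] }
  }
}

output {
  if "chef_fatal" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "chef-fatal-%{+YYYY.MM.dd}"
    }
  }
  if "chef_deprecation" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "chef-deprecation-%{+YYYY.MM.dd}"
    }
  }
  # Every event also goes into the full-log index, which would be
  # kept on a short retention policy.
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "chef-full-%{+YYYY.MM.dd}"
  }
}
```

The duplication cost here is modest (fatal and deprecation events are a small fraction of the full log), and it keeps the focused indices small enough to retain long term while the full-log index gets aged out.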
Any thoughts or experiences with this sort of situation would be appreciated.