Create watches for windows event logs

alerting

(Joseph Sanchez) #1

I've watched the Watcher videos, and I don't think I would have many issues if I managed a Linux environment. Unfortunately, I manage a Windows environment, and I'm stuck on how to craft a watch for Windows event logs. Does anyone have a template, or know where I can find an example watch for an event log? I think I can figure out the rest; I just need a little help with how to format a watch for event logs.


(Steve Kearns) #2

Hi Joseph,

The first step is to get the Windows event logs into Elasticsearch. If you're not already doing that, I suggest checking out Winlogbeat, a lightweight data shipper that can gather events from the Windows Event Log and send them to Elasticsearch.
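If you go that route, a minimal winlogbeat.yml might look something like this (a sketch based on the 1.x-era config layout; the event log names and Elasticsearch host are placeholders you would adjust for your environment):

```
winlogbeat:
  event_logs:
    - name: Application
    - name: System
    - name: Security

output:
  elasticsearch:
    hosts: ["http://localhost:9200"]
```

Winlogbeat ships to daily winlogbeat-* indices by default, which is what you'd point your Kibana index pattern (and later your watch) at.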

At that point, you can use Kibana to create a dashboard over this data. Each Kibana visualization has a little up arrow at the bottom, which opens the "spy" panel showing the exact Elasticsearch query that was run to generate the chart. With that query as a starting point, you should be able to create a watch that uses it as the input.

I hope that helps.

Thanks,
Steve


(Joseph Sanchez) #3

Steve,
Thanks for your response. Sorry, I should have mentioned that I have the ELK Stack in production and am already getting event logs from my Windows boxes. I've made a few visualizations from searches I created in Discover; nothing special, just errors and other simple events. I loaded up a visualization, clicked the up arrow, then selected Response, which displayed:

{
  "size": 0,
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "1h",
        "time_zone": "America/Chicago",
        "min_doc_count": 1,
        "extended_bounds": {
          "min": 1472101200000,
          "max": 1472104799999
        }
      }
    }
  },
  "query": {
    "filtered": {
      "query": {
        "query_string": {
          "analyze_wildcard": true,
          "query": "error"
        }
      },
      "filter": {
        "bool": {
          "must": [
            {
              "query": {
                "query_string": {
                  "query": "*",
                  "analyze_wildcard": true
                }
              }
            },
            {
              "range": {
                "@timestamp": {
                  "gte": 1472101200000,
                  "lte": 1472104799999,
                  "format": "epoch_millis"
                }
              }
            }
          ],
          "must_not": []
        }
      }
    }
  },
  "highlight": {
    "pre_tags": [
      "@kibana-highlighted-field@"
    ],
    "post_tags": [
      "@/kibana-highlighted-field@"
    ],
    "fields": {
      "*": {}
    },
    "require_field_match": false,
    "fragment_size": 2147483647
  }
}

I assume this is what you mean as the query for the starting point?


(Steve Kearns) #4

Oh, that's great. Yes, that is the query starting point I was referring to. You can see me walk through an example of exactly this (creating a watch starting from a query in the Kibana spy panel) in a webinar from earlier this year, starting at about 17:30:

https://www.elastic.co/webinars/watcher-practical-alerting-for-elasticsearch

I hope that helps!
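For reference, a minimal watch wrapping a query like yours might look something like this (a sketch assuming the Watcher 1.x API on Elasticsearch 2.x, to match the `filtered` query syntax in your paste; the watch ID, index pattern, and action are placeholders, and the fixed epoch timestamps are swapped for a relative `now-1h` range so the watch always checks the most recent hour):

```
PUT _watcher/watch/windows_error_watch
{
  "trigger": {
    "schedule": { "interval": "1h" }
  },
  "input": {
    "search": {
      "request": {
        "indices": [ "winlogbeat-*" ],
        "body": {
          "query": {
            "filtered": {
              "query": {
                "query_string": {
                  "query": "error",
                  "analyze_wildcard": true
                }
              },
              "filter": {
                "range": {
                  "@timestamp": { "gte": "now-1h" }
                }
              }
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": { "ctx.payload.hits.total": { "gt": 0 } }
  },
  "actions": {
    "log_errors": {
      "logging": {
        "text": "{{ctx.payload.hits.total}} error events in the last hour"
      }
    }
  }
}
```

Note that the aggregation and highlight sections from the Kibana spy output can be dropped for a watch; you generally only need the query, a relative time filter, and a condition on the hit count.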


(Joseph Sanchez) #5

Sorry for the late reply. This seems like a great starting point; thanks for your help. On a side note, my boss was at a DevOps conference in Chicago this week and brought me back an early release of "Elasticsearch: The Definitive Guide". I'm looking forward to learning as much as I can!
