Is there a way to setup email alerts for SlowLogs in Elasticsearch on Elastic Cloud?
-
Set up monitoring via our blog: How to set up Elastic Cloud: Advice from Elastic Support
-
Enable Slow Logs, e.g. using kibana_sample_data*:
PUT kibana_sample_data*/_settings
{ "index.search.slowlog.threshold.query.warn": "0s" }
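To confirm the threshold applied, you can optionally check the live index settings (filter_path just trims the response):
GET kibana_sample_data*/_settings?filter_path=*.settings.index.search.slowlog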
-
(Wait for a natural event to occur, or) induce a Slow Log
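Because the warn threshold is now 0s, effectively any search against the sample indices should produce a Slow Log entry; e.g. a simple query from Dev Tools works as an inducer:
GET kibana_sample_data*/_search
{ "query": { "match_all": {} } }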
-
Create an Index Pattern against kibana_sample_data*
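(If you'd rather script this step, the Kibana saved objects API can create the same index pattern. This is only a sketch: the timeFieldName is an assumption for the sample data, and the request goes to the Kibana endpoint with a kbn-xsrf header, not to Elasticsearch.)
POST /api/saved_objects/index-pattern
{ "attributes": { "title": "kibana_sample_data*", "timeFieldName": "timestamp" } }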
-
In Discover, load all Index Pattern data (odds are this takes long enough to trip the 0s threshold and generate a SlowLog). Afterwards, we'll find SlowLog outputs from elastic-cloud-logs-7*.
E.g. NOT this output (which was us updating the settings)
{ "_index": "elastic-cloud-logs-7-2021.08.03-000003", "_type": "_doc", "_id": "v7kQonsBSqtO90ToZdm-", "_score": 1, "_source": { ...}, "fields": { ..., + "log.level": [ "INFO" ], + "elasticsearch.cluster.name": ["a06358269ba54d0599ef41636ae0f6e8"], + "service.name": ["sand_7.13"], + "message": ["[kibana_sample_data_flights] updating [index.search.slowlog.threshold.query.warn] from [-1s] to [0s]"], + "@timestamp": ["2021-09-01T15:53:00.863Z"] } }
But this output, which is for a SlowLog event:
{ "_index": "elastic-cloud-logs-7-2021.08.03-000003", "_type": "_doc", "_id": "4bkkonsBSqtO90To5fqH", "_score": 1, "_ignored": ["elasticsearch.slowlog.source"], "_source": {... }, "fields": { ..., + "elasticsearch.slowlog.took": ["17.9ms"], + "log.level": ["WARN"], + "elasticsearch.cluster.name": ["a06358269ba54d0599ef41636ae0f6e8"], + "elasticsearch.slowlog.search_type": ["QUERY_THEN_FETCH"], + "service.name": ["sand_7.13"], + "fileset.name": ["slowlog"], + "elasticsearch.index.name": ["kibana_sample_data_logs"], + "message": ["{\"type\": \"index_search_slowlog\", \"timestamp\": \"2021-09-01T16:15:26,426Z\", \"level\": \"WARN\", \"component\": \"i.s.s.query\", \"cluster.name\": \"a06358269ba54d0599ef41636ae0f6e8\", \"node.name\": \"instance-0000000000\", \"message\": \"[kibana_sample_data_logs][0]\", \"took\": \"17.9ms\", \"took_millis\": \"17\", \"total_hits\": \"4149 hits\", \"types\": \"[]\", \"stats\": \"[]\", \"search_type\": \"QUERY_THEN_FETCH\", \"total_shards\": \"2\", \"source\": \"{\\\"size\\\":500,\\\"query\\\":{\\\"bool\\\":{\\\"must\\\":[{\\\"query_string\\\":{\\\"query\\\":\\\"*\\\",\\\"fields\\\":[],\\\"type\\\":\\\"best_fields\\\",\\\"default_operator\\\":\\\"or\\\",\\\"max_determinized_states\\\":10000,\\\"enable_position_increments\\\":true,\\\"fuzziness\\\":\\\"AUTO\\\",\\\"fuzzy_prefix_length\\\":0,\\\"fuzzy_max_expansions\\\":50,\\\"phrase_slop\\\":0,\\\"analyze_wildcard\\\":true,\\\"time_zone\\\":\\\"America/Denver\\\",\\\"escape\\\":false,\\\"auto_generate_synonyms_phrase_query\\\":true,\\\"fuzzy_transpositions\\\":true,\\\"boost\\\":1.0}}],\\\"filter\\\":[{\\\"range\\\":{\\\"@timestamp\\\":{\\\"from\\\":\\\"2020-09-01T06:00:00.000Z\\\",\\\"to\\\":\\\"2021-09-01T16:15:26.192Z\\\",\\\"include_lower\\\":true,\\\"include_upper\\\":true,\\\"format\\\":\\\"strict_date_optional_time\\\",\\\"boost\\\":1.0}}}],\\\"adjust_pure_negative\\\":true,\\\"boost\\\":1.0}},\\\"version\\\":true,\\\"_source\\\":false,\\\"stored_fields\\\":\\\"*\\\",\\\"fields\\\":[{\\\"field\\\":\\\"*\\\",\\\"include_unmapped\\\":true},{\\\"field\\\":\\\"@timestamp\\\",\\\"format\\\":\\\"strict_date_optional_time\\\"},{\\\"field\\\":\\\"timestamp\\\",\\\"format\\\":\\\"strict_date_optional_time\\\"},{\\\"field\\\":\\\"utc_time\\\",\\\"format\\\":\\\"strict_date_optional_time\\\"}],\\\"script_fields\\\":{},\\\"sort\\\":[{\\\"@timestamp\\\":{\\\"order\\\":\\\"desc\\\",\\\"unmapped_type\\\":\\\"boolean\\\"}}],\\\"track_total_hits\\\":2147483647,\\\"aggregations\\\":{\\\"2\\\":{\\\"date_histogram\\\":{\\\"field\\\":\\\"@timestamp\\\",\\\"time_zone\\\":\\\"America/Denver\\\",\\\"calendar_interval\\\":\\\"1w\\\",\\\"offset\\\":0,\\\"order\\\":{\\\"_key\\\":\\\"asc\\\"},\\\"keyed\\\":false,\\\"min_doc_count\\\":1}}},\\\"highlight\\\":{\\\"pre_tags\\\":[\\\"@kibana-highlighted-field@\\\"],\\\"post_tags\\\":[\\\"@/kibana-highlighted-field@\\\"],\\\"fragment_size\\\":2147483647,\\\"fields\\\":{\\\"*\\\":{}}}}\", \"id\": \"a548fadc-438e-4016-8753-42d9aa1306f3\", \"cluster.uuid\": \"ZYKnzvOCQYeNWO9tiYwQSA\", \"node.id\": \"Aaw9Rs2JT1aHBGbM1yppNg\" }"], + "@timestamp": ["2021-09-01T16:15:26.426Z"], + "event.dataset": ["elasticsearch.slowlog"] } }
-
We can use these fields to generate a query filter to view only these logs:
{ "query": { "bool": { "must": { "query_string": { "analyze_wildcard": true, "query": "event.dataset:elasticsearch.slowlog AND log.level:WARN" }}}}}
-
Now we'll create a Kibana Alert of type Elasticsearch query using our query. More info: general rule details. I'll just do an index write, but many customers point their Actions to their Email Connectors.
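For reference, here's a hedged sketch of wiring the same rule to email instead, via the Kibana HTTP APIs (7.13+; sent to the Kibana endpoint with a kbn-xsrf header, not to Elasticsearch). The connector name, SMTP details, addresses, and the connector id in the rule are placeholders, not values from this deployment:

# 1. Create an email connector (placeholder SMTP settings)
POST /api/actions/connector
{
  "name": "slowlog-email",
  "connector_type_id": ".email",
  "config": { "from": "alerts@example.com", "host": "smtp.example.com", "port": 587 },
  "secrets": { "user": "alerts@example.com", "password": "REDACTED" }
}

# 2. Create the Elasticsearch query rule, pointing its action at that connector
POST /api/alerting/rule
{
  "name": "slowlog-email-alert",
  "rule_type_id": ".es-query",
  "consumer": "alerts",
  "schedule": { "interval": "5m" },
  "notify_when": "onActiveAlert",
  "params": {
    "index": [ "elastic-cloud-logs-*" ],
    "timeField": "@timestamp",
    "esQuery": "{\"query\":{\"bool\":{\"must\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"event.dataset:elasticsearch.slowlog AND log.level:WARN\"}}}}}",
    "size": 100,
    "threshold": [ 1 ],
    "thresholdComparator": ">",
    "timeWindowSize": 5,
    "timeWindowUnit": "m"
  },
  "actions": [
    {
      "id": "<connector id returned by the previous call>",
      "group": "query matched",
      "params": {
        "to": [ "you@example.com" ],
        "subject": "SlowLog alert",
        "message": "SlowLog query matched in the last 5 minutes."
      }
    }
  ]
}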
-
Once created, the related Kibana Alert JSON on the backend will look like:
```
GET .kibana/_search?filter_path=hits.hits
{"query":{"term":{"alert.name.keyword":{"value":"test_00796515"}}}}

{
  "hits" : {
    "hits" : [
      {
        "_index" : ".kibana_7.14.0_001",
        "_type" : "_doc",
        "_score" : 1.8382792,
        "_id" : "alert:5cef3390-0b42-11ec-b5ac-13d3190df5d9",
        "_source" : {
          "alert" : {
            "params" : {
              "esQuery" : """{ "query": { "bool": { "must": { "query_string": { "analyze_wildcard": true, "query": "event.dataset:elasticsearch.slowlog AND log.level:WARN" }}}}}""",
              "size" : 100,
              "timeWindowSize" : 5,
              "timeWindowUnit" : "m",
              "threshold" : [ 1 ],
              "thresholdComparator" : ">",
              "index" : [ "elastic-cloud-logs-*" ],
              "timeField" : "@timestamp"
            },
            "consumer" : "alerts",
            "schedule" : { "interval" : "5m" },
            "tags" : [ ],
            "name" : "test_00796515",
            "throttle" : null,
            "actions" : [
              {
                "group" : "query matched",
                "params" : { "documents" : [ { "test" : "success, yay!" } ] },
                "actionRef" : "action_0",
                "actionTypeId" : ".index"
              }
            ],
            "enabled" : true,
            "alertTypeId" : ".es-query",
            "notifyWhen" : "onActiveAlert",
            "apiKeyOwner" : "199699336",
            "apiKey" : "REDACTED",
            "createdBy" : "199699336",
            "updatedBy" : "199699336",
            "createdAt" : "2021-09-01T16:33:40.474Z",
            "updatedAt" : "2021-09-01T16:33:40.474Z",
            "muteAll" : false,
            "mutedInstanceIds" : [ ],
            "executionStatus" : {
              "status" : "active",
              "lastExecutionDate" : "2021-09-01T16:33:45.425Z",
              "error" : null
            },
            "meta" : { "versionApiKeyLastmodified" : "7.14.0" },
            "scheduledTaskId" : "5e0389c0-0b42-11ec-b5ac-13d3190df5d9"
          },
          "type" : "alert",
          "references" : [
            { "id" : "3d72e610-0b42-11ec-b5ac-13d3190df5d9", "name" : "action_0", "type" : "action" }
          ],
          "migrationVersion" : { "alert" : "7.13.0" },
          "coreMigrationVersion" : "7.14.0",
          "updated_at" : "2021-09-01T16:33:46.565Z"
        }
      }
    ]
  }
}
```
-
Since our alert's marked executionStatus.status:active, we can check that our test fired. Once confirmed, we're good to go.
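To spot-check later, the same saved-object query can be scoped down to just the execution status:
GET .kibana/_search?filter_path=hits.hits._source.alert.executionStatus
{"query":{"term":{"alert.name.keyword":{"value":"test_00796515"}}}}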
If you want to do #5 via Watcher to output the actual response body, a local execution would look something like:
PUT _watcher/watch/_execute
{
  "watch": {
    "trigger": { "schedule": { "interval": "1h" } },
    "input": {
      "search": {
        "request": {
          "search_type": "query_then_fetch",
          "indices": [ "elastic-cloud-logs*" ],
          "rest_total_hits_as_int": true,
          "body": {
            "query": {
              "bool": {
                "must": {
                  "query_string": {
                    "analyze_wildcard": true,
                    "query": "event.dataset:elasticsearch.slowlog AND log.level:WARN"
                  }
                },
                "filter": [
                  { "range": { "@timestamp": { "gte": "now-1h", "lte": "now" } } }
                ]
              }
            }
          }
        }
      }
    },
    "condition": { "compare": { "ctx.payload.hits.total": { "gt": 0 } } },
    "actions": {
      "log": {
        "logging": {
          "text": """SlowLogs: {{#ctx.payload.hits.hits}}
message: {{_source.message}}
{{/ctx.payload.hits.hits}}""",
          "level": "warn"
        }
      }
    }
  }
}