Watch using Slack not working (6.4.3)

I tested the same watch using a log action and it was triggering, so I removed the log action and replaced it with a notify-slack action, but I did not get any notification in the Slack channel.

    {
      "trigger": {
        "schedule": {
          "interval": "5m"
        }
      },
      "input": {
        "search": {
          "request": {
            "indices": [
              "logstash*"
            ],
            "body": {
              "query": {
                "bool": {
                  "must": [
                    { "match": { "log": "*error*" }},
                    { "range": { "@timestamp": {
                        "gte": "now-1m",
                        "lte": "now" }}}
                  ]
                }
              },
              "_source": [
                "kubernetes.container_name",
                "kubernetes.host",
                "log"
              ],
              "sort": [
                {
                  "@timestamp": {
                    "order": "desc"
                  }
                }
              ]
            }
          }
        }
      },
      "condition": {
        "compare": {
          "ctx.payload.hits.total": {
            "gt": 0
          }
        }
      },
      "throttle_period": "5m",
      "actions": {
        "notify-slack" : {
          "throttle_period" : "5m",
          "slack" : {
            "message" : {
              "to" : [ "#channel_name", "@username" ],
              "text" : "Encountered  {{ctx.payload.hits.total}} errors in the last 5 minutes (facepalm)"
            }
          }
        }
      }
    }
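
For context, a watch body like this is registered through the Watcher PUT API. A minimal sketch, assuming the host and that the body above is saved as watch.json (on 6.x the endpoint lives under _xpack):

    curl -X PUT "localhost:9200/_xpack/watcher/watch/check_for_errors_SLACK_TEST" \
      -H 'Content-Type: application/json' \
      -d @watch.json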

In the elasticsearch.yml file, the Slack notification URL is set:

    xpack.notification.slack.account.monitoring.url: https://hooks.slack.com/services/my_webhook.../.../...
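
A quick way to rule out the webhook itself is to post to it directly, outside of Watcher. A minimal sketch (the hooks.slack.com path below is a placeholder; substitute the real webhook URL):

    curl -X POST -H 'Content-Type: application/json' \
      --data '{"text": "webhook connectivity test"}' \
      https://hooks.slack.com/services/XXX/YYY/ZZZ

If this shows up in the channel, the webhook and channel are fine and the problem is on the Watcher side.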

I executed the watch and did not get any error, but the Slack channel is not receiving the notification.

I have looked at the logs on all the es-master nodes and did not see any reference to the watch.

How do I debug this, or get logs showing whether the watch is executing correctly?
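
For reference, watch runs are also recorded in the watcher history indices, so one way to check recent executions is a query along these lines (a sketch; the host is an assumption):

    curl -X GET "localhost:9200/.watcher-history-*/_search?pretty" \
      -H 'Content-Type: application/json' -d'
    {
      "query": { "match": { "watch_id": "check_for_errors_SLACK_TEST" } },
      "sort": [ { "trigger_event.triggered_time": { "order": "desc" } } ]
    }'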

Can you include the full output of the Execute Watch API? Otherwise debugging is basically impossible.
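
On 6.x that call looks roughly like this (a sketch; the host is an assumption):

    curl -X POST "localhost:9200/_xpack/watcher/watch/check_for_errors_SLACK_TEST/_execute?pretty"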

Thanks a lot!

Hi, thank you for pointing me to the page on running the watch with the Execute Watch API (_execute). Looking at the output, I realized the condition was not being met, so Slack was not getting the notification. The watch with the log action gets a value greater than 0 for ctx.payload.hits.total, but the watch with the Slack action returns 0. Both watches are identical except for the action, so I am not sure why the watch with the Slack action is getting 0.
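
One way to confirm the two definitions really are identical is to fetch both stored watches and diff them. A sketch, where the log watch id is an assumption:

    curl -X GET "localhost:9200/_xpack/watcher/watch/check_for_errors_LOG_TEST?pretty"
    curl -X GET "localhost:9200/_xpack/watcher/watch/check_for_errors_SLACK_TEST?pretty"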

Output from the watch:

    {
      "_id": "check_for_errors_SLACK_TEST_f247c2c3-c1cd-4ac2-8432-b13451f7546b-2019-03-22T17:17:29.405Z",
      "watch_record": {
        "watch_id": "check_for_errors_SLACK_TEST",
        "node": "AsHzI0hUR86YPIZpcTySDQ",
        "state": "executed",
        "status": {
          "state": {
            "active": true,
            "timestamp": "2019-03-22T17:05:17.016Z"
          },
          "last_checked": "2019-03-22T17:17:29.405Z",
          "last_met_condition": "2019-03-22T17:17:29.405Z",
          "actions": {
            "notify-slack": {
              "ack": {
                "timestamp": "2019-03-22T17:17:29.405Z",
                "state": "ackable"
              },
              "last_execution": {
                "timestamp": "2019-03-22T17:17:29.405Z",
                "successful": true
              },
              "last_successful_execution": {
                "timestamp": "2019-03-22T17:17:29.405Z",
                "successful": true
              }
            }
          },
          "execution_state": "executed",
          "version": 1
        },
        "trigger_event": {
          "type": "manual",
          "triggered_time": "2019-03-22T17:17:29.405Z",
          "manual": {
            "schedule": {
              "scheduled_time": "2019-03-22T17:17:29.405Z"
            }
          }
        },
        "input": {
          "search": {
            "request": {
              "search_type": "query_then_fetch",
              "indices": [
                "logstash*"
              ],
              "types": [],
              "body": {
                "query": {
                  "bool": {
                    "must": [
                      {
                        "match": {
                          "log": "error*"
                        }
                      },
                      {
                        "range": {
                          "@timestamp": {
                            "gte": "now-5m",
                            "lte": "now"
                          }
                        }
                      }
                    ]
                  }
                },
                "_source": [
                  "kubernetes.container_name",
                  "kubernetes.host",
                  "log"
                ],
                "sort": [
                  {
                    "@timestamp": {
                      "order": "desc"
                    }
                  }
                ]
              }
            }
          }
        },
        "condition": {
          "compare": {
            "ctx.payload.hits.total": {
              "gt": 0
            }
          }
        },
        "result": {
          "execution_time": "2019-03-22T17:17:29.405Z",
          "execution_duration": 256,
          "input": {
            "type": "simple",
            "status": "success",
            "payload": {
              "foo": "bar"
            }
          },
          "condition": {
            "type": "always",
            "status": "success",
            "met": true
          },
          "actions": [
            {
              "id": "notify-slack",
              "type": "slack",
              "status": "success",
              "slack": {
                "account": "monitoring",
                "sent_messages": [
                  {
                    "status": "success",
                    "to": "#devops-alerts",
                    "message": {
                      "from": "check_for_errors_SLACK_TEST",
                      "text": "Encountered   errors in the last 5 minutes (facepalm)"
                    }
                  }
                ]
              }
            }
          ]
        },
        "messages": []
      }
    }

It seems you have specified an alternative input with that execution. The condition also does not match the watch above, so I assume this is a completely different watch. Please make sure your snippets are up to date; I have no idea how to answer or help this way, as the watch and the output are out of sync.
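
For what it's worth, an input of type "simple" with payload {"foo": "bar"} and a condition of type "always", as in the record above, is exactly what _execute produces when the request overrides them, roughly like this (a sketch of what may have happened; the actual request is not shown in the thread):

    curl -X POST "localhost:9200/_xpack/watcher/watch/check_for_errors_SLACK_TEST/_execute?pretty" \
      -H 'Content-Type: application/json' -d'
    {
      "alternative_input": { "foo": "bar" },
      "ignore_condition": true
    }'

With the input replaced, ctx.payload.hits.total is empty, which matches the blank value in the rendered Slack message above; and with ignore_condition set, the condition reports as "always" and met.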
