Watcher email and Slack actions failing

Hello,

I am trying to configure Slack and email actions in Watcher on my Elasticsearch and Kibana 7.12 installation on an Amazon Linux AMI.

Below are my Slack settings (the webhook URL is in the keystore):

xpack.notification.slack:
  default_account: watcher
  account:
    watcher:
      message_defaults:
        from: kibana

I am setting up the Amazon SES email config like this (the password is in the keystore):

xpack.notification.email.account:
    ses_account:
        smtp:
            auth: true
            starttls.enable: true
            starttls.required: true
            host: email-smtp.us-east-1.amazonaws.com
            port: 587
            user: AKIAW25*********53U

When I try to send a sample email or Slack message, it fails.
I want to trace the reason in the log file, but /var/log/elasticsearch is not accessible and /var/log/elasticsearch/elasticsearch.log is not showing the latest events or info; probably there are multiple log files.

How and where can I access the email/Slack-related error details?
Also, is there anything I am missing in the Watcher action configs?

Thanks

A couple of questions: Watcher is a Gold+ license feature, what level of license do you have?
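If you are not sure, the license API shows the active level, e.g. in the Dev Console:

GET _license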

Also, have you considered using the new Kibana alerting framework instead of Watcher?

Finally, you can test Watcher via the Dev Console, or get the last execution using the Watcher API; if I recall correctly, the failed execution / action details should be in there.

You can also test it via the Watcher screen in Kibana.
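For example, assuming a watch ID of my_watch (just a placeholder), these Dev Console calls return the watch with its last execution status, and the most recent record from the watcher history index:

GET _watcher/watch/my_watch

GET .watcher-history-*/_search
{
  "size": 1,
  "sort": [ { "trigger_event.triggered_time": { "order": "desc" } } ],
  "query": { "term": { "watch_id": "my_watch" } }
}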

I have a 30-day trial activated, and I am trying to send a sample email/Slack message from the Watcher screen in Kibana.
I have been using Watcher on the Elastic Cloud stack and want to replicate the same on our self-managed stack on EC2.

My question was: where can I find the Watcher logs or error details explaining why an email or Slack message is failing, and am I missing any configs?

Thanks

As I mentioned above, you can use the Watcher API itself: execute the watch, look at the results, and they will give you information on the error.

The errors will help you debug your config...

Create a watch that will execute, for example something like this (a simple match_all input plus a webhook action):

PUT _watcher/watch/2db648f8-2564-4cde-8f0f-67a426f005a4
{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "input": {
    "search": {
      "request": {
        "indices": [ "*" ],
        "body": {
          "size": 0,
          "query": { "match_all": {} }
        }
      }
    }
  },
  "actions": {
    "webhook_1": {
      "webhook": {
        "scheme": "https",
        "host": "api.bvader.net",
        "port": 443
      }
    }
  }
}

Execute it..... 

POST _watcher/watch/2db648f8-2564-4cde-8f0f-67a426f005a4/_execute

Look at the Result .... 

"actions" : [
    {
      "id" : "webhook_1",
      "type" : "webhook",
      "status" : "failure",
      "error" : {
        "root_cause" : [
          {
            "type" : "unknown_host_exception",
            "reason" : "api.bvader.net: Name or service not known"
          }
        ],
        "type" : "unknown_host_exception",
        "reason" : "api.bvader.net: Name or service not known"
      }
    },

You can also go to the Watcher screen in Kibana and look at the failed executions.

Hello,

This is the error I saw in the log:

"error": {
         ...
"caused_by": {
            "type": "s_m_t_p_send_failed_exception",
            "reason": "554 Message rejected: Email address is not verified. The following identities failed the check in region US-EAST-1: elasticsearch@ip-x-x-x-x.ec2.internal\n"
          }

If you look at the email settings in my first post, I think I am missing the 'from' email address, which I have verified with AWS SES. Where do I specify that, please?

This is what I see related to the Slack action, but I cannot find a clear error message or reason:

{
        "id": "slack_1",
        "type": "slack",
        "status": "failure",
        "slack": {
          "account": "watcher",
          "sent_messages": [
            {
              "status": "failure",
              "request": {
                "host": "hooks.slack.com",
                "port": -1,
                "scheme": "https",
                "method": "post",
                "headers": {
                  "Content-Type": "application/json; charset=UTF-8"
                },
                "body": "{\"username\":\"kibana\",\"text\":\"Test Alert: Watch [Disk Usage Alert] has exceeded the threshold\"}"
              },
              "response": {
                "status": 404,
                "headers": {
                  "date": [
                    "Thu, 01 Apr 2021 02:03:28 GMT"
                  ],
                  "server": [
                    "Apache"
                  ],
                  "x-envoy-upstream-service-time": [
                    "4"
                  ],
                  "transfer-encoding": [
                    "chunked"
                  ],
                  "vary": [
                    "Accept-Encoding"
                  ],
                  "x-frame-options": [
                    "SAMEORIGIN"
                  ],
                  "x-via": [
                    "envoy-www-iad-ikzh, haproxy-edge-iad-xu5y"
                  ],
                  "x-backend": [
                    "main_normal main_bedrock_normal_with_overflow main_canary_with_overflow main_bedrock_canary_with_overflow main_control_with_overflow main_bedrock_control_with_overflow"
                  ],
                  "x-slack-backend": [
                    "r"
                  ],
                  "strict-transport-security": [
                    "max-age=31536000; includeSubDomains; preload"
                  ],
                  "via": [
                    "envoy-www-iad-ikzh"
                  ],
                  "access-control-allow-origin": [
                    "*"
                  ],
                  "referrer-policy": [
                    "no-referrer"
                  ],
                  "content-type": [
                    "text/html"
                  ],
                  "x-slack-shared-secret-outcome": [
                    "shared-secret"
                  ],
                  "x-server": [
                    "slack-www-hhvm-main-iad-fu05"
                  ]
                },
                "body": "no_service"
              },
              "message": {
                "from": "kibana",
                "text": "Test Alert: Watch [Disk Usage Alert] has exceeded the threshold"
              }
            }
          ]
        }

Pretty sure it goes in the actual email action; there is a from and reply_to (see here), not in the email account setup.
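Something like this in the actions section of your watch, reusing the addresses from your elasticsearch.yml attempt (the recipient below is just a placeholder):

"actions": {
    "email_1": {
      "email": {
        "from": "Watcher <no-reply@domain.com>",
        "reply_to": "build@domain.com",
        "to": "someone@domain.com",
        "subject": "Watcher test",
        "body": {
          "text": "Test email from Watcher"
        }
      }
    }
  }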

With respect to Slack: since you say it is working on Elastic Cloud, perhaps simply copy the settings from the elasticsearch.yml on Elastic Cloud.

I don't recognize that error

I am trying to set a from and reply-to, but the service fails to start. I have tried single quotes, double quotes, and no quotes.

xpack.notification.email:
    from: 'Watcher <no-reply@domain.com>'
    reply-to: 'build@domain.com'
    default_account: ses_account
    account:
        ses_account:
            smtp:
                auth: true
                starttls.enable: true
                starttls.required: true
                host: email-smtp.us-east-1.amazonaws.com
                port: 587
                user: AK*****VD53U

The Elasticsearch service starts fine if I remove the 'from' and 'reply-to' lines from the .yml.

Hi @twilight :slight_smile:

It does not go in the account setup... per the link I sent above (here again), from and reply_to go into the actual email action in the watch, not in the account setup section in elasticsearch.yml.

See the example here:

"actions": {
    "email_me": {
      "throttle_period": "10m",
      "email": {
        "from": "<from:email address>", <! ------ HERE
        "to": "<to:email address>",
        "subject": "Open Source Events",
        "body": {
          "html": "Found events matching Open Source: <ul>{{#ctx.payload.aggregations.group_by_city.buckets}}<          li>{{key}} ({{doc_count}})<ul>{{#group_by_event.buckets}}
          <li><a href=\"{{key}}\">{{get_latest.buckets.0.group_by_event_name.buckets.0.key}}</a>
          ({{doc_count}})</li>{{/group_by_event.buckets}}</ul></li>
          {{/ctx.payload.aggregations.group_by_city.buckets}}</ul>"
        }
      }
    }
  }

Thanks for the help with the email action, it is fine now.

For Slack, I did exactly the same: copied the settings from the Cloud .yml to the local .yml.

Cloud configs: (screenshot)

And local configs: (screenshot)

The webhook URL is stored under the same key in the keystore:
xpack.notification.slack.account.watcher.secure_url

Not sure what to tell you on that, perhaps.

404 is Not Found. Are you sure you don't have a typo in the Slack key/value in the keystore? (It will never show in an error message.) Or perhaps it is network connectivity... you could try to curl the Slack webhook URL from the Elasticsearch server:

curl -X POST -H 'Content-type: application/json' --data '{"text":"Allow me to reintroduce myself!"}' YOUR_WEBHOOK_URL

Also, you need to put that in the keystore on EACH Elasticsearch node, unlike Cloud where we do the propagation for you.
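On a package (RPM/DEB) install, that would typically be something along these lines on every node, followed by a restart of that node:

sudo /usr/share/elasticsearch/bin/elasticsearch-keystore add xpack.notification.slack.account.watcher.secure_url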

Here is my setup. I like verbose yaml to avoid mistakes.

xpack.notification.slack.account.monitoring.message_defaults.from: x-pack
xpack.notification.slack.account.monitoring.message_defaults.to: notifications
xpack.notification.slack.account.monitoring.message_defaults.icon: http://example.com/images/watcher-icon.jpg
xpack.notification.slack.account.monitoring.message_defaults.attachment.fallback: "X-Pack Notification"
xpack.notification.slack.account.monitoring.message_defaults.attachment.color: "#36a64f"
xpack.notification.slack.account.monitoring.message_defaults.attachment.title: "X-Pack Notification"
xpack.notification.slack.account.monitoring.message_defaults.attachment.title_link: "https://www.elastic.co/guide/en/x-pack/current/index.html"
xpack.notification.slack.account.monitoring.message_defaults.attachment.text: "One of your watches generated this notification."
xpack.notification.slack.account.monitoring.message_defaults.attachment.mrkdwn_in: "pretext, text"

It worked with my webhook url.

Is there any way to trace what webhook URL it is trying to send to, or what URL is set in the keystore?
The keystore list does show a key named 'xpack.notification.slack.account.watcher.secure_url', but I am not sure about its value.

Delete it and set it again... the value will never show anywhere... after all it IS a secret :slight_smile:

Remove it... restart Elasticsearch, and run the watch: it will fail.

Add it back carefully... restart Elasticsearch, run it again, and see what you get.
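A minimal sketch of that sequence on one node (repeat on each node; use service instead of systemctl if your Amazon Linux AMI is not on systemd):

# remove the secure setting and restart
sudo /usr/share/elasticsearch/bin/elasticsearch-keystore remove xpack.notification.slack.account.watcher.secure_url
sudo systemctl restart elasticsearch

# re-add it carefully (you will be prompted for the webhook URL), then restart again
sudo /usr/share/elasticsearch/bin/elasticsearch-keystore add xpack.notification.slack.account.watcher.secure_url
sudo systemctl restart elasticsearch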
