Can't I send emails from my own server on the Elastic service?

I am not able to update the settings in the elasticsearch.yml file.
I am not able to include this setting in the .yml file:

  • Elasticsearch - 'xpack.notification.email.html.sanitization': is not allowed

Are you running on cloud.elastic.co? If not, could you provide the full Elasticsearch logs?

Please format your code, logs, or configuration files using the </> icon as explained in this guide, and not the citation button. It will make your post more readable.

Or use markdown style like:

```
CODE
```

There's a live preview panel for exactly this reason.

Lots of people read these forums, and many of them will simply skip over a post that is difficult to read, because it's just too large an investment of their time to try and follow a wall of badly formatted text.
If your goal is to get an answer to your questions, it's in your interest to make it as easy to read and understand as possible.


I am using the Elastic Cloud service on the standard plan.
The watch I created using the Elasticsearch REST API has the following action:

"send_email": {
    "email": {
        "body": {
            "html": "<html><p style=\"font-size:60px\">Generated Alerts</p><p style=\"font-size:30px\">{{ctx.payload._value}}</p></html>"
        },
        "from": "admin@symai.com",
        "profile": "standard",
        "subject": "ALERT !!!",
        "to": [
            "smyid@gmail.com"
        ]
    }
}

I have a connector created through Kibana with the name "standard"; it has the email ID, SMTP host, port, username, and password.
When I execute the watch, I am getting the following error:

{
    "error": {
        "caused_by": {
            "reason": "failed to connect, no password specified?",
            "type": "authentication_failed_exception"
        },
        "reason": "failed to send email with subject [ ALERT !!!] via account [outlook_account]",
        "root_cause": [
            {
                "reason": "failed to send email with subject [ALERT !!!] via account [outlook_account]",
                "type": "messaging_exception"
            }
        ],
        "type": "messaging_exception"
    },
    "id": "send_email",
    "status": "failure",
    "type": "email"
}

Is it related to the original question?

I can see that this setting is allowed: Add Elasticsearch user settings | Elasticsearch Service Documentation | Elastic

The error message is about something else:

failed to connect, no password specified?

The documentation says: Watcher email action | Elasticsearch Reference [7.11] | Elastic

Use the email action to send email notifications. To send email, you must configure at least one email account in elasticsearch.yml.

Did you define this?
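
For context, on a self-managed cluster an email account is defined in elasticsearch.yml, with the password stored separately in the keystore. A rough sketch (the account name "outlook_account", host, and user are illustrative; on Elastic Cloud this section is managed for you and cannot be set in user settings):

```
# Sketch for a self-managed cluster; account name, host and user are illustrative.
xpack.notification.email.account:
  outlook_account:
    profile: outlook
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.office365.com
      port: 587
      user: alerts@example.com
```

Note that the SMTP password does not go in the .yml file at all; it is stored in the Elasticsearch keystore under `xpack.notification.email.account.outlook_account.smtp.secure_password`, which is exactly what the "no password specified?" message complains about when it is missing.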

Hi Folks,

I think there is a little confusion about Watcher on Elastic Cloud; there are restrictions and some special setup.

See Here and Here

You cannot use your own SMTP server. All emails are sent through our servers, and the recipient must be whitelisted.

Watches do not use the connectors from the Kibana Alerts and Actions section

Watcher and Kibana Alerts and Actions are two separate capabilities.

Watcher is the legacy API version of alerts

Kibana Alerts and Actions are the new alerting framework.

I am not able to add the Outlook account in elasticsearch.yml. I was able to send emails earlier. The settings are not allowed on the "User settings" page in the Elasticsearch Service cloud console.

I also tried adding a connector in Kibana, providing smtp settings.

Can I create Kibana alerts through code, i.e. through a REST endpoint? I am stuck because of this issue. We actually moved to a paid version of Elastic for Watcher.

In Elastic Cloud you can use Watcher; you just need to follow the special setup.

For Watcher:
a) You do not define your own SMTP server; the emails are automatically sent through our cloud email server.

b) You need to add/whitelist the recipient email addresses per the instructions I linked to above.

With respect to Kibana Alerts and Actions, the REST API is in development and should be available in the future; an exact date is not available.

Personally, I would start with Kibana Alerts and Actions, as that is where we are focusing future development.

If you need a REST API from day one, I guess you will need to start with Watcher, which works fine as long as you follow the documentation.
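
As a sketch, a minimal watch created through the API (e.g. `PUT _watcher/watch/my_watch`) could look like the following; the index name, query, and recipient are placeholders, and on Elastic Cloud no SMTP account is specified in the email action:

```
{
  "trigger": { "schedule": { "interval": "5m" } },
  "input": {
    "search": {
      "request": {
        "indices": ["my-alerts-index"],
        "body": { "query": { "range": { "@timestamp": { "gte": "now-5m" } } } }
      }
    }
  },
  "condition": { "compare": { "ctx.payload.hits.total": { "gt": 0 } } },
  "actions": {
    "send_email": {
      "email": {
        "profile": "standard",
        "to": ["me@example.com"],
        "subject": "Watch [{{ctx.metadata.name}}] fired",
        "body": { "text": "{{ctx.payload.hits.total}} matching documents in the last 5 minutes" }
      }
    }
  }
}
```

Remember that on Elastic Cloud the recipient address has to be whitelisted before any email will actually be delivered.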

If I set a throttle period, will it be taken into account when emails are sent from your server?
When I trigger the _execute endpoint, I see that the emails are not being sent. I see "success" in the watch execution response, but I get no email.

Per this restriction, you can throttle per watch, not at the SMTP server level.

Watcher:

"Changing the default throttle period is not possible. You can specify a throttle period per watch, however.

You cannot use your own SMTP server. All emails are sent through our servers, and the recipient must be whitelisted."

I highly recommend looking at Kibana Alerts; there is more sophisticated throttling, like "alert only on state change", so you get only one FIRED alert when the condition is met and then one RECOVERED alert when it recovers. The API should come soon.
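
For reference, the per-watch throttle mentioned in the quote is set on the action itself. A sketch of an actions block (the "15m" value and recipient are illustrative):

```
"actions": {
  "send_email": {
    "throttle_period": "15m",
    "email": {
      "profile": "standard",
      "to": ["me@example.com"],
      "subject": "ALERT",
      "body": { "text": "At most one email per 15 minutes for this action" }
    }
  }
}
```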

Without an API, I can't use this feature to my advantage. Our use case won't work with Kibana alerts: we can't keep using the UI to create watches.

I whitelisted my email address and triggered the watch execution. I am getting this error:

{
    "error": {
        "reason": "Illegal address",
        "root_cause": [
            {
                "reason": "Illegal address",
                "type": "address_exception"
            }
        ],
        "type": "address_exception"
    },
    "id": "send_email",
    "status": "failure",
    "type": "email"
}

It says it is firing, but I have not received any email. It has been like this all day today. Is this feature even working at all in Elastic Cloud?
I had no problems with it when I downloaded and deployed the free version.

Yes, Watcher works in Elastic Cloud.

Can you share the whole watch, please?

Can you do a simple test through the Kibana GUI? I just created and tested a watch and it sent me an email.

I would do a test like this first and send a test email.

Send the test email; then you can copy the watch to see how it is set up:

...

  "actions": {
    "email_1": {
      "email": {
        "profile": "standard",
        "to": [
          "myemail@mydomain.com"
        ],
        "subject": "Watch [{{ctx.metadata.name}}] has exceeded the threshold",
        "body": {
          "text": "This is the watcher"
        }
      }
    }
  }
}

  1. My watch was created using the Watcher API endpoints and not through Kibana.
  2. Even if watches created through the Kibana UI work, that won't be sufficient for my use case. It is clear that watches created through the API and from Kibana are quite isolated; for example, connectors created through the UI have no effect on watches created through the API.

Watcher execution output:

{
  "watch_id": "testing-stagingid_default_watch_create_alert_abb12761fb9d45321df4e016c8a5b4a6ae2949df",
  "node": "iXU2NqOSTRyGCXvc_D25Aw",
  "state": "executed",
  "user": "elastic",
  "status": {
    "state": {
      "active": true,
      "timestamp": "2021-03-23T15:46:12.678Z"
    },
    "last_checked": "2021-03-23T15:48:13.070Z",
    "last_met_condition": "2021-03-23T15:48:13.070Z",
    "actions": {
      "send_email": {
        "ack": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "state": "ackable"
        },
        "last_execution": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "successful": true
        },
        "last_successful_execution": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "successful": true
        }
      },
      "notify_slack": {
        "ack": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "state": "ackable"
        },
        "last_execution": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "successful": true
        },
        "last_successful_execution": {
          "timestamp": "2021-03-23T15:48:13.070Z",
          "successful": true
        }
      }
    },
    "execution_state": "executed",
    "version": -1
  },
  "trigger_event": {
    "type": "schedule",
    "triggered_time": "2021-03-23T15:48:13.070Z",
    "schedule": {
      "scheduled_time": "2021-03-23T15:48:12.680Z"
    }
  },
  "input": {
    "http": {
      "request": {
        "scheme": "https",
        "host": "hosting",
        "port": 443,
        "method": "post",
        "path": "alerts/_search",
        "params": {},
        "headers": {},
        "auth": {
          "basic": {
            "username": "rer456",
            "password": "::es_redacted::"
          }
        },
        "body": "{\"query\": {\"bool\": {\"must\": [{\"range\": {\"data.started_at\": {\"gte\": \"now-120s\", \"lt\": \"now\"}}}]}}}"
      },
      "response_content_type": "json"
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total.value": {
        "gt": 0
      }
    }
  },
  "metadata": {
    "mailing_list": [
      "sjoe@siac.com"
    ],
    "created_at": "2021-03-23T21:16:12.599136",
    "asset_id_list": null,
    "slack_channels": [
      "test-notifications"
    ],
    "env": "staging"
  },
  "result": {
    "execution_time": "2021-03-23T15:48:13.070Z",
    "execution_duration": 956,
    "input": {
      "type": "http",
      "status": "success",
      "payload": {
        "_shards": {
          "total": 1,
          "successful": 1,
          "skipped": 0,
          "failed": 0
        },
        "hits": {},
        "took": 3,
        "timed_out": false,
        "_status_code": 200
      },
      "http": {
        "request": {
          "host": "hosting",
          "port": 443,
          "scheme": "https",
          "method": "post",
          "path": "testing-staging-unified_alerts/_search",
          "auth": {
            "basic": {
              "username": "alert-readonly",
              "password": "::es_redacted::"
            }
          },
          "body": "{\"query\": {\"bool\": {\"must\": [{\"range\": {\"data.started_at\": {\"gte\": \"now-120s\", \"lt\": \"now\"}}}]}}}"
        },
        "status_code": 200
      }
    },
    "condition": {
      "type": "compare",
      "status": "success",
      "met": true,
      "compare": {
        "resolved_values": {
          "ctx.payload.hits.total.value": 1
        }
      }
    },
    "transform": {
      "type": "script",
      "status": "success",
      "payload": {
        "_value": "<ul>2021-03-23T15:47:01.000Z: Asset 5730 breached threshold rule anomaly_score{assetId=152} > 0 for 20 seconds</ul>"
      }
    },
    "actions": [
      {
        "id": "send_email",
        "type": "email",
        "status": "success",
        "email": {
          "account": "work",
          "message": {
            "id": "send_email_default_watch_create_alert_abb12761fb9d45321df4e016c8a5b4a6ae2949df_b5a915b5-8aea-4d92-995b-1dbe778e7a2e-2021-03-23T15:48:13.070758427Z_18",
            "from": "eu-admin@siac.com",
            "sent_date": "2021-03-23T15:48:13.111036674Z",
            "to": [
              "sjoe@siac.com"
            ],
            "subject": "eu ALERT !!!",
            "body": {
              "html": "<p>Generated Alerts</p><p></p><ul><li>2021-03-23T15:47:01.000Z: Asset 5730 breached threshold rule anomaly_score{assetId&#61;152} &gt; 0 for 20 seconds</li></ul>"
            }
          }
        }
      },
      {
        "id": "notify_slack",
        "type": "slack",
        "status": "success",
        "slack": {
          "account": "monitoring",
          "sent_messages": [
            {
              "status": "success",
              "to": "test-notifications",
              "message": {
                "from": "eu-notifications",
                "text": "<ul>2021-03-23T15:47:01.000Z:He breached threshold criterion</ul>"
              }
            }
          ]
        }
      }
    ]
  },
  "messages": []
}

The watch was created as follows:

{
    "actions": {
        "notify_slack": {
            "slack": {
                "account": "slack_channel",
                "message": {
                    "from": "notifications",
                    "text": "Generated Alerts {{ctx.payload._value}}",
                    "to": [
                        "test-channel"
                    ]
                }
            }
        },
        "send_email": {
            "email": {
                "body": {
                    "html": "<html><p style=\"font-size:60px\">Generated Message</p><p style=\"font-size:30px\">{{ctx.payload._value}}</p></html>"
                },
                "from": "eu@sai.com",
                "profile": "standard",
                "subject": "EUREKA ALERT !!!",
                "to": [
                    "sjoe@sai.com"
                ]
            }
        }
    },
    "condition": {
        "compare": {
            "ctx.payload.hits.total.value": {
                "gt": 0
            }
        }
    },
    "input": {
        "http": {
            "request": {
                "auth": {
                    "basic": {
                        "password": "::es_redacted::",
                        "username": "readonly"
                    }
                },
                "body": "{\"query\": {\"bool\": {\"must\": [{\"range\": {\"data.started_at\": {\"gte\": \"now-120s\", \"lt\": \"now\"}}}]}}}",
                "headers": {},
                "host": "hosting",
                "method": "post",
                "params": {},
                "path": "alerts/_search",
                "port": 443,
                "scheme": "https"
            },
            "response_content_type": "json"
        }
    },
    "metadata": {
        "asset_id_list": null,
        "created_at": "2021-03-23T21:35:23.732600",
        "env": "staging",
        "mailing_list": [
            "sjoe@sai.com"
        ],
        "slack_channels": [
            "test-notifications"
        ]
    },
    "transform": {
        "script": {
            "lang": "painless",
            "source": ""
        }
    },
    "trigger": {
        "schedule": {
            "interval": "120s"
        }
    }
}

To add more info, I did try test emails from Kibana too, and they worked. The problem is clearly with sending emails from Watcher.

Hi @sireesha_m, you are not understanding my point.

I am suggesting that you test a simple watch through the Kibana UI first to prove to yourself that the email works. That uses the Watcher syntax / API... so yes, Watcher works.

THEN you should use the email action template created by that watch to create the email actions you want via the API.

I can see that your email action is NOT correct!

There is no "account": "work"

You have...

      {
        "id": "send_email",
        "type": "email",
        "status": "success",
        "email": {
          "account": "work",  <----- This is not correct take this out
          "message": {
            "id": "send_email_default_watch_create_alert_abb12761fb9d45321df4e016c8a5b4a6ae2949df_b5a915b5-8aea-4d92-995b-1dbe778e7a2e-2021-03-23T15:48:13.070758427Z_18",
            "from": "eu-admin@siac.com",
            "sent_date": "2021-03-23T15:48:13.111036674Z",
            "to": [
              "sjoe@siac.com"
            ],
            "subject": "eu ALERT !!!",
            "body": {
              "html": "<p>Generated Alerts</p><p></p><ul><li>2021-03-23T15:47:01.000Z: Asset 5730 breached threshold rule anomaly_score{assetId&#61;152} &gt; 0 for 20 seconds</li></ul>"
            }
          }
        }
      }

Please try this structure with your data; note there is no account (there is a default set up by ESS):

  "actions": {
    "email_1": {
      "email": {
        "profile": "standard",
        "to": [
          "myemail@mydomain.com"
        ],
        "subject": "Watch [{{ctx.metadata.name}}] has exceeded the threshold",
        "body": {
          "text": "This is the watcher"
        }
      }
    }
  }

"work" was present in the watch execution output, not in my watch definition. As I mentioned, the next JSON I gave contains the actual watch.

OK, good.

Are you getting the alert in the Slack channel at the same time you expect the email?

Yes, correct.