Email notification for Alerting/reporting

Hello there,

I need to set up email notifications for Watcher alerts and reporting with X-Pack on AWS.

I read the docs below:

https://www.elastic.co/guide/en/elastic-stack-overview/6.3/actions-email.html#amazon-ses

and set up the email notification account in elasticsearch.yml as below:

```yaml
xpack.notification.email.account:
  ses_account:
    smtp:
      auth: true
      starttls.enable: true
      starttls.required: true
      host: email-smtp.us-east-1.amazonaws.com
      port: 465
      user:
      password:
```
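As an aside (an assumption on my part, not confirmed anywhere in this thread): port 465 conventionally carries implicit TLS (SMTPS), while the `starttls.*` settings above upgrade a plaintext connection via STARTTLS, which matches ports 25 and 587. A minimal Python sketch of the two connection styles:

```python
import smtplib
import ssl

# Endpoint taken from the elasticsearch.yml snippet above.
HOST = "email-smtp.us-east-1.amazonaws.com"

def connect_starttls(host=HOST, port=587):
    """Plaintext connection upgraded via STARTTLS (ports 25/587)."""
    server = smtplib.SMTP(host, port, timeout=10)
    server.starttls(context=ssl.create_default_context())
    return server

def connect_smtps(host=HOST, port=465):
    """TLS from the first byte (implicit TLS / SMTPS, port 465)."""
    return smtplib.SMTP_SSL(host, port, timeout=10,
                            context=ssl.create_default_context())
```

A client configured for STARTTLS that connects to an implicit-TLS port waits for a plaintext greeting that never arrives, which would be consistent with the 120-second connect timeout in the log below.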

But I'm getting the following error in elasticsearch log:

===========================
[2018-10-02T14:56:41,865][ERROR][o.e.x.w.a.e.ExecutableEmailAction] [hlsoelse1a-02] failed to execute action [efad0a6c-fc1a-4079-9448-5543a800a75a/email_1]
javax.mail.MessagingException: failed to send email with subject [Watch [MetricBeatFilesystemUsed] has exceeded the threshold] via account [ses_account]
...

Caused by: com.sun.mail.util.MailConnectException: Couldn't connect to host, port: email-smtp.us-east-1.amazonaws.com, 465; timeout 120000
...........

I haven't set up TLS on the Elasticsearch nodes yet...

Did I miss anything, or is anything incorrect?

Please help, thank you very much

Li

Try setting the port to 25.

I set the port to 25 and the error messages are gone from the Elasticsearch log, but I still don't receive any email notifications from the watcher alerts. The Watcher UI shows that everything is fine.
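One way to take the schedule and condition out of the picture is to force-execute the watch and its actions directly with the Execute Watch API (6.x endpoint shown; adjust for your version), using the watch id from the output below:

```json
POST _xpack/watcher/watch/a724e8b2-3bd8-4c4c-b6bb-01b2548d77d4/_execute
{
  "ignore_condition": true,
  "action_modes": {
    "_all": "force_execute"
  }
}
```

If the email action still fails here, the problem is in the SMTP setup rather than in the trigger or condition.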

The execution output is below. Apart from setting up the SMTP email account in elasticsearch.yml, is there anything else I might be missing?

Please help.

{
"watch_id": "a724e8b2-3bd8-4c4c-b6bb-01b2548d77d4",
"node": "HIzDm73kRKidQ4M1M20IgQ",
"state": "execution_not_needed",
"status": {
"state": {
"active": true,
"timestamp": "2018-10-03T06:19:05.129Z"
},
"last_checked": "2018-10-03T19:34:18.547Z",
"actions": {
"email_1": {
"ack": {
"timestamp": "2018-10-03T06:19:05.129Z",
"state": "awaits_successful_execution"
}
}
},
"execution_state": "execution_not_needed",
"version": -1
},
"trigger_event": {
"type": "schedule",
"triggered_time": "2018-10-03T19:34:18.547Z",
"schedule": {
"scheduled_time": "2018-10-03T19:34:18.523Z"
}
},
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"metricbeat-*"
],
"types": [],
"body": {
"size": 0,
"query": {
"bool": {
"filter": {
"range": {
"@timestamp": {
"gte": "{{ctx.trigger.scheduled_time}}||-5m",
"lte": "{{ctx.trigger.scheduled_time}}",
"format": "strict_date_optional_time||epoch_millis"
}
}
}
}
},
"aggs": {
"metricAgg": {
"max": {
"field": "system.filesystem.used.pct"
}
}
}
}
}
}
},
"condition": {
"script": {
"source": "if (ctx.payload.aggregations.metricAgg.value > params.threshold) { return true; } return false;",
"lang": "painless",
"params": {
"threshold": 1
}
}
},
"metadata": {
"name": "watcher-testing",
"watcherui": {
"trigger_interval_unit": "m",
"agg_type": "max",
"time_field": "@timestamp",
"trigger_interval_size": 1,
"term_size": 5,
"time_window_unit": "m",
"threshold_comparator": ">",
"term_field": null,
"index": [
"metricbeat-*"
],
"time_window_size": 5,
"threshold": 1,
"agg_field": "system.filesystem.used.pct"
},
"xpack": {
"type": "threshold"
}
},
"result": {
"execution_time": "2018-10-03T19:34:18.547Z",
"execution_duration": 7,
"input": {
"type": "search",
"status": "success",
"payload": {
"_shards": {
"total": 42,
"failed": 0,
"successful": 42,
"skipped": 0
},
"hits": {
"hits": [],
"total": 13498,
"max_score": 0
},
"took": 6,
"timed_out": false,
"aggregations": {
"metricAgg": {
"value": 0.9520000000000001
}
}
},
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"metricbeat-*"
],
"types": [],
"body": {
"size": 0,
"query": {
"bool": {
"filter": {
"range": {
"@timestamp": {
"gte": "2018-10-03T19:34:18.523Z||-5m",
"lte": "2018-10-03T19:34:18.523Z",
"format": "strict_date_optional_time||epoch_millis"
}
}
}
}
},
"aggs": {
"metricAgg": {
"max": {
"field": "system.filesystem.used.pct"
}
}
}
}
}
}
},
"condition": {
"type": "script",
"status": "success",
"met": false
},
"actions": []
},
"messages": []
}

Please take the time to properly format your messages using Markdown, especially the JSON snippets. They are pretty much impossible to read otherwise. Thanks a lot!

In your last pasted snippet the condition was not met, so no action was triggered. See this output:

```json
"condition": {
  "type": "script",
  "status": "success",
  "met": false
}
```
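The numbers in that same output confirm it: the aggregation returned about 0.952 while the condition's threshold parameter was 1, so the Painless comparison evaluates to false. In Python terms:

```python
# Values copied from the execution output above
metric_agg_value = 0.9520000000000001  # result.input.payload.aggregations.metricAgg.value
threshold = 1                          # condition.script.params.threshold

# Same comparison the Painless condition script performs
met = metric_agg_value > threshold
print(met)  # False
```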

Hello there,

Now the condition was met, but the email was still not sent out...

Please see the following and advise. Thank you.


{
"watch_id": "a724e8b2-3bd8-4c4c-b6bb-01b2548d77d4",
"node": "PSBN9nLqQe62KJmr1-oRMw",
"state": "executed",
"status": {
"state": {
"active": true,
"timestamp": "2018-10-04T22:42:24.066Z"
},
"last_checked": "2018-10-04T22:50:00.348Z",
"last_met_condition": "2018-10-04T22:50:00.348Z",
"actions": {
"email_1": {
"ack": {
"timestamp": "2018-10-04T22:42:24.066Z",
"state": "awaits_successful_execution"
},
"last_execution": {
"timestamp": "2018-10-04T22:50:00.348Z",
"successful": false,
"reason": ""
}
}
},
"execution_state": "executed",
"version": -1
},
"trigger_event": {
"type": "schedule",
"triggered_time": "2018-10-04T22:50:00.348Z",
"schedule": {
"scheduled_time": "2018-10-04T22:50:00.323Z"
}
},
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"metricbeat-*"
],
"types": [],
"body": {
"size": 0,
"query": {
"bool": {
"filter": {
"range": {
"@timestamp": {
"gte": "{{ctx.trigger.scheduled_time}}||-5m",
"lte": "{{ctx.trigger.scheduled_time}}",
"format": "strict_date_optional_time||epoch_millis"
}
}
}
}
},
"aggs": {
"metricAgg": {
"max": {
"field": "system.filesystem.used.pct"
}
}
}
}
}
}
},
"condition": {
"script": {
"source": "if (ctx.payload.aggregations.metricAgg.value > params.threshold) { return true; } return false;",
"lang": "painless",
"params": {
"threshold": 0.3
}
}
},
"metadata": {
"name": "watcher-testing",
"watcherui": {
"trigger_interval_unit": "m",
"agg_type": "max",
"time_field": "@timestamp",
"trigger_interval_size": 1,
"term_size": 5,
"time_window_unit": "m",
"threshold_comparator": ">",
"term_field": null,
"index": [
"metricbeat-*"
],
"time_window_size": 5,
"threshold": 0.3,
"agg_field": "system.filesystem.used.pct"
},
"xpack": {
"type": "threshold"
}
},
"result": {
"execution_time": "2018-10-04T22:50:00.348Z",
"execution_duration": 120079,
"input": {
"type": "search",
"status": "success",
"payload": {
"_shards": {
"total": 44,
"failed": 0,
"successful": 44,
"skipped": 0
},
"hits": {
"hits": [],
"total": 19352,
"max_score": 0
},
"took": 5,
"timed_out": false,
"aggregations": {
"metricAgg": {
"value": 0.988
}
}
},
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"metricbeat-*"
],
"types": [],
"body": {
"size": 0,
"query": {
"bool": {
"filter": {
"range": {
"@timestamp": {
"gte": "2018-10-04T22:50:00.323Z||-5m",
"lte": "2018-10-04T22:50:00.323Z",
"format": "strict_date_optional_time||epoch_millis"
}
}
}
}
},
"aggs": {
"metricAgg": {
"max": {
"field": "system.filesystem.used.pct"
}
}
}
}
}
}
},
"condition": {
"type": "script",
"status": "success",
"met": true
},
"transform": {
"type": "script",
"status": "success",
"payload": {
"result": 0.988
}
},
"actions": [
{
"id": "email_1",
"type": "email",
"status": "failure",
"error": {
"root_cause": [
{
"type": "messaging_exception",
"reason": "failed to send email with subject [Watch [watcher-testing] has exceeded the threshold] via account [ses_account]"
}
],
"type": "messaging_exception",
"reason": "failed to send email with subject [Watch [watcher-testing] has exceeded the threshold] via account [ses_account]",
"caused_by": {
"type": "mail_connect_exception",
"reason": "Couldn't connect to host, port: email-smtp.us-east-1.amazonaws.com, 25; timeout 120000",
"caused_by": {
"type": "socket_timeout_exception",
"reason": "connect timed out"
}
}
}
}
]
},
"messages": []
}

Please take the time to go through the JSON you provided and interpret it. You will see the `mail_connect_exception` ("Couldn't connect to host, port: email-smtp.us-east-1.amazonaws.com, 25; timeout 120000") at the end.

This indicates that Elasticsearch was not able to reach the AWS infrastructure, potentially because of firewalls in between. Can you try to manually connect from the Elasticsearch host to the Amazon mail servers and see if that works?
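That manual check can be scripted. The sketch below (an illustration, not something from the thread) attempts the same TCP connection Watcher makes before the SMTP handshake begins:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run from the Elasticsearch host; a False here points at a firewall or
# security-group rule between the node and the SES endpoint.
# can_connect("email-smtp.us-east-1.amazonaws.com", 25)
```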

This has been fixed; it was on the AWS side... thanks a lot

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.