How to create Watcher email alerts and elasticsearch.yml file settings

Please share step-by-step instructions for configuring Watcher, the elasticsearch.yml settings, and the SMTP mail configuration.

Thanks in advance.

Please see the documentation about configuring email accounts at https://www.elastic.co/guide/en/elasticsearch/reference/7.4/actions-email.html#configuring-email
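As a rough sketch of what the linked page describes (the account name, host, and addresses below are placeholders, not values from this thread), an email account is defined in elasticsearch.yml, with the password stored in the keystore rather than in the yml file:

xpack.notification.email.account:
  my_smtp_account:
    profile: standard
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.example.com
      port: 587
      user: alerts@example.com

bin/elasticsearch-keystore add xpack.notification.email.account.my_smtp_account.smtp.secure_password

Because elasticsearch.yml is static configuration, the node needs a restart before new account settings are picked up.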

If there are problems, please name concrete problems and we can try to figure them out one by one.

Hi spinscale,

I configured the Outlook email account successfully, but when I create a threshold alert in Watcher and click 'test on fire email', it shows "Failed to send e-mail to vishnuxxxx@outlook.com". Please help, what is the issue?

Can you share the output of the execute watch API for that particular watch?

I am trying to create a threshold alert that sends the mail when the value is greater than 40000.

Please use the Console tools to obtain the output of the execute watch API, as I mentioned in my last post. Unfortunately, screenshots are not helpful here.
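If it helps, the call looks like this in the Kibana Dev Tools console (the watch ID below is a placeholder; use your own):

POST _watcher/watch/<your_watch_id>/_execute

The response contains the full execution record, including the result of each action and any error details, which is the part that matters here.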

Thank you!

Hi Spin,
After executing my watch, I get an error output like this:
{
  "watch_id": "a84dc88d-9474-484b-84ff-19643d134f39",
  "node": "-eDQJ7UPTw-d0mHO0F3ZoA",
  "state": "executed",
  "status": {
    "state": {
      "active": true,
      "timestamp": "2019-11-05T01:54:17.823Z"
    },
    "last_checked": "2019-11-05T03:04:54.137Z",
    "last_met_condition": "2019-11-05T03:04:54.137Z",
    "actions": {
      "email_1": {
        "ack": {
          "timestamp": "2019-11-05T01:54:17.823Z",
          "state": "awaits_successful_execution"
        },
        "last_execution": {
          "timestamp": "2019-11-05T03:04:54.137Z",
          "successful": false,
          "reason": ""
        }
      }
    },
    "execution_state": "executed",
    "version": -1
  },
  "trigger_event": {
    "type": "schedule",
    "triggered_time": "2019-11-05T03:04:54.137Z",
    "schedule": {
      "scheduled_time": "2019-11-05T03:04:54.064Z"
    }
  },
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          ".ml-anomalies-metricbeat_outages_ecs",
          "metricbeat-*"
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "size": 0,
          "query": {
            "bool": {
              "filter": {
                "range": {
                  "timestamp": {
                    "gte": "{{ctx.trigger.scheduled_time}}||-180d",
                    "lte": "{{ctx.trigger.scheduled_time}}",
                    "format": "strict_date_optional_time||epoch_millis"
                  }
                }
              }
            }
          }
        }
      }
    }
  },
  "condition": {
    "script": {
      "source": "if (ctx.payload.hits.total > params.threshold) { return true; } return false;",
      "lang": "painless",
      "params": {
        "threshold": 1000
      }
    }
  },
  "metadata": {
    "name": "testing",
    "watcherui": {
      "trigger_interval_unit": "m",
      "agg_type": "count",
      "time_field": "timestamp",
      "trigger_interval_size": 1,
      "term_size": 5,
      "time_window_unit": "d",
      "threshold_comparator": ">",
      "term_field": null,
      "index": [
        ".ml-anomalies-metricbeat_outages_ecs",
        "metricbeat-*"
      ],
      "time_window_size": 180,
      "threshold": 1000,
      "agg_field": null
    },
    "xpack": {
      "type": "threshold"
    }
  },
  "result": {
    "execution_time": "2019-11-05T03:04:54.137Z",
    "execution_duration": 120179,
    "input": {
      "type": "search",
      "status": "success",
      "payload": {
        "_shards": {
          "total": 5,
          "failed": 0,
          "successful": 5,
          "skipped": 0
        },
        "hits": {
          "hits": [],
          "total": 10000,
          "max_score": null
        },
        "took": 1,
        "timed_out": false
      },
      "search": {
        "request": {
          "search_type": "query_then_fetch",
          "indices": [
            ".ml-anomalies-metricbeat_outages_ecs",
            "metricbeat-*"
          ],
          "rest_total_hits_as_int": true,
          "body": {
            "size": 0,
            "query": {
              "bool": {
                "filter": {
                  "range": {
                    "timestamp": {
                      "gte": "2019-11-05T03:04:54.064Z||-180d",
                      "lte": "2019-11-05T03:04:54.064Z",
                      "format": "strict_date_optional_time||epoch_millis"
                    }
                  }
                }
              }
            }
          }
        }
      }
    },
    "condition": {
      "type": "script",
      "status": "success",
      "met": true
    },
    "transform": {
      "type": "script",
      "status": "success",
      "payload": {
        "result": 10000
      }
    },
    "actions": [
      {
        "id": "email_1",
        "type": "email",
        "status": "failure",
        "error": {
          "root_cause": [
            {
              "type": "messaging_exception",
              "reason": "failed to send email with subject [Watch [testing] has exceeded the threshold] via account [gmail_account]"
            }
          ],
          "type": "messaging_exception",
          "reason": "failed to send email with subject [Watch [testing] has exceeded the threshold] via account [gmail_account]",
          "caused_by": {
            "type": "messaging_exception",
            "reason": "Exception reading response",
            "caused_by": {
              "type": "socket_timeout_exception",
              "reason": "Read timed out"
            }
          }
        }
      }
    ]
  },
  "messages": []
}

The interesting part is at the end, where the exception is listed: it seems that the email cannot be sent. Is it possible that you have configured the wrong port to connect to your mail server, so that the TCP connection cannot be made?
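If the port turns out to be right and the connection does get established, a read timeout can also point at a TLS-mode mismatch (for example STARTTLS against the implicit-SSL port 465) or at a slow mail server; in the latter case the SMTP timeouts can be raised as an experiment. Assuming the smtp timeout settings from the email account documentation, that would look roughly like this in elasticsearch.yml (values are illustrative only):

xpack.notification.email.account:
  gmail_account:
    smtp:
      connection_timeout: 2m   # time allowed to establish the connection
      timeout: 5m              # socket read timeout (assumed to be the limit behind "Read timed out")
      write_timeout: 2m        # time allowed for writing the message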

--Alex

Hi spin,
Please check the yml file and my settings below. Do I need to change anything in the .yml file? The same issue is still there.

(elasticsearch.yml file):
xpack.watcher.enabled: true
xpack.notification.email.account:
  gmail_account:
    profile: gmail
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.gmail.com
      port: 587
      user: mymail@gmail.com

The SMTP password was added to the keystore with:
bin/elasticsearch-keystore add xpack.notification.email.account.gmail_account.smtp.secure_password
The SMTP connection to Gmail is also established:
[root@localhost elasticsearch]# telnet smtp.gmail.com 587
Trying 74.125.24.109...
Connected to smtp.gmail.com.
Escape character is '^]'.
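A plain telnet connection only proves that the port is reachable from that machine; to check that STARTTLS actually negotiates there, an openssl client session can be run as a quick diagnostic (suggestion only, output omitted), ideally from the host that runs Elasticsearch:

openssl s_client -starttls smtp -connect smtp.gmail.com:587

If the TLS handshake completes there but Watcher still hits the read timeout, the next things to rule out are authentication (Gmail typically wants an app password) and any outbound filtering between the Elasticsearch node and smtp.gmail.com.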

What Elasticsearch version are you on?

Also, can you put this somewhere (in a gist/pastebin) where the indentation does not get lost? Thank you!

Hi spin,

I am using Elasticsearch version 7.0. May I know where the issue is, whether it is the port or the .yml file?

Sorry, but 7.0 is not a concrete version.

I just want to be sure that the configuration is correct, which is why I asked for it. Even though it looks OK, indentation in YAML is always tricky and I want to rule that out first.
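For what it's worth, the exact version string can be read from the root endpoint, for example in the Dev Tools console:

GET /

The version.number field in the response (something like 7.0.1) is the concrete version to report.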
