Getting repeated mail for the same record in Elasticsearch while using Watcher

alerting

(Gaurav Harsola) #1

Hi,

I have created an index with the following request:

PUT http://localhost:9200/event
{
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "event": {
      "properties": {
        "eventId": {
          "type": "integer"
        },
        "eventName": {
          "type": "string"
        },
        "eventDescription": {
          "type": "string"
        },
        "eventCategory": {
          "type": "string"
        },
        "eventType": {
          "type": "string"
        }
      }
    }
  }
}

After that I created a watch:

PUT http://localhost:9200/_watcher/watch/event_critical_watch
{
  "trigger": {
    "schedule": {
      "interval": "60s"
    }
  },
  "input": {
    "search": {
      "request": {
        "indices": [ "event" ],
        "body": {
          "query": {
            "match": {
              "eventCategory": "CRITICAL"
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gt": 0
      }
    }
  },
  "actions": {
    "email_admin": {
      "email": {
        "to": "xyz@gmail.com",
        "subject": "{{ctx.watch_id}} executed",
        "body": "{{ctx.watch_id}} executed with {{ctx.payload.hits.total}} hits"
      }
    }
  }
}

After that I made the necessary changes in elasticsearch.yml:
watcher.actions.email.service.account:
  gmail:
    profile: gmail
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.gmail.com
      port: 587
      user: your-email@gmail.com
      password: your-password

After that, I create a simple event in the event index with eventCategory set to 'CRITICAL'.
Here is a sample event:

PUT http://localhost:9200/event/event/1
{
  "eventId": 1,
  "eventName": "3 failed login attempts",
  "eventDescription": "System has detected 3 failed login attempts",
  "eventCategory": "CRITICAL",
  "eventType": "LOG"
}

It sends me a mail on my mail id, but the issue is that I am getting the same mail for the same record every 60 seconds. Can't we stop sending mail for records that have already been reported? This is really annoying. Please help me.


(Alexander Reelsen) #2

Hey Gaurav,

Watcher is doing exactly what you told it to do: searching for any entry with the configured log level. If you don't want that, you have to add a timestamp to the event and always filter for the last minute (depending on your interval).

See https://www.elastic.co/guide/en/elasticsearch/guide/current/_ranges.html
and https://www.elastic.co/guide/en/elasticsearch/reference/2.2/query-dsl-range-query.html
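
To illustrate, here is a sketch of how the watch's search input could be adjusted. It assumes each event document gains a date field named timestamp that is set at index time; that field name is an assumption, it is not part of the original mapping. The range filter uses date math (now-60s) to match only events indexed since the last trigger:

```json
"input": {
  "search": {
    "request": {
      "indices": [ "event" ],
      "body": {
        "query": {
          "bool": {
            "must": {
              "match": { "eventCategory": "CRITICAL" }
            },
            "filter": {
              "range": {
                "timestamp": { "gte": "now-60s" }
              }
            }
          }
        }
      }
    }
  }
}
```

With the 60s lookback matching the 60s trigger interval, each event should only be seen by one watch execution, so a given record is mailed at most once.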

hope this helps.

--Alex


(Gaurav Harsola) #3

Hey Alexander,

Thanks for quick response !!

In my index I don't have any timestamp field. So, if you don't mind, can you help me modify my code so that it doesn't send mail again and again for the same record?

Thanks
Gaurav


(Alexander Reelsen) #4

Hey Gaurav,

if you need to filter your queries over time, you need to add a timestamp field. You could do this using the _timestamp field. However, this is deprecated and might be removed in future releases. With the upcoming Elasticsearch 5.0 you could use an ingest pipeline with a processor that adds a new field and sets it to {{now}}.
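
As a sketch of the deprecated _timestamp approach on Elasticsearch 2.x: the field is enabled in the type mapping when the index is created, after which Elasticsearch stamps each document automatically (mapping abbreviated here to a couple of the original fields):

```json
PUT http://localhost:9200/event
{
  "settings": { "number_of_shards": 1 },
  "mappings": {
    "event": {
      "_timestamp": { "enabled": true },
      "properties": {
        "eventId":       { "type": "integer" },
        "eventCategory": { "type": "string" }
      }
    }
  }
}
```

A range query would then target the _timestamp field directly, but given the deprecation, an explicit date field you control is the safer long-term choice.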

Hope this helps.

--Alex


(Gaurav Harsola) #5

Hey Alexander,

I am still not able to solve the problem. Can you help me a little more with filtering based on timestamp for the above problem? While creating the index, do we need to have a timestamp field if we want to filter based on timestamp?

Thanks
Gaurav


(Alexander Reelsen) #6

Hey,

yes, you need a time-based field in the documents you are indexing (or use the _timestamp field mapper). Then you can use a time-based range query to filter; see the documentation about range queries.
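
Putting those two steps together, a sketch assuming a new date field named timestamp (the name is an assumption): first add it to the existing mapping, then have the indexing client set it on every document:

```json
PUT http://localhost:9200/event/_mapping/event
{
  "properties": {
    "timestamp": { "type": "date" }
  }
}

PUT http://localhost:9200/event/event/2
{
  "eventId": 2,
  "eventName": "3 failed login attempts",
  "eventDescription": "System has detected 3 failed login attempts",
  "eventCategory": "CRITICAL",
  "eventType": "LOG",
  "timestamp": "2016-05-01T12:00:00Z"
}
```

The watch's query can then combine the existing match on eventCategory with a range filter such as "range": { "timestamp": { "gte": "now-60s" } }, so each execution only sees events newer than the previous run.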

If you have problems, please provide fully fledged examples as mentioned in our help guidelines, so that other people can chime in and follow your problem via concrete examples.

--Alex


(system) #7