Can Elasticsearch email us an alert if a Curator job fails?

Hi,

I created periodic backups using Curator, and it sends the status report to a logs.txt file. But my question is:

  1. Every day I have to check the logs.txt file or the Azure storage account to see whether the periodic backups were created. If the Curator job fails, how can I find out without checking logs.txt or the Azure storage account? Is there any alerting email system in ES that notifies the admin when a snapshot job has failed?

Is this possible in Elasticsearch?

Thanks

There are a few ways to do this. One of the big ones is that the Curator log can be written in JSON format:

logging:
  loglevel: INFO
  logfile: /path/to/curator.log
  logformat: json

With this, the Curator logs can be sent to Logstash or Elasticsearch (including via an ingest pipeline). Once the events are there, you can create alerts for job completion (if you are using X-Pack). If you're not using X-Pack, you could write your own parser and/or search to look for that, and alert yourself.

Thanks @theuntergeek

I want to send the data to ES or Logstash and write code that catches the "job completed" string in the Curator JSON logs.
For alerting myself, I would still have to search in ES, right? Then how is that alerting me, since I am the one going and searching in ES? Or is there some other way?

Correct me if I am wrong.

Can you suggest any other easy ways to do this?

Thanks

Watcher/Alerting would do it.

These would be more manually scripted approaches: tail the log file, look for the expected strings, and send yourself an email. Alternatively, you could use Logstash to read the logs in and use the email output plugin to send a notification.
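A minimal sketch of that scripted route in Python, using the standard library's `smtplib`. The failure markers, SMTP host, log path, and email addresses are all placeholders to adapt:

```python
import smtplib
from email.message import EmailMessage

# Strings assumed to indicate a failed run -- adjust to your Curator logs.
FAILURE_MARKERS = ("ERROR", "Failed")

def failing_lines(lines):
    """Return the log lines that contain any failure marker."""
    return [l for l in lines if any(m in l for m in FAILURE_MARKERS)]

def notify(lines, smtp_host="localhost", sender="curator@example.com",
           recipient="admin@example.com"):
    """Email the matched lines (host and addresses are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = "Curator job failure"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("\n".join(lines))
    with smtplib.SMTP(smtp_host) as s:
        s.send_message(msg)

if __name__ == "__main__":
    with open("/path/to/curator.log") as f:
        bad = failing_lines(f.readlines())
    if bad:
        notify(bad)
```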

Thanks @theuntergeek

I have implemented these.

Now I want to send the curator.log file through an ingest pipeline to a Curator logs index in Elasticsearch. I went through the ingest pipeline documentation, but I couldn't find the equivalent of some steps.

Usually in Logstash we use the config below to send data to ES:

input {
  file {
    path => "C:\Users\thunder\Desktop\curator.log"
    codec => json
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "data"
  }
  stdout {
    codec => rubydebug
  }
}

1) In the same way as above, how can I send the logs to ES using an ingest pipeline?
2) Do I have to do this manually every time? I want my Curator logs to be sent to ES automatically whenever Curator generates them. Can this be done without Logstash?
3) Is this possible in Elasticsearch 2.4.1?

FYI - I am not using Filebeat

Any help is highly appreciated

There is currently no way that I am aware of to send Curator logs to Elasticsearch through an ingest pipeline, other than via Beats. You would have to code your own shipper, or send them with Logstash, as you have configured above.
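For completeness, "coding your own" could look like the sketch below: read the log file and POST it to the `_bulk` API with a `pipeline` query parameter. Note that ingest pipelines were introduced in Elasticsearch 5.0, so this would not work on 2.4.1; the URL, index name, and pipeline name are placeholders.

```python
import json
import urllib.request

def bulk_body(lines, index="curator-logs"):
    """Build an ndjson _bulk request body from JSON log lines
    (index name is a placeholder)."""
    out = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        out.append(json.dumps({"index": {"_index": index}}))
        out.append(line)
    return "\n".join(out) + "\n"

def ship(lines, es_url="http://localhost:9200", pipeline="curator-logs"):
    """POST the lines to _bulk, routed through an ingest pipeline
    (requires ES 5.0+; pipeline name is a placeholder)."""
    req = urllib.request.Request(
        es_url + "/_bulk?pipeline=" + pipeline,
        data=bulk_body(lines).encode("utf-8"),
        headers={"Content-Type": "application/x-ndjson"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```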

Thanks

Okay, then I will install Filebeat and try it. Once it is configured, the curator.log data will go directly to the ES index, right? No need to send it manually every time?

Do I have to do the above manually every time, or can my Curator logs be sent to ES automatically whenever Curator generates them? Is that possible using Logstash?

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.