I created periodic backups using Curator and send the status report to a logs.txt file. But my question is:
Every day I have to check the logs.txt file or the Azure storage account to see whether the periodic backups were created. Suppose the Curator job has failed; how can I know that without looking at logs.txt or the Azure storage account? Is there any alerting mail system in ES that tells the admin the snapshot job failed?
There are a few ways to do this. One of the big ones is that the Curator log can be in JSON format:
logging:
  loglevel: INFO
  logfile: /path/to/curator.log
  logformat: json
With this, the Curator logs can be sent to Logstash or to Elasticsearch (including via an ingest pipeline). Once the events are there, you can create alerts (if using X-Pack) for job completion. If you're not using X-Pack, you could write your own parser and/or search to look for that, and alert yourself.
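For example (a rough sketch, not something Curator provides out of the box): assuming the Curator log events end up in a curator-logs-* index, that you are on a 5.x+ cluster with X-Pack Watcher, and that an email account is configured in elasticsearch.yml, a watch along these lines could mail the admin whenever an error is logged. The index name, field names, schedule, and addresses are all placeholders:

PUT _xpack/watcher/watch/curator_snapshot_failed
{
  "trigger": { "schedule": { "interval": "1h" } },
  "input": {
    "search": {
      "request": {
        "indices": [ "curator-logs-*" ],
        "body": {
          "query": {
            "bool": {
              "must": [
                { "match": { "loglevel": "ERROR" } },
                { "range": { "@timestamp": { "gte": "now-1h" } } }
              ]
            }
          }
        }
      }
    }
  },
  "condition": { "compare": { "ctx.payload.hits.total": { "gt": 0 } } },
  "actions": {
    "notify_admin": {
      "email": {
        "to": "admin@example.com",
        "subject": "Curator job logged an error",
        "body": "One or more ERROR entries were found in curator-logs-* in the last hour."
      }
    }
  }
}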
I want to send the data to ES or Logstash and write code that catches the "job completed" string in the Curator JSON logs.
For alerting myself, I would still have to search in ES, right? Then how is that really alerting me, since I am the one going and searching in ES? Or is there another way?
Correct me if I am wrong.
Can you please tell me what other easy ways there are to do this?
The other ways would be more manually scripted: tailing the log file, looking for the expected strings, and sending yourself an email. Potentially, you could use Logstash to read the logs in and use the email output plugin to send a notification, as sketched below.
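A rough sketch of such a pipeline (the SMTP settings, addresses, and the exact failure condition are placeholders to adapt to your environment):

input {
  file {
    path => "/path/to/curator.log"
    codec => "json"               # matches "logformat: json" in the Curator config
    start_position => "beginning"
  }
}

output {
  # Assuming the JSON log lines carry "loglevel" and "message" fields
  if [loglevel] == "ERROR" or [message] =~ /failed/ {
    email {
      address => "smtp.example.com"     # placeholder SMTP server
      port    => 25
      from    => "curator@example.com"
      to      => "admin@example.com"
      subject => "Curator job failure"
      body    => "Curator logged: %{message}"
    }
  }
}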
Now I want to send the curator.log file through an ingest pipeline to an Elasticsearch curator-logs index. I went through the ingest pipeline documentation, but I didn't find the equivalent of some steps.
Usually in Logstash we will use a config along the following lines to send data to ES.
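For example, something like this (paths, hosts, and the index name are just placeholders):

input {
  file {
    path => "/path/to/curator.log"
    codec => "json"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "curator-logs-%{+YYYY.MM.dd}"
  }
}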
1) In the same way as above, how can I send the logs to ES using an ingest pipeline?
2) Do I have to run this manually every time to send data to ES? I want my Curator logs to be sent to ES automatically whenever Curator generates them. Can this be done without Logstash?
3) Is this possible in Elasticsearch 2.4.1?
There is currently no way that I am aware of to send Curator logs to Elasticsearch through an ingest pipeline, other than via Beats. You would have to code your own, or you can send them with Logstash, as you have configured above.
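For what it's worth, a minimal Filebeat setup for this could look roughly like the following (Filebeat 6.x syntax; paths, hosts, and the pipeline name are placeholders, and ingest pipelines themselves only exist in Elasticsearch 5.0+):

filebeat.inputs:
  - type: log
    paths:
      - /path/to/curator.log
    # Curator writes one JSON object per line when logformat: json is set
    json.keys_under_root: true
    json.add_error_key: true

output.elasticsearch:
  hosts: ["localhost:9200"]
  # Optional: route events through an ingest pipeline you have defined in ES
  # pipeline: "curator-logs"

Filebeat tails the file continuously, so new Curator log lines are shipped as they are written, with no manual step.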
Okay, then I will install Filebeat and try it. Once it is configured, the Curator log data will go directly to the ES index, right? There will be no need to send it manually every time?
Or do I have to run the above method manually every time? I want my Curator logs to be sent to ES automatically whenever Curator generates them. Is that possible using Logstash?