For backup and restore I am using Curator 4.2.6 and sending all the snapshots to an Azure blob storage account through the Azure Cloud plugin 2.4.1. What I need is to send the snapshot start time and end time to Azure table storage, so I can see whether a snapshot succeeded (i.e. if the end time is present, it succeeded; otherwise it did not).
1. Make Curator log in JSON by setting logformat: json in the client YAML configuration file (see the sketch after this list).
2. Read that log file with Logstash.
3. Capture only the snapshot begin/end events by matching the message text with grok.
4. Take the timestamps and send them to some other service that can put them into your cloud storage table.
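For step 1, the relevant piece is the logging section of the Curator client configuration. A minimal sketch; the log file path is an assumption:

```yaml
# curator.yml (client configuration); the logfile path is an assumption.
client:
  hosts:
    - 127.0.0.1
  port: 9200

logging:
  loglevel: INFO
  logfile: /var/log/curator/curator.log
  logformat: json    # emit each log line as a JSON document
```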
For step 4, you could write your own Logstash output plugin if you have the time and resources. Otherwise, you can output to a file, over TCP, or through the http output plugin, and then get the data where it needs to go.
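Putting steps 2 through 4 together, here is a rough Logstash pipeline sketch. The file path, grok pattern, and receiver URL are assumptions; you will need to adapt the pattern to the actual messages Curator writes for snapshot start and completion.

```
input {
  file {
    path => "/var/log/curator/curator.log"
    codec => "json"                  # Curator is already writing JSON lines
    start_position => "beginning"
  }
}

filter {
  # Keep only the lines that mark a snapshot starting or completing;
  # the pattern below is a placeholder, match it to your real log text.
  grok {
    match => { "message" => "snapshot %{NOTSPACE:snapshot_name} (?<snapshot_event>initiated|completed)" }
  }
  if "_grokparsefailure" in [tags] {
    drop { }
  }
}

output {
  # Hand the event (with its @timestamp) to whatever bridges it into the Azure table.
  http {
    url => "http://localhost:8080/snapshot-events"   # hypothetical receiver
    http_method => "post"
    format => "json"
  }
}
```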
Filebeat is a great way to do that, yes. Be sure to indicate in the config that the data is already in JSON.
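A minimal Filebeat input sketch, assuming the Curator log path used above and a Logstash endpoint on localhost; the json.* options are what tell Filebeat that each line is already a JSON document:

```yaml
# filebeat.yml; paths and the output host are assumptions.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/curator/curator.log
    json.keys_under_root: true    # each line is already a JSON document
    json.add_error_key: true      # tag lines that fail JSON decoding

output.logstash:
  hosts: ["localhost:5044"]
```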
I personally don't know how to input data into an Azure cloud storage table. Logstash can write to a file, send data over plain TCP or HTTP, and has many other outputs. If you can find a way to use one of those to reach an Azure cloud storage table, great. Otherwise, you'll have to extend Logstash yourself, or have Logstash write to a file (or a TCP listener) and move the information the rest of the way with your own tooling.
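One way to cover that last mile is a small script that reads the filtered events (for example, written to a JSON-lines file by a Logstash file output) and upserts them into the table. A minimal sketch, assuming the azure-data-tables Python package; the connection string, file path, and field names are all assumptions to adapt to what your pipeline actually emits:

```python
import json

from azure.data.tables import TableServiceClient

# Everything named below (connection string, file path, field names) is an
# assumption; adjust it to whatever your Logstash pipeline really produces.
CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."

service = TableServiceClient.from_connection_string(conn_str=CONNECTION_STRING)
table = service.create_table_if_not_exists(table_name="curatorsnapshots")

with open("/var/log/curator/snapshot-events.json") as events:
    for line in events:
        event = json.loads(line)
        entity = {
            "PartitionKey": "curator",
            "RowKey": event["snapshot_name"],   # field extracted by the grok filter
        }
        if event.get("snapshot_event") == "completed":
            entity["end_time"] = event["@timestamp"]
        else:
            entity["start_time"] = event["@timestamp"]
        # Merge-upsert so the "completed" event adds end_time to the same row
        # the "initiated" event created; a row with no end_time means the
        # snapshot has not (yet) finished successfully.
        table.upsert_entity(entity)
```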
Are you referring to the metadata Elasticsearch writes out to the repository? Curator doesn't write that. Curator talks to Elasticsearch via its API, and Elasticsearch does everything else. As with the other points here, you'll have to figure out how to get that data "the last mile" into your storage table on your own.