How can I stream data from ES to Azure Blob (not snapshot)?

Hi Guys,

My company is using Elasticsearch on one of our projects to collect event data from an app. As the number of users on the app increased, so did our analytics needs. We keep data for about 3 months in each index and store snapshots in the cloud. We'd like to use Snowflake for analysis and Azure to store the actual index data (not snapshots) so Snowflake can ingest it from there. I've previously written a Python script to extract the data, compile it, and create windowed batches, but I'd like to reduce the number of steps it takes to get the data into Snowflake, since fewer steps makes monitoring easier. Is there a way to use Elasticsearch services to do this for me? Something like compiling a .json.gz file every hour or every 100 MB, then naming it and dropping it into a Storage Account in Azure?
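
For reference, here is a rough sketch of what my current batching script does (purely illustrative: the host, index pattern, timestamp field, container name, and connection string are placeholders, and it assumes the elasticsearch-py scroll helper plus the azure-storage-blob v12 SDK):

```python
import gzip
import io
import json
from datetime import datetime, timedelta

from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan
from azure.storage.blob import BlobServiceClient

# All of these names are placeholders for illustration only.
ES_HOST = "http://localhost:9200"
INDEX_PATTERN = "events-*"
TIMESTAMP_FIELD = "@timestamp"
AZURE_CONN_STR = "<azure-storage-connection-string>"
CONTAINER = "es-exports"


def export_hour(window_end: datetime) -> None:
    """Pull one hour of events from Elasticsearch and upload them as .json.gz."""
    window_start = window_end - timedelta(hours=1)
    es = Elasticsearch([ES_HOST])

    query = {
        "query": {
            "range": {
                TIMESTAMP_FIELD: {
                    "gte": window_start.isoformat(),
                    "lt": window_end.isoformat(),
                }
            }
        }
    }

    # Stream all matching documents with the scroll API and write them
    # as newline-delimited JSON into an in-memory gzip buffer.
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        for hit in scan(es, index=INDEX_PATTERN, query=query):
            gz.write((json.dumps(hit["_source"]) + "\n").encode("utf-8"))

    # Name the blob by window so downstream loads can find files predictably,
    # e.g. events/2023/01/15/events-2023-01-15T13.json.gz
    blob_name = (
        f"events/{window_start:%Y/%m/%d}/events-{window_start:%Y-%m-%dT%H}.json.gz"
    )
    blob_service = BlobServiceClient.from_connection_string(AZURE_CONN_STR)
    blob = blob_service.get_blob_client(container=CONTAINER, blob=blob_name)
    buf.seek(0)
    blob.upload_blob(buf, overwrite=True)


if __name__ == "__main__":
    export_hour(datetime.utcnow().replace(minute=0, second=0, microsecond=0))
```

Each run produces one predictably named file per hour in the container, which Snowflake can then pick up from an external stage. I'd like to get rid of this middle step if Elasticsearch can do it natively.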

Further info: we only have Elasticsearch and Kibana, and Elasticsearch is running on v5.6.7.

There is nothing built in for this, so I would recommend writing to the other systems at the same time as you index data into Elasticsearch.
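
As a rough sketch of that dual-write idea (purely illustrative: it assumes your app's ingest code can be modified, the index and blob names are placeholders, and it uses the elasticsearch-py client for a 5.6 cluster together with the azure-storage-blob v12 SDK):

```python
import json
from datetime import datetime, timezone

from elasticsearch import Elasticsearch
from azure.storage.blob import BlobServiceClient

# Placeholder connection details for illustration only.
es = Elasticsearch(["http://localhost:9200"])
blob_service = BlobServiceClient.from_connection_string(
    "<azure-storage-connection-string>"
)
container = blob_service.get_container_client("es-exports")


def ingest(event: dict) -> None:
    """Write one event to Elasticsearch and append it to an hourly blob."""
    now = datetime.now(timezone.utc)

    # 1. Index into Elasticsearch as usual (doc_type is still required on 5.x).
    es.index(index=f"events-{now:%Y.%m.%d}", doc_type="event", body=event)

    # 2. Append the same event, as one NDJSON line, to an hourly append blob
    #    that Snowflake can later load from the container.
    blob = container.get_blob_client(f"events-{now:%Y-%m-%dT%H}.ndjson")
    if not blob.exists():
        blob.create_append_blob()
    blob.append_block((json.dumps(event) + "\n").encode("utf-8"))
```

The trade-off is that your application now has two write paths to keep healthy, so it needs its own retry and error handling, but it avoids re-reading the data out of Elasticsearch afterwards.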
