Prevent log data loss using the Cloudflare integration powered by Elastic Agent

Using the Cloudflare integration, in case of an Elastic failure, will the logs that have been sent to Elastic be lost?

Or will there be queues (or something similar) from which ingestion will resume once Elastic is back up?

Or, alternatively, will it recover the lost data directly from Cloudflare?

Thank you.
Regards.

What do you mean? Can you provide some context?

The underlying Beats themselves have internal tracking and queues to store data if they cannot contact Elasticsearch.

How Filebeat works | Filebeat Reference [8.4] | Elastic is one example of how that works.
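To make that concrete, here is a toy sketch of the same buffer-and-retry, at-least-once pattern. This is not Filebeat's actual implementation (that is written in Go and described in the linked reference); the names, batch size and failure rate are made up. The idea it illustrates is that events stay in a local buffer until the output acknowledges them, so an outage delays delivery rather than dropping data.

```python
import collections
import random
import time


class FlakyOutput:
    """Stand-in for an Elasticsearch output that is sometimes unreachable."""

    def send(self, batch):
        if random.random() < 0.5:
            raise ConnectionError("cannot contact Elasticsearch")
        print(f"indexed {len(batch)} events")


def ship(events, output, batch_size=3, retry_delay=0.5):
    spool = collections.deque(events)  # the "queue": keeps events across failed sends
    while spool:
        batch = [spool[i] for i in range(min(batch_size, len(spool)))]
        try:
            output.send(batch)
        except ConnectionError as err:
            # Nothing is dropped; the batch stays in the spool and is retried later.
            print(f"send failed ({err}); {len(spool)} events still buffered")
            time.sleep(retry_delay)
            continue
        for _ in batch:  # remove only what the output acknowledged
            spool.popleft()


ship([f"log line {i}" for i in range(10)], FlakyOutput())
```

The caveat is capacity: the real queues are finite, so an outage that outlasts the buffer and the source's own retention is where loss could eventually occur.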

Hi legoguy1000,

I have to push/pull Cloudflare logs to an Elastic Cloud deployment.

So I thought of using the Cloudflare integration.
If this solution works, my doubt is about possible data loss.

Thank you.
Regards
Giacomo

Hi warkolm,

Considering that the hypothetical architecture would be Cloudflare <--> Elastic Agent (Cloudflare integration) <--> Elastic Cloud, do you think it is feasible, and does your answer also apply to this architecture?

Thank you
Regards

It's been found that the current integration, which uses the Logpull API, can fall behind due to the extreme amount of data Cloudflare produces. It may fall behind, but it shouldn't lose/drop data, though I can't be 100% sure. Long term, the Logpush API should be more effective.
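For context on why a pull-based collector can lag, here is a rough sketch (not the integration's actual code) of what polling the Cloudflare Logpull API looks like. The zone ID, API token, field list, window length and delays are placeholders; Cloudflare only serves windows that are a few minutes old and caps how long a single window can be, so check the Logpull documentation for the exact limits.

```python
import time
from datetime import datetime, timedelta, timezone

import requests

ZONE_ID = "your-zone-id"        # placeholder
API_TOKEN = "your-api-token"    # placeholder: token with permission to read logs

URL = f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/logs/received"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}
FIELDS = "ClientIP,ClientRequestHost,ClientRequestURI,EdgeResponseStatus,EdgeStartTimestamp"


def pull_window(start: datetime, end: datetime) -> str:
    """Fetch one time window of logs (returned as NDJSON) from the Logpull API."""
    params = {
        "start": start.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "end": end.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "fields": FIELDS,
    }
    resp = requests.get(URL, headers=HEADERS, params=params, timeout=60)
    resp.raise_for_status()
    return resp.text


# Poll fixed one-minute windows, staying several minutes behind "now" because
# Logpull only serves data after a short delay.
window = timedelta(minutes=1)
cursor = datetime.now(timezone.utc) - timedelta(minutes=30)
while True:
    end = cursor + window
    if end > datetime.now(timezone.utc) - timedelta(minutes=5):
        time.sleep(30)  # caught up; wait for more data to become available
        continue
    events = pull_window(cursor, end).splitlines()
    print(f"{cursor.isoformat()} -> {end.isoformat()}: {len(events)} events")
    cursor = end
```

The point is that the cursor only advances one window at a time: if a busy zone produces more events per window than can be fetched and indexed before the next window is due, the collector drifts further behind real time without necessarily dropping anything. Logpush inverts this by having Cloudflare push batches to a destination instead.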

Hi, the integration doesn't actually seem to work, but I can't find evidence that this is due to the large number of logs. Do you know where to look for logs explaining the reason for the failure? Thank you.

In the Fleet UI, go to the agent and look at its logs.
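If it helps, the same agent logs are also indexed into the logs-elastic_agent* data streams in your deployment, so you can pull recent errors with a query instead of clicking through each agent. A minimal sketch with the Python client, assuming placeholder credentials:

```python
from elasticsearch import Elasticsearch

# Placeholders: point this at your deployment with credentials that can read
# the logs-elastic_agent* data streams.
es = Elasticsearch(cloud_id="your-cloud-id", api_key="your-api-key")

resp = es.search(
    index="logs-elastic_agent*",
    size=20,
    sort=[{"@timestamp": "desc"}],
    query={
        "bool": {
            "must": [{"match": {"log.level": "error"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-24h"}}}],
        }
    },
)

# Print the most recent error messages reported by the agent and its Beats.
for hit in resp["hits"]["hits"]:
    src = hit["_source"]
    print(src.get("@timestamp"), src.get("message"))
```

Errors from the Cloudflare input (for example failed requests to the Logpull endpoint) should show up there alongside what the Fleet UI displays.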
