Exporting GCP Stackdriver logs to on-premises Elasticsearch

Hello team,
Can we export logs from Google Cloud (Stackdriver) to an on-premises Elasticsearch cluster?
I found the link below, but it assumes that Elasticsearch is hosted on Elastic Cloud. In my case I have a local cluster, so there is no cloud.id or cloud.auth.

I don't know whether this integration is possible with a local Elasticsearch, and if so, how?
Thanks in advance.

Where is your "local Elasticsearch cluster"? Whether it is in GCP or on premises, it is just a matter of network connectivity. Filebeat just needs to be able to reach Elasticsearch, whether you manage it yourself or it is on Elastic Cloud.

Instead of cloud.id and cloud.auth, you just use the normal Elasticsearch host URL, username, and password, as described in the Elasticsearch output documentation.

e.g.

output.elasticsearch:
  hosts: ["https://myEShost:9200"]
  username: "filebeat_writer"
  password: "YOUR_PASSWORD"
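
One on-prem detail worth noting: if your self-managed cluster serves HTTPS with a certificate signed by your own CA (an assumption about your setup), Filebeat also needs to trust that CA. A minimal sketch, with the CA path as a placeholder:

output.elasticsearch:
  hosts: ["https://myEShost:9200"]
  username: "filebeat_writer"
  password: "YOUR_PASSWORD"
  # Assumed location of the CA certificate that signed your cluster's cert
  ssl.certificate_authorities: ["/etc/filebeat/ca.crt"]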

Hello,

I have the same case :cry: I am at the first step, and I want to know how to transfer events from GCP Stackdriver to a Filebeat instance located in my GCP cloud.

Thanks in advance.

Hello Stephen,
Thanks for your answer.
My Elasticsearch cluster is normally outside GCP, installed on physical servers, but I can create a Filebeat instance if needed.
So if I understand correctly, I don't need cloud.id and cloud.auth to export logs to my on-premises Elasticsearch?
They can be replaced by the HTTP or HTTPS URL of Elasticsearch?

The GCP article referenced in the first post above does a very good job of the step-by-step setup. The only thing you need to do differently is to use the full output.elasticsearch: section as I showed; please look at the reference documentation.

You will need to use the gcp-pubsub input in Filebeat; see the docs here:
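
For reference, a minimal sketch of that input, assuming you have already set up a log sink that routes your Stackdriver logs to a Pub/Sub topic (the project, topic, subscription, and credentials path below are placeholders):

filebeat.inputs:
- type: gcp-pubsub
  project_id: "my-gcp-project"
  # Pub/Sub topic your Stackdriver log sink publishes to
  topic: "stackdriver-logs"
  # Subscription on that topic that Filebeat will pull from
  subscription.name: "filebeat-sub"
  # Service account key with the Pub/Sub Subscriber role
  credentials_file: "/path/to/service-account.json"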

And there will need to be network connectivity between Filebeat (in GCP) and your on-premises cluster.

Thank you,
I will try it and report back.
