As per the documentation, the implementation flow remains the same: "The integration works for all users, regardless of whether you are using the Elastic Stack on Elastic Cloud, Elastic Cloud in the Google Cloud Marketplace, or a self-managed environment."
However, when trying to use this Dataflow template with BigQuery, the Cloud API Key field is marked as mandatory. In that case, how can we use this Dataflow template for self-managed instances with basic auth enabled? If anyone has been able to get this working, please help.
That field is just an Elasticsearch API key; it is not tied to Elastic Cloud.
You can create an API key on a self-managed cluster without any problems: in Kibana, go to Stack Management > Security > API Keys.
The guide you shared links to the part of the documentation that explains how to create an API key.
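If you prefer the API route over the Kibana UI, here is a minimal sketch of the Create API key call, assuming a self-managed cluster at `localhost:9200` and an `elastic` superuser (both placeholders for your own host and credentials):

```shell
# Placeholders: adjust the host and user to your self-managed cluster.
ES_HOST="http://localhost:9200"

# POST to the Create API key API with basic-auth credentials
# (curl prompts for the password). The JSON response contains
# an "id" and an "api_key" field, which you combine for the template.
curl -u elastic -X POST "$ES_HOST/_security/api_key" \
  -H 'Content-Type: application/json' \
  -d '{"name": "dataflow-key"}'
```

The key name here (`dataflow-key`) is arbitrary; pick anything that identifies the integration.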
Fill in the required parameters, including your Cloud ID and Base64-encoded API Key for Elasticsearch. The Cloud ID can be found in the Elastic Cloud UI. The API Key can be created using the Create API key API.
You just need to create an API Key and use it in that field.
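The template's field expects the key Base64-encoded in `id:api_key` form. A minimal sketch of that encoding step, using made-up `id` and `api_key` values in place of the ones the Create API key API actually returns:

```python
import base64

# Made-up values standing in for the "id" and "api_key" fields
# returned by the Create API key API response.
key_id = "VuaCfGcBCdbkQm-e5aOx"
api_key = "ui2lp2axTNmsyakw9tvNnw"

# The template's API Key field expects base64("id:api_key").
encoded = base64.b64encode(f"{key_id}:{api_key}".encode()).decode()
print(encoded)
```

Paste the resulting string into the template's API Key parameter.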
This template has a mandatory parameter for the API key; however, in the code it looks like it accepts either an API key or a username/password. Please help on how to proceed using just a username/password, as I cannot use the API key approach.
I am using the OSS build, which sits behind our internal proxy that provides our security layer. That layer currently only supports basic auth and does not yet have an API key mechanism implemented, so I want to see if there is a way to get this working with basic auth.