Unable to locate data ingestion logs to debug GCP Dataflow errors

Hello, I'm trying to debug an issue with a GCP Dataflow job that inserts documents into an Elasticsearch index.

I get the following errors from GCP Dataflow:

java.io.IOException: Error writing to Elasticsearch, some elements could not be inserted:  
com.google.cloud.teleport.v2.elasticsearch.utils.ElasticsearchIO.checkForErrors(ElasticsearchIO.java:231)  
com.google.cloud.teleport.v2.elasticsearch.utils.ElasticsearchIO$Write$WriteFn.flushBatch(ElasticsearchIO.java:1483)
com.google.cloud.teleport.v2.elasticsearch.utils.ElasticsearchIO$Write$WriteFn.processElement(ElasticsearchIO.java:1442)
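
From the stack trace, the template's ElasticsearchIO appears to write documents with the bulk API and raise this IOException when items in the bulk response come back marked as failed. To illustrate where those per-document failure messages live, here is a minimal sketch against the bulk API directly; it assumes the Elasticsearch 7.x Java high-level REST client, a local endpoint, and a hypothetical index name `my-index`, so it is not the template's actual code:

```java
import org.apache.http.HttpHost;
import org.elasticsearch.action.bulk.BulkItemResponse;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.xcontent.XContentType;

public class BulkErrorCheck {
  public static void main(String[] args) throws Exception {
    // Hypothetical endpoint and index name; replace with your deployment's values.
    try (RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

      BulkRequest bulk = new BulkRequest();
      bulk.add(new IndexRequest("my-index")
          .source("{\"message\":\"test document\"}", XContentType.JSON));

      BulkResponse response = client.bulk(bulk, RequestOptions.DEFAULT);

      // Failures are reported per document in the bulk response itself
      // (e.g. mapping conflicts), not necessarily in any server-side log.
      if (response.hasFailures()) {
        for (BulkItemResponse item : response) {
          if (item.isFailed()) {
            System.err.println(item.getId() + " -> " + item.getFailureMessage());
          }
        }
      }
    }
  }
}
```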

I've been trying to follow this thread to debug the issue, but I can't find any data logs under Observability - Logs. How can I find the log messages that show either the documents being inserted into my index or the errors for documents that fail to insert?

I have logging enabled on my instances, and I've even turned on audit logging for Elasticsearch and Kibana. I've also tried streaming my logs and running a successful Dataflow job that inserts data, but no log statements from that job show up.
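
For reference, the kind of check I mean when I say the job successfully inserts data is just counting documents in the index. A rough sketch of that, under the same assumptions as above (7.x Java high-level REST client, hypothetical index name `my-index`):

```java
import org.apache.http.HttpHost;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.core.CountRequest;
import org.elasticsearch.client.core.CountResponse;

public class IndexCountCheck {
  public static void main(String[] args) throws Exception {
    // Hypothetical endpoint and index name; replace with your deployment's values.
    try (RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
      CountResponse count = client.count(new CountRequest("my-index"), RequestOptions.DEFAULT);
      // Confirms the job wrote documents, independent of any log output.
      System.out.println("docs in my-index: " + count.getCount());
    }
  }
}
```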
