Trouble Setting Up a Workflow: Where Is the LLM Response Stored in Kibana?

Hello,
I'm working on setting up a workflow in the Elastic Stack where, when a log is ingested into an index, it is sent to a Large Language Model (LLM) for processing. The LLM then returns a response, which I want to store in a separate index for further analysis.
Here’s what I’ve set up so far:

  1. ES|QL Query and Rule: I've created an ES|QL query to detect when a new log is ingested. The rule triggers an alert, and the log data is passed to an action that interfaces with the LLM (a sketch of roughly what the query looks like is below the list).
  2. LLM Integration: The LLM receives the input, processes it, and sends a response back.
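
For reference, the rule query is along these lines; the index pattern `logs-app-*` and the field names are placeholders for my actual setup:

```
FROM logs-app-*                         // placeholder index pattern
| WHERE @timestamp > NOW() - 5 minutes  // only pick up recently ingested logs
| KEEP @timestamp, host.name, message   // fields the action passes along to the LLM
| LIMIT 100
```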

The issue I'm encountering is that, while the LLM seems to be processing the input and sending a response back, I'm not sure where that response ends up within Kibana or how to view it. I can see the log data in `context.hits` when the action runs (my action body looks roughly like the sketch below), but once the response comes back from the LLM, I'm unable to locate or manage it anywhere in Kibana.
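
For context, the action that calls the LLM is a webhook-style connector whose body looks roughly like this; the `prompt` and `logs` field names and the overall payload shape are placeholders for my setup, with the hits interpolated via the rule's mustache variables:

```json
{
  "prompt": "Summarize and classify the following log entries.",
  "logs": "{{#context.hits}}{{_source.message}}\n{{/context.hits}}"
}
```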
Could someone help clarify where the LLM’s response should be stored, or what I might be missing in the setup to make this process more transparent within Kibana?
Thanks in advance!