Has anyone been able to configure a local LLM like Ollama and connect it to Kibana?

Context:

  1. We have configured Ollama and set up Nginx as a reverse proxy that listens on port 443 and routes to http://127.0.0.1:11434.
    We added a certificate signed by our company CA, and we can connect over HTTPS and curl https://ollama-local.com successfully.

  2. We wanted to connect the local Ollama URL to Kibana, and we were able to configure the AI Connector in Kibana.
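The reverse-proxy setup in step 1 might look something like the sketch below; the server_name and certificate file paths are assumptions, not from the original post:

```nginx
# Hypothetical Nginx reverse proxy for Ollama.
# server_name and certificate paths are illustrative assumptions.
server {
    listen 443 ssl;
    server_name ollama-local.com;

    # Certificate signed by the company CA
    ssl_certificate     /etc/nginx/certs/ollama-local.crt;
    ssl_certificate_key /etc/nginx/certs/ollama-local.key;

    location / {
        # Forward requests to the local Ollama API
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
    }
}
```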

We are seeing the following errors while testing:

  1. The following error appears when we test the AI Connector:
    The following error was found:
    an error occurred while running the action
    Details:
    Status code: undefined. Message: Unexpected API Error: SELF_SIGNED_CERT_IN_CHAIN - self-signed certificate in certificate chain

  2. The following error appears when we test from Observability -> AI Assistant:
    Error: an error occurred while running the action - Status code: undefined. Message: Unexpected API Error: SELF_SIGNED_CERT_IN_CHAIN - self-signed certificate in certificate chain

Please help us find a solution if you have come across this issue.

The issue was resolved by updating the kibana.yml file with the xpack.actions.customHostSettings option, so that Kibana trusts the company CA for the Ollama host.
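The original post does not show the full setting, but a per-host TLS override in kibana.yml typically looks like the sketch below; the URL and the CA file path are assumptions for illustration:

```yaml
# Sketch of kibana.yml settings, assuming the Ollama proxy URL from the
# post and a hypothetical path to the company CA bundle.
xpack.actions.customHostSettings:
  - url: "https://ollama-local.com:443"
    ssl:
      # Point Kibana at the company CA so the certificate chain is trusted
      certificateAuthoritiesFiles: ["/etc/kibana/certs/company-ca.pem"]
      verificationMode: "full"
```

Kibana must be restarted for changes to kibana.yml to take effect.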

Hope this helps a few of you.