I’m trying to follow the recent Multi-Agent with LangGraph lab. I’m having a number of problems, but the one I’m stuck on right now is an apparent conflict around the inference endpoint, which seems to be created both when deploying the model to the instance and again by the setup script. The names don’t match, and I’m having difficulty cleaning it up. Claude is helping, but I’m still struggling. Here’s where it left off:
The issue is clear now: there's a mismatch between the inference endpoint ID and the model ID. The inference endpoint is configured to reference a model called "elser-incident-analysis", but the model that is actually deployed is ".elser_model_2_linux-x86_64". This is a configuration problem.
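For reference, both sides of the mismatch can be inspected from Kibana Dev Tools; the first call shows which model the endpoint is configured with, the second confirms which ELSER model is actually installed (the IDs below are the ones mentioned above; adjust if yours differ):

GET _inference/elser-incident-analysis

GET _ml/trained_models/.elser_model_2_linux-x86_64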
Run the DELETE command in Kibana Dev Tools as originally suggested:
DELETE _inference/elser-incident-analysis
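To confirm the deletion actually took effect, you can list the remaining inference endpoints; "elser-incident-analysis" should no longer appear in the output:

GET _inference/_all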
After running that command, wait 30-60 seconds for the deletion to propagate through the cluster, then run the reset script again:
python3 reset_all.py
The reset script has the logic to detect and use the correct ELSER model name, but it keeps encountering rate limits and the endpoint persistence issue when trying to recreate it.
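If the script keeps failing at that recreation step, the endpoint can also be recreated by hand in Dev Tools, pointed at the model that is actually installed. The request below is only a sketch: it assumes the lab expects a sparse_embedding endpoint named "elser-incident-analysis" backed by the elasticsearch inference service with one allocation and one thread, so the setup script should be checked for the exact settings it uses:

PUT _inference/sparse_embedding/elser-incident-analysis
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2_linux-x86_64",
    "num_allocations": 1,
    "num_threads": 1
  }
}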