My question is about the inference APIs: why is there no option to specify the dimensions? In my case I want to use text-embedding-3-large with 1536 dimensions, but when I try to create this:
PUT _inference/text_embedding/openai_embeddings
{
  "service": "openai",
  "service_settings": {
    "api_key": "<redacted>"
  },
  "task_settings": {
    "model": "text-embedding-3-large",
    "dimensions": 1536
  }
}
I get "Model configuration contains settings [{dimensions=1536}] unknown to the [openai] service".