Course: Vector Search
Version: Current
Question: In Lab 2, at the very beginning, we have to execute this command:
docker run -it --rm --network host \
docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
--url https://localhost:9200 -u training -p nonprodpwd \
--hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
--task-type text_embedding --insecure
It does not work, even after trying the solution already suggested in another discussion:
docker run -it --rm --network host \
docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
--url https://localhost:9200 -u training -p nonprodpwd \
--hub-model-id sentence-transformers/msmarco-MiniLM-L12-v3 \
--task-type text_embedding --insecure
The error is the following:
Unable to find image 'docker.elastic.co/eland/eland:8.9.0' locally
8.9.0: Pulling from eland/eland
bd73737482dd: Pull complete
014546664b1e: Pull complete
73ab16005490: Pull complete
4f4fb700ef54: Pull complete
0b301405001e: Pull complete
Digest: sha256:66aa71b88b279a2aa36d8950855bf7069e19536a3b4211742f888d3c1ce20b60
Status: Downloaded newer image for docker.elastic.co/eland/eland:8.9.0
2025-07-24 09:53:01,852 INFO : Establishing connection to Elasticsearch
/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/__init__.py:397: SecurityWarning: Connecting to 'https://localhost:9200' using TLS with verify_certs=False is insecure
_transport = transport_class(
/usr/local/lib/python3.9/dist-packages/urllib3/connectionpool.py:1056: InsecureRequestWarning: Unverified HTTPS request is being made to host 'localhost'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
warnings.warn(
2025-07-24 09:53:02,193 INFO : Connected to cluster named 'cluster1' (version: 8.16.0)
2025-07-24 09:53:02,194 INFO : Loading HuggingFace transformer tokenizer and model 'sentence-transformers/msmarco-MiniLM-L12-v3'
Traceback (most recent call last):
File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 594, in _get_config_dict
resolved_config_file = cached_path(
File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 1921, in cached_path
output_path = get_from_cache(
File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 2217, in get_from_cache
http_get(url_to_download, temp_file, proxies=proxies, resume_size=resume_size, headers=headers)
File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 2062, in http_get
r = requests.get(url, stream=True, proxies=proxies, headers=headers)
File "/usr/local/lib/python3.9/dist-packages/requests/api.py", line 73, in get
return request("get", url, params=params, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/requests/api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/requests/sessions.py", line 575, in request
prep = self.prepare_request(req)
File "/usr/local/lib/python3.9/dist-packages/requests/sessions.py", line 486, in prepare_request
p.prepare(
File "/usr/local/lib/python3.9/dist-packages/requests/models.py", line 368, in prepare
self.prepare_url(url, params)
File "/usr/local/lib/python3.9/dist-packages/requests/models.py", line 439, in prepare_url
raise MissingSchema(
requests.exceptions.MissingSchema: Invalid URL '/api/resolve-cache/models/sentence-transformers/msmarco-MiniLM-L12-v3/04138389795d8715e1298fc51c0dc7d18d9670cd/config.json?%2Fsentence-transformers%2Fmsmarco-MiniLM-L12-v3%2Fresolve%2Fmain%2Fconfig.json=&etag=%2282164f1c64ff11b3cc6a184963e8534a7d2c9980%22': No scheme supplied. Perhaps you meant https:///api/resolve-cache/models/sentence-transformers/msmarco-MiniLM-L12-v3/04138389795d8715e1298fc51c0dc7d18d9670cd/config.json?%2Fsentence-transformers%2Fmsmarco-MiniLM-L12-v3%2Fresolve%2Fmain%2Fconfig.json=&etag=%2282164f1c64ff11b3cc6a184963e8534a7d2c9980%22?
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/bin/eland_import_hub_model", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.9/dist-packages/eland/cli/eland_import_hub_model.py", line 235, in main
tm = TransformerModel(
File "/usr/local/lib/python3.9/dist-packages/eland/ml/pytorch/transformers.py", line 617, in __init__
self._tokenizer = transformers.AutoTokenizer.from_pretrained(
File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/tokenization_auto.py", line 484, in from_pretrained
config = AutoConfig.from_pretrained(
File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/configuration_auto.py", line 637, in from_pretrained
config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 546, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 630, in _get_config_dict
raise EnvironmentError(
OSError: Can't load config for 'sentence-transformers/msmarco-MiniLM-L12-v3'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'sentence-transformers/msmarco-MiniLM-L12-v3' is the correct path to a directory containing a config.json file
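Judging from the traceback, the failure seems to happen in the transformers download step rather than in eland itself. A minimal sketch that exercises the same call (assuming it is run inside the docker.elastic.co/eland/eland:8.9.0 container, whose Python environment the traceback comes from):

# Minimal reproduction sketch (assumption: executed inside the eland:8.9.0 image,
# i.e. with the same transformers version used by eland_import_hub_model).
from transformers import AutoConfig

# eland calls AutoConfig/AutoTokenizer.from_pretrained before uploading the model,
# so if the Hugging Face download path is broken this should fail the same way.
config = AutoConfig.from_pretrained("sentence-transformers/msmarco-MiniLM-L12-v3")
print(config)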
Is there a solution?
Thanks, Ingrid