Course: Elasticsearch Relevance Engine (ESRE) Engineer
Question: I am working through Course Lab 5.1: Introduction to Vector Search. When I run the Docker command in Step 1, I receive an error:
docker run -it --rm --network host \
  docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
  --url https://localhost:9200 -u elastic -p nonprodpwd \
  --hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
  --task-type text_embedding --insecure
2025-06-09 22:19:35,277 INFO : Establishing connection to Elasticsearch
/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/__init__.py:397: SecurityWarning: Connecting to 'https://localhost:9200' using TLS with verify_certs=False is insecure
  _transport = transport_class(
/usr/local/lib/python3.9/dist-packages/urllib3/connectionpool.py:1056: InsecureRequestWarning: Unverified HTTPS request is being made to host 'localhost'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(
2025-06-09 22:19:35,527 INFO : Connected to cluster named 'cluster1' (version: 8.11.0)
2025-06-09 22:19:35,528 INFO : Loading HuggingFace transformer tokenizer and model 'sentence-transformers/msmarco-MiniLM-L-12-v3'
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 594, in _get_config_dict
    resolved_config_file = cached_path(
  File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 1921, in cached_path
    output_path = get_from_cache(
  File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 2131, in get_from_cache
    raise OSError(
OSError: Distant resource does not have an ETag, we won't be able to reliably ensure reproducibility.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/eland_import_hub_model", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.9/dist-packages/eland/cli/eland_import_hub_model.py", line 235, in main
    tm = TransformerModel(
  File "/usr/local/lib/python3.9/dist-packages/eland/ml/pytorch/transformers.py", line 617, in __init__
    self._tokenizer = transformers.AutoTokenizer.from_pretrained(
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/tokenization_auto.py", line 484, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/configuration_auto.py", line 637, in from_pretrained
    config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 546, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 630, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load config for 'sentence-transformers/msmarco-MiniLM-L-12-v3'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'sentence-transformers/msmarco-MiniLM-L-12-v3' is the correct path to a directory containing a config.json file
There is no conflicting local directory that I can see. I was able to download the model separately, but Kibana doesn't recognize the downloaded model, and I can't synchronize it to the cluster.
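For reference, after downloading the model files locally I retried the import with something like the following, mounting the download into the container and pointing --hub-model-id at the mounted path instead of the Hugging Face model ID. The local directory name and mount point here are just my choices, and I am not certain that eland 8.9.0 actually accepts a local path for --hub-model-id, which may be why the model never shows up in Kibana:

```shell
# Hypothetical retry: the model was pre-downloaded to ./msmarco-MiniLM-L-12-v3
# on the host and is mounted into the container at /tmp/model.
docker run -it --rm --network host \
  -v "$PWD/msmarco-MiniLM-L-12-v3:/tmp/model" \
  docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
  --url https://localhost:9200 -u elastic -p nonprodpwd \
  --hub-model-id /tmp/model \
  --task-type text_embedding --insecure
```

Is this local-path approach supposed to work with this eland version, or is there another way to get past the ETag error?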