ESRE Engineer Course Labs 5.1

Course: Elasticsearch Relevance Engine (ESRE) Engineer
Question: I am attempting to get through the Course Labs, 5.1: Introduction to Vector Search. When I run the Docker terminal command in Step 1, I receive an error:

docker run -it --rm --network host \
docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
--url https://localhost:9200 -u elastic -p nonprodpwd \
--hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
--task-type text_embedding --insecure
2025-06-09 22:19:35,277 INFO : Establishing connection to Elasticsearch
/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/__init__.py:397: SecurityWarning: Connecting to 'https://localhost:9200' using TLS with verify_certs=False is insecure
  _transport = transport_class(
/usr/local/lib/python3.9/dist-packages/urllib3/connectionpool.py:1056: InsecureRequestWarning: Unverified HTTPS request is being made to host 'localhost'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(
2025-06-09 22:19:35,527 INFO : Connected to cluster named 'cluster1' (version: 8.11.0)
2025-06-09 22:19:35,528 INFO : Loading HuggingFace transformer tokenizer and model 'sentence-transformers/msmarco-MiniLM-L-12-v3'
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 594, in _get_config_dict
    resolved_config_file = cached_path(
  File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 1921, in cached_path
    output_path = get_from_cache(
  File "/usr/local/lib/python3.9/dist-packages/transformers/file_utils.py", line 2131, in get_from_cache
    raise OSError(
OSError: Distant resource does not have an ETag, we won't be able to reliably ensure reproducibility.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/eland_import_hub_model", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.9/dist-packages/eland/cli/eland_import_hub_model.py", line 235, in main
    tm = TransformerModel(
  File "/usr/local/lib/python3.9/dist-packages/eland/ml/pytorch/transformers.py", line 617, in __init__
    self._tokenizer = transformers.AutoTokenizer.from_pretrained(
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/tokenization_auto.py", line 484, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/configuration_auto.py", line 637, in from_pretrained
    config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 546, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/transformers/configuration_utils.py", line 630, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load config for 'sentence-transformers/msmarco-MiniLM-L-12-v3'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'sentence-transformers/msmarco-MiniLM-L-12-v3' is the correct path to a directory containing a config.json file

There is no conflicting directory that I can see. I was able to download the model separately, but Kibana doesn't recognize the download and I can't synchronize it.

Welcome to the Discuss forum,

Thank you for raising this issue. It seems like the model name has changed on Hugging Face. Could you try the following command to install the model?

    docker run -it --rm --network host \
    docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model \
    --url https://localhost:9200 -u elastic -p nonprodpwd \
    --hub-model-id sentence-transformers/msmarco-MiniLM-L12-v3 \
    --task-type text_embedding --insecure

The difference is that the "-" between "L" and "12" has been removed from the model name.
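
If the import still fails, one quick sanity check (assuming curl is available on your machine) is to request the model's config.json directly from Hugging Face. A 200 response or a redirect means the repository exists under that exact name, while a 404 means the name is wrong:

    curl -sI https://huggingface.co/sentence-transformers/msmarco-MiniLM-L12-v3/resolve/main/config.json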

Thank you! The updated command worked. I should have caught that, given everything else I tried.

I'm receiving the same error, and I checked the Hugging Face website to confirm the model name. I even downloaded the model and tried to load it offline, but the issue still exists. I also wrote a Python script to load the model, and that worked too.

This is my command:

docker run -it --rm --network host docker.elastic.co/eland/eland:8.9.0 eland_import_hub_model --url https://localhost:9200 -u elastic -p nonprodpwd --hub-model-id sentence-transformers/msmarco-MiniLM-L12-v3 --task-type text_embedding --insecure

Hello,

I hit the same problem today; you have to adjust the eland image version to make it work.

For me, it was:

docker run -it --rm --network host docker.elastic.co/eland/eland:8.12.0 eland_import_hub_model --url https://localhost:9200 -u elastic -p nonprodpwd --hub-model-id sentence-transformers/msmarco-MiniLM-L12-v3 --task-type text_embedding --insecure
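
Once the import finishes, it can be worth confirming that the model is registered in Elasticsearch before synchronizing it in Kibana. A minimal check, assuming the same lab credentials and the self-signed certificate from the course environment (hence the -k flag):

    curl -k -u elastic:nonprodpwd "https://localhost:9200/_ml/trained_models?pretty"

The imported model should show up with an ID along the lines of sentence-transformers__msmarco-minilm-l12-v3 (eland typically lowercases the hub name and replaces the slash).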

Hope it helps.

Ingrid