eland_import_hub_model --url https://elastic:changeme@127.0.0.1:9200 --hub-model-id sentence-transformers/clip-ViT-B-32-multilingual-v1 --task-type text_embedding --start --ca-certs /www/elasticsearch_0806/elasticsearch-8.6.0/config/certs/http_ca.crt
2023-05-30 23:37:40,434 INFO : Establishing connection to Elasticsearch
2023-05-30 23:37:40,450 INFO : Connected to cluster named 'elasticsearch' (version: 8.6.0)
2023-05-30 23:37:40,450 INFO : Loading HuggingFace transformer tokenizer and model 'sentence-transformers/clip-ViT-B-32-multilingual-v1'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2007, in from_pretrained
    resolved_archive_file = cached_path(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 284, in cached_path
    output_path = get_from_cache(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 554, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/eland_import_hub_model", line 197, in <module>
    tm = TransformerModel(args.hub_model_id, args.task_type, args.quantize)
  File "/usr/local/lib/python3.10/site-packages/eland/ml/pytorch/transformers.py", line 578, in __init__
    self._traceable_model = self._create_traceable_model()
  File "/usr/local/lib/python3.10/site-packages/eland/ml/pytorch/transformers.py", line 686, in _create_traceable_model
    model = _SentenceTransformerWrapperModule.from_pretrained(self._model_id)
  File "/usr/local/lib/python3.10/site-packages/eland/ml/pytorch/transformers.py", line 291, in from_pretrained
    model = AutoModel.from_pretrained(model_id, torchscript=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 446, in from_pretrained
    return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2088, in from_pretrained
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like sentence-transformers/clip-ViT-B-32-multilingual-v1 is not the path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'
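The traceback suggests the host running eland cannot reach https://huggingface.co, so transformers falls back to the local cache and finds nothing there. A minimal sketch of one possible workaround, assuming the huggingface_hub package is installed and a machine with internet access is available: pre-download the model into the local Hugging Face cache on that machine first, so a later from_pretrained() call can resolve the files without the network. The repo_id below comes from the command at the top; everything else is illustrative.

from huggingface_hub import snapshot_download

# Pre-fetch the model files into the local Hugging Face cache so that a later
# from_pretrained() call can find them without contacting huggingface.co.
cache_path = snapshot_download(
    repo_id="sentence-transformers/clip-ViT-B-32-multilingual-v1"
)
print("Model snapshot cached at:", cache_path)

Once the files are cached (or copied to the host running eland), the offline-mode page linked in the error describes setting TRANSFORMERS_OFFLINE=1 so transformers only reads from the cache.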
Please see Dec 10th, 2022: [EN] Asking top notch technical questions to get you help quicker! for the best approach to asking questions so you get the assistance you need. Simply posting some code output with no further context isn't very helpful, sorry to say.