Using HuggingFace embeddings locally

Hi,
I'm trying to use nomic-ai/nomic-embed-text-v1.5 and have downloaded all of its files to my machine. I checked this post, Using Huggingface Embeddings completely locally, but I still can't figure it out, as none of the workarounds shown in the GitHub link in that thread (sorry, they only allow a single link per post) worked for me. I'm behind a proxy server that basically refuses connections to everything, so I can't use Transformers as it returns a proxy error. But I'm not sure why the code below needs access to huggingface.co:


returns

Hi,

The model card shows how to use the model entirely locally; see nomic-ai/nomic-embed-text-v1.5 · Hugging Face if you prefer Sentence Transformers, and nomic-ai/nomic-embed-text-v1.5 · Hugging Face if you prefer the Transformers library.

Here’s how to use it entirely locally:

from transformers import AutoTokenizer, AutoModel

# load the model and tokenizer from the hub
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

# save the tokenizer files as well as the model weights locally
tokenizer.save_pretrained("path_to_your_local_folder")
model.save_pretrained("path_to_your_local_folder")

# load from local storage
tokenizer = AutoTokenizer.from_pretrained("path_to_your_local_folder")
model = AutoModel.from_pretrained("path_to_your_local_folder")

# proceed with embedding text as shown in the model card
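
If the machine can never reach huggingface.co, it may also help to make the second load strictly offline. Below is a minimal sketch, assuming the files were already saved to "path_to_your_local_folder" as above; local_files_only=True tells from_pretrained to fail instead of attempting any download. Whether this works for this particular model also depends on the custom code pulled in by trust_remote_code already being present in the folder or in the local cache.

from transformers import AutoTokenizer, AutoModel

# load strictly from disk; raise an error instead of contacting the Hub
tokenizer = AutoTokenizer.from_pretrained("path_to_your_local_folder", local_files_only=True)
model = AutoModel.from_pretrained("path_to_your_local_folder", trust_remote_code=True, local_files_only=True)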

So that's what I tried to do as well, but it also gives me a proxy error:
requests.exceptions.ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /nomic-ai/nomic-bert-2048/resolve/main/tokenizer_config.json (Caused by ProxyError('Unable to connect to proxy', OSError('Tunnel connection failed: 403 Forbidden')))"),
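
The host in that traceback appears to be contacted because nomic-embed-text-v1.5 relies on custom code: its configuration points at the nomic-ai/nomic-bert-2048 repository, so even a load from a local folder can still trigger a lookup on the Hub for files from that repository. One possible workaround is to enable Hugging Face's offline mode before importing transformers, so that only files already in the local cache (or local folder) are used. This is only a sketch under the assumption that everything needed has already been downloaded; "path_to_your_local_folder" is the same placeholder as above.

import os

# force transformers / huggingface_hub to use only locally available files
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("path_to_your_local_folder")
model = AutoModel.from_pretrained("path_to_your_local_folder", trust_remote_code=True)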

Thank you for your suggestion.
The embedding model cannot be used with AutoModel. I tried your method and it did not work!