Unable to download models from HF with from_pretrained()

I am trying to download the model TheBloke/Llama-2-7b-Chat-GGUF with transformers with the code presented in the model card:

# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("TheBloke/Llama-2-7b-Chat-GGUF")

This produces an error message:

Traceback (most recent call last):
File "C:\Users\nikoo\PycharmProjects\langmodel\main.py", line 3, in <module>
model = AutoModel.from_pretrained("TheBloke/Llama-2-7b-Chat-GGUF")
File "C:\Users\nikoo\PycharmProjects\langmodel\venv\lib\site-packages\transformers\models\auto\auto_factory.py", line 563, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\nikoo\PycharmProjects\langmodel\venv\lib\site-packages\transformers\modeling_utils.py", line 2846, in from_pretrained
raise EnvironmentError(
OSError: TheBloke/Llama-2-7b-Chat-GGUF does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

I understand that it is looking for one of those weight files, but none of them exist in the repository — the repo only contains .gguf files.

I am running in a virtual environment with the latest PyTorch and transformers 4.33.2 installed.
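For what it's worth, GGUF is a quantized format meant for llama.cpp-based runtimes rather than for transformers, which may be why from_pretrained() can't find any PyTorch/TF/Flax weights. A minimal sketch of the alternative route, assuming llama-cpp-python and huggingface_hub are installed and that the repo contains the quantization file named below (the exact filename is an assumption — check the repo's Files tab):

```python
# Sketch: download one GGUF file from the Hub and load it with
# llama-cpp-python instead of transformers.
# Assumes: pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download

REPO_ID = "TheBloke/Llama-2-7b-Chat-GGUF"
# Hypothetical choice of quantization; verify the filename on the Hub.
GGUF_FILE = "llama-2-7b-chat.Q4_K_M.gguf"

def download_gguf(repo_id: str = REPO_ID, filename: str = GGUF_FILE) -> str:
    """Fetch a single GGUF file from the Hub and return its local path."""
    return hf_hub_download(repo_id=repo_id, filename=filename)

if __name__ == "__main__":
    # llama_cpp reads GGUF directly; transformers 4.33.2 cannot.
    from llama_cpp import Llama
    llm = Llama(model_path=download_gguf(), n_ctx=2048)
    print(llm("Q: What is GGUF? A:", max_tokens=32))
```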


Did you ever get this figured out? I’m stuck on the same thing.

I am facing the same issue. Has anyone figured it out yet?