Access issues for gated repos

hi @sanchitamore
This should work. In fact, if you log in (e.g. with huggingface-cli login), you don't even need the token parameter.

from transformers import pipeline

class LLM:
    def __init__(self, model_name, auth_token=None):
        # Mistral is a causal LM, so the pipeline task is 'text-generation'.
        # No token is passed here: your saved login credentials are picked up automatically.
        self.model = pipeline('text-generation', model=model_name)

    def predict(self, prompt, **kwargs):
        return self.model(text_inputs=prompt, **kwargs)[0]["generated_text"]

model = LLM(model_name="mistralai/Mistral-7B-Instruct-v0.3")
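
For example, after accepting the model's license on its page and logging in, something like this should work (max_new_tokens is just an illustrative generation kwarg, not part of your original code):

print(model.predict("Write a haiku about autumn.", max_new_tokens=64))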

Can you please run huggingface-cli whoami and double-check the token's repo permissions in your Hugging Face account settings at https://huggingface.co/settings/tokens (Edit permissions → Repositories permissions)?

You should see the repo name in that list.
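
If you prefer to check from Python, here is a minimal sketch using huggingface_hub (whoami and model_info are standard huggingface_hub functions; whether they succeed depends on your saved token):

from huggingface_hub import whoami, model_info

print(whoami()["name"])  # confirms which account your saved token belongs to
# Raises a GatedRepoError if the token cannot access the gated repo
info = model_info("mistralai/Mistral-7B-Instruct-v0.3")
print(info.id)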

Whether you are logged in or not, this will work:

from transformers import pipeline

class LLM:
    def __init__(self, model_name, token=None):
        # Passing the token explicitly means stored login credentials are not required
        self.model = pipeline('text-generation', model=model_name, token=token)

    def predict(self, prompt, **kwargs):
        return self.model(text_inputs=prompt, **kwargs)[0]["generated_text"]

model = LLM(model_name="mistralai/Mistral-7B-Instruct-v0.3", token="your_token_should_be_here")
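
Rather than hard-coding the token, you can read it from an environment variable; HF_TOKEN is the one huggingface_hub itself looks for, so this is just a small sketch of that pattern:

import os

# HF_TOKEN is also picked up automatically by huggingface_hub when set,
# but passing it explicitly keeps the intent obvious
model = LLM(model_name="mistralai/Mistral-7B-Instruct-v0.3", token=os.environ.get("HF_TOKEN"))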