How to use LLM (access fail)

hi @alice86
About your first question:
Did you create an access token, and did you grant it permission for the relevant repository?

You can check this in your Hugging Face token settings (https://huggingface.co/settings/tokens):
Edit permissions → Repositories permissions

You need to run something like this (with your own token in place of the placeholder):

import torch
import transformers

model_id = "..."  # the gated model repo you were granted access to

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
    token="hf_xxxxxxx",  # your access token
)