How to use an LLM (access fail)

Q1
I have already accepted the license, but the official instructions don't show any method to download the model. They just use it directly:

import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
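For context, here is one way to download the weights explicitly first. This is a sketch, not something from the official docs: it assumes the `huggingface_hub` package is installed, that the license has been accepted, and that an access token is available in the `HF_TOKEN` environment variable.

```python
import os

def download_model(repo_id, local_dir):
    """Download all files of a Hub repo into local_dir; returns the local path."""
    # Deferred import so the helper can be defined without huggingface_hub installed.
    from huggingface_hub import snapshot_download
    return snapshot_download(
        repo_id=repo_id,
        local_dir=local_dir,
        token=os.environ.get("HF_TOKEN"),  # required for gated repos such as Llama 3
    )

# Example (requires network access and an accepted license):
# download_model("meta-llama/Meta-Llama-3-8B-Instruct", "./model/llama3")
```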

Q2
This is a different question: if I have already downloaded the model, how do I load it without a token?

The behavior I expect is:

python test.py => automatically downloads the model, using the token

python test.py -m ./model/llama3 => uses the model I already downloaded, without any token

In fact, I don't see any example code that handles python test.py -m ./model/llama3.
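The two-mode test.py described above could be sketched roughly as follows. This is an illustrative assumption, not official code: the helper names (`resolve_model_source`, `build_pipeline`), the default repo id, and the `HF_TOKEN` environment variable are all mine. The idea is that a local directory passed via -m is loaded directly with no token, while anything else is treated as a Hub repo id that needs one.

```python
import argparse
import os

# Assumed default; any gated Hub repo id would work the same way.
DEFAULT_HUB_ID = "meta-llama/Meta-Llama-3-8B-Instruct"

def resolve_model_source(model_arg, default_hub_id=DEFAULT_HUB_ID):
    """Return (model_id_or_path, needs_token) for the given -m argument."""
    if model_arg and os.path.isdir(model_arg):
        # Local directory: transformers reads the files directly, no token used.
        return model_arg, False
    # Otherwise treat it as a Hub repo id; gated repos require a token.
    return model_arg or default_hub_id, True

def build_pipeline(model_source, needs_token):
    # Deferred imports so this file can be parsed without torch/transformers.
    import torch
    import transformers
    return transformers.pipeline(
        "text-generation",
        model=model_source,
        model_kwargs={"torch_dtype": torch.bfloat16},
        device_map="auto",
        # Only pass a token when loading a gated model from the Hub.
        token=os.environ.get("HF_TOKEN") if needs_token else None,
    )

# Guarded so the demo only runs when explicitly requested (it downloads weights).
if __name__ == "__main__" and os.environ.get("RUN_DEMO"):
    parser = argparse.ArgumentParser()
    parser.add_argument("-m", "--model", default=None,
                        help="local model directory or Hub repo id")
    args = parser.parse_args()
    source, needs_token = resolve_model_source(args.model)
    pipe = build_pipeline(source, needs_token)
    print(pipe("Hello")[0]["generated_text"])
```

With this layout, `python test.py` falls through to the Hub id and uses the token, while `python test.py -m ./model/llama3` hits the local-directory branch and never touches the token.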