How to use Dogge/llama-3-70B-instruct-uncensored with the Transformers library

After installing Git LFS and cloning the repository in PowerShell 7.4.2, I tried entering "from transformers import AutoTokenizer, AutoModelForCausalLM" exactly as shown in the "Use this model" → "Transformers" popup, but PowerShell reports that "from" is not a recognized command. What commands should I use to install and run the model locally?
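
For reference, my guess is that the popup snippet is Python code meant to go in a Python script or interpreter rather than be typed into PowerShell. Below is a minimal sketch of what I think a full script might look like, assuming the transformers, torch, and accelerate packages are installed with pip and the repo id is Dogge/llama-3-70B-instruct-uncensored. The generation part (chat template, max_new_tokens) is my own guess, not from the popup — is something like this the right approach?

```python
# Run with: python run_model.py  (not pasted line-by-line into PowerShell)
# Assumes: pip install transformers torch accelerate
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Dogge/llama-3-70B-instruct-uncensored"

# Download (or reuse the cached) tokenizer and weights from the Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # needs accelerate; spreads the 70B weights across available devices
)

# Build a chat-formatted prompt and generate a reply
messages = [{"role": "user", "content": "Hello, who are you?"}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```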