Hello everyone,
I have been trying to use Llama 2 with the following code:
```python
from langchain.llms import HuggingFaceHub

# hugging_face_token is assumed to hold a valid Hugging Face API token
model_kwargs = {'temperature': 0.6, 'max_length': 64}
llm = HuggingFaceHub(
    repo_id='meta-llama/Llama-2-7b-chat',
    huggingfacehub_api_token=hugging_face_token,
    model_kwargs=model_kwargs,
)
name = llm('I want to open an Italian restaurant, suggest me a name for this')
print(name)
```
I have been granted access by both Meta and Hugging Face, but I still cannot run the model.
The problem is the same when I use the meta-llama/Llama-2-7b-chat-hf version; in that case it says that I must have a PRO subscription.
Is there a way to fix it?
Many thanks.
Llama 2 doesn't seem to be supported by the free Inference API, so you may have to pay to use this specific version of the model. Do you already have a PRO account? Otherwise, I don't believe there's any way to bypass the issue.