repo_id = "mistralai/Mistral-7B-Instruct-v0.3"
llm = HuggingFaceEndpoint(repo_id=repo_id, max_length=150, temperature=0.7, token=os.getenv("HF_TOKEN"))
llm.invoke("What is machine learning")
Error:
TypeError: InferenceClient.text_generation() got an unexpected keyword argument 'max_length'
I don't know how to fix this. I've tried everything; it's just not working.
Try replacing max_length with max_new_tokens — the underlying InferenceClient.text_generation() call accepts max_new_tokens, not max_length:
#llm=HuggingFaceEndpoint(repo_id=repo_id,max_length=150,temperature=0.7,token=os.getenv("HF_TOKEN"))
llm=HuggingFaceEndpoint(repo_id=repo_id,max_new_tokens=150,temperature=0.7,token=os.getenv("HF_TOKEN"))
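To see why the original call fails, you can check the signature of the method that HuggingFaceEndpoint forwards its generation kwargs to. A small sketch (assuming huggingface_hub is installed locally; no API token or network call is needed just to inspect the signature):

```python
import inspect

from huggingface_hub import InferenceClient

# Parameters that InferenceClient.text_generation() actually accepts.
params = inspect.signature(InferenceClient.text_generation).parameters

print("max_length" in params)      # not accepted -> the TypeError above
print("max_new_tokens" in params)  # accepted -> use this instead
```

Since max_length isn't in the accepted parameters, passing it through HuggingFaceEndpoint raises the TypeError; max_new_tokens is the supported way to cap the generated length.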