LangChain and Hugging Face Endpoints

Hi all,

I am trying to replicate an example from the LangChain documentation:

from langchain_huggingface import HuggingFaceEndpoint
from langchain_core.prompts import PromptTemplate

question = "Who won the FIFA World Cup in the year 1994?"
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)

repo_id = "deepseek-ai/DeepSeek-R1-0528"

llm = HuggingFaceEndpoint(
    repo_id=repo_id,
    max_new_tokens=128,
    temperature=0.5,
    huggingfacehub_api_token=HUGGINGFACEHUB_API_TOKEN,
    provider="auto",  # set your provider here: hf.co/settings/inference-providers
    # provider="hyperbolic",
    # provider="nebius",
    # provider="together",
)
llm_chain = prompt | llm
print(llm_chain.invoke({"question": question}))

When I run it, I receive this error message:
ValueError: Model deepseek-ai/DeepSeek-R1-0528 is not supported for task text-generation and provider together. Supported task: conversational.

I have tried different models and different inference providers, always with text generation as the task, but I keep receiving the same error message.

Thank you very much for the support!


It seems that at this point, we need to wrap the endpoint in a chat model: the error says the provider only serves this model through the conversational (chat-completion) task, not plain text-generation, so the raw `HuggingFaceEndpoint` call is rejected.
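A minimal sketch of that approach, wrapping the endpoint in `ChatHuggingFace` from the same `langchain_huggingface` package so requests go through the chat-completion API (`HUGGINGFACEHUB_API_TOKEN` is a placeholder for your own token, and parameter values are just illustrative):

```python
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint
from langchain_core.prompts import ChatPromptTemplate

llm = HuggingFaceEndpoint(
    repo_id="deepseek-ai/DeepSeek-R1-0528",
    max_new_tokens=128,
    temperature=0.5,
    huggingfacehub_api_token=HUGGINGFACEHUB_API_TOKEN,  # placeholder: your HF token
    provider="auto",
)

# ChatHuggingFace wraps the endpoint so the request is sent as a
# chat-completion (conversational) call instead of raw text-generation.
chat = ChatHuggingFace(llm=llm)

prompt = ChatPromptTemplate.from_template(
    "Question: {question}\nAnswer: Let's think step by step."
)

chain = prompt | chat
# invoke() returns an AIMessage, so read the reply from .content
print(chain.invoke({"question": "Who won the FIFA World Cup in the year 1994?"}).content)
```

Note that, unlike the plain endpoint, the chat model returns an `AIMessage` object rather than a string, so the generated text is in its `.content` attribute.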