Getting error >> AttributeError: 'InferenceClient' object has no attribute 'post'

Hi all, below is my code:

from langchain.llms import HuggingFaceHub
from langchain import PromptTemplate, LLMChain

repo_id = "mistralai/Mistral-7B-Instruct-v0.2"
llm = HuggingFaceHub(repo_id=repo_id)  # the llm construction was missing from the snippet as posted

response = llm.invoke("what is the capital of USA")
print(response)

Error below:
AttributeError: 'InferenceClient' object has no attribute 'post'

1 Like

Ongoing issue.

Solution for SentenceTransformers:

Any solution, guys??

Kindly please provide a solution.

1 Like

Hmm… For example, for langchain:

I am getting this error on langchain

Same here.

1 Like

Hmm… It seems that this cannot be resolved with a user patch…
I think you need to raise an issue to resolve this.
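As a quick sanity check before filing that issue, you can confirm the root cause locally (a sketch; it assumes huggingface_hub is installed in your environment):

```python
import huggingface_hub
from huggingface_hub import InferenceClient

# On huggingface_hub releases where the low-level .post() helper has been
# removed, has_post is False -- and .post() is exactly what the old
# LangChain wrapper still tries to call, hence the AttributeError.
has_post = hasattr(InferenceClient(), "post")
print(huggingface_hub.__version__, has_post)
```

If this prints False, the error is a version mismatch between your installed huggingface_hub and the LangChain wrapper, not a bug in your own code.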

Still no solution for this???

Kindly resolve: AttributeError: 'InferenceClient' object has no attribute 'post'

1 Like

Seems WIP?

It may be possible to fix this on the langchain side, but I don’t think it has been done yet.

You’re using InferenceClient, which doesn’t have a .post() method. LangChain expects HuggingFaceHub for this to work.

Swap it to:

from langchain.llms import HuggingFaceHub
llm = HuggingFaceHub(repo_id="mistralai/Mistral-7B-Instruct-v0.2", huggingfacehub_api_token="your_token")

That’ll stop the .post() error.

Solution provided by Triskel Data Deterministic AI

1 Like

Your solution is NOT working.

1 Like

Dear respected brother, I never denigrated your solution… When I ran your code on Colab, it showed bugs and errors…
I mean no offense… Please kindly run your solution first and check whether it works before posting it online…

Good day, bro

2 Likes

I also wasted a lot of time on this… I was trying a lot and found that everyone is struggling with the same issue.

1 Like

I’m also getting the same error:
AttributeError: 'InferenceClient' object has no attribute 'post'
The code snippet is shared below:

import os
from langchain import HuggingFaceHub, LLMChain
from langchain.prompts import PromptTemplate

os.environ["HF_TOKEN"] = "my_token"

llm = HuggingFaceHub(repo_id="mistralai/Mistral-7B-Instruct-v0.2")

prompt = PromptTemplate(
    input_variables=["product"],
    template="What is the best name for a startup of {product}?"
)

chain = LLMChain(prompt=prompt, llm=llm, verbose=True)
print(chain.run("camera"))

1 Like


Can anyone help me with this issue?

1 Like

Use the code below; this will resolve it and you can prompt:

from langchain_huggingface import HuggingFaceEndpoint  # or: from langchain_community.llms import HuggingFaceEndpoint

repo_id = "mistralai/Mistral-7B-Instruct-v0.3"

llm = HuggingFaceEndpoint(
    repo_id=repo_id,
    max_new_tokens=512,
    top_k=10,
    top_p=0.95,
    typical_p=0.95,
    temperature=0.01,
    repetition_penalty=1.03,
    huggingfacehub_api_token=sec_key  # sec_key holds your Hugging Face API token
)

print(llm.invoke("write a poem on deep learning, for children age 10"))

1 Like


There is an issue with the HuggingFaceEndpoint package; importing it from langchain_community works:

#from langchain_huggingface import HuggingFaceEndpoint
from langchain_community.llms import HuggingFaceEndpoint
from langchain import PromptTemplate, LLMChain
import os

repo_id="mistralai/Mistral-7B-Instruct-v0.3"

llm = HuggingFaceEndpoint(
    repo_id=repo_id,
    max_new_tokens=512,
    top_k=10,
    top_p=0.95,
    typical_p=0.95,
    temperature=0.01,
    repetition_penalty=1.03,
    huggingfacehub_api_token="hf_*****"
)

print(llm.invoke("write a poem on deep learning , for children age 10 "))

1 Like

I am facing the same issue as well.

1 Like

InferenceClient.post has been completely deprecated, so older library implementations that call it will not work.
We need to find a way around this.
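A possible way around it (a sketch, not confirmed in this thread): either pin huggingface_hub to a release that still ships .post(), or bypass the broken wrapper and call the client's typed chat_completion method, which is the supported replacement. The model name is taken from the posts above; reading the token from an HF_TOKEN environment variable, and the version threshold in the pin comment, are assumptions.

```python
# Stopgap: downgrade to a huggingface_hub release that predates the removal
# (the exact version boundary is an assumption -- check the changelog):
#   pip install "huggingface_hub<0.31"
#
# Longer term: call the typed replacement for .post() directly.
import os
from huggingface_hub import InferenceClient

def ask(prompt: str, model: str = "mistralai/Mistral-7B-Instruct-v0.3") -> str:
    """Query the serverless Inference API via chat_completion,
    the supported replacement for the removed low-level .post()."""
    client = InferenceClient(model=model, token=os.environ.get("HF_TOKEN"))
    out = client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=64,
    )
    return out.choices[0].message.content

if os.environ.get("HF_TOKEN"):  # only hit the network when a token is configured
    print(ask("What is the capital of the USA?"))
```

This sidesteps LangChain entirely; once the wrapper is fixed upstream, the HuggingFaceEndpoint route shown earlier in the thread should work again.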