Getting error: AttributeError: 'InferenceClient' object has no attribute 'post'

Hi all, below is my code:

from langchain.llms import HuggingFaceHub
from langchain import PromptTemplate, LLMChain

repo_id = "mistralai/Mistral-7B-Instruct-v0.2"

# assumed llm instantiation (the API token is a placeholder):
llm = HuggingFaceHub(repo_id=repo_id, huggingfacehub_api_token="your_token")

response = llm.invoke("what is the capital of USA")
print(response)

Error below:

AttributeError: 'InferenceClient' object has no attribute 'post'


Ongoing issue.

Solution for SentenceTransformers:

Any solution, guys?

Kindly please provide a solution.


Hmm… For example, for langchain: I am getting this error with langchain too.

same


Hmm… It seems that this cannot be resolved with a user patch…
I think you need to raise an issue to resolve this.
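In the meantime, a quick diagnostic (a minimal sketch; the version boundary mentioned in the comments is an assumption) is to check whether your installed huggingface_hub still exposes the deprecated InferenceClient.post that the older LangChain wrapper calls:

import huggingface_hub
from huggingface_hub import InferenceClient

# Older LangChain wrappers call InferenceClient.post() internally; newer
# huggingface_hub releases have dropped that deprecated method.
print("huggingface_hub version:", huggingface_hub.__version__)
print("InferenceClient.post available:", hasattr(InferenceClient, "post"))

# If this prints False, either update the LangChain Hugging Face integration
# or pin huggingface_hub to an older release that still has .post, e.g.
#   pip install "huggingface_hub<0.31"   # assumed boundary; adjust as needed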

Still no solution for this?

Kindly resolve: AttributeError: 'InferenceClient' object has no attribute 'post'


Seems WIP?

It may be possible to fix this on the langchain side, but I don’t think it has been done yet.

You’re using InferenceClient which doesn’t have a .post() method. LangChain expects HuggingFaceHub for this to work.

Swap it to:

from langchain.llms import HuggingFaceHub
llm = HuggingFaceHub(repo_id="mistralai/Mistral-7B-Instruct-v0.2", huggingfacehub_api_token="your_token")

That’ll stop the .post() error.

Solution provided by Triskel Data Deterministic AI
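If HuggingFaceHub itself is deprecated or unavailable in your LangChain version, here is a minimal alternative sketch, assuming the langchain-huggingface package is installed and recent enough that its wrapper no longer calls the removed .post() (the token value is a placeholder):

from langchain_huggingface import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    task="text-generation",
    max_new_tokens=128,
    huggingfacehub_api_token="your_token",  # placeholder, use your own token
)

print(llm.invoke("what is the capital of USA"))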


Your solution is NOT working.



Dear respected brother, I never denigrated your solution… When I ran your code on Colab, it showed bugs and errors…
I mean no offense… Please kindly run your solution and check whether it works before posting it online…

Good day, bro
