Using Langchain ChatHuggingface with Text Generation Inference: missing field `inputs`

Okay, the solution was pretty trivial.
I changed the endpoint_url to `http://localhost:8080/v1/chat/completions`.
It now works with messages.
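
For reference, here is a minimal sketch of the kind of setup this applies to. I'm assuming the `langchain_huggingface` integration with `HuggingFaceEndpoint` wrapped by `ChatHuggingFace`; the model id and generation parameters are just placeholders, so adjust them to whatever your TGI server is actually serving:

```python
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# Point the endpoint at TGI's Messages API path instead of the base URL,
# so requests go to /v1/chat/completions rather than /generate
# (which expects an "inputs" field and raises the error in the title).
llm = HuggingFaceEndpoint(
    endpoint_url="http://localhost:8080/v1/chat/completions",
    max_new_tokens=512,
    temperature=0.7,
)

# model_id here is purely illustrative -- use the model your server hosts.
chat = ChatHuggingFace(llm=llm, model_id="meta-llama/Llama-3.1-8B-Instruct")

response = chat.invoke(
    [
        SystemMessage(content="You are a helpful assistant."),
        HumanMessage(content="What is Text Generation Inference?"),
    ]
)
print(response.content)
```
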

I guess using the official Inference API from Hugging Face picks the correct URL for you, but when you self-host you have to point `endpoint_url` at `/v1/chat/completions` explicitly in order to use the Messages API. Otherwise it falls back to the `/generate` endpoint, which expects an `inputs` field instead of `messages` (hence the "missing field `inputs`" error).
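
To make the difference between the two TGI endpoints concrete, here is a quick sketch of the raw payloads as I understand them from the TGI docs (the `model` value is just a placeholder, since a TGI server only serves one model anyway):

```python
import requests

base = "http://localhost:8080"

# The default /generate endpoint expects a raw prompt under "inputs".
generate_payload = {
    "inputs": "What is Deep Learning?",
    "parameters": {"max_new_tokens": 50},
}
r = requests.post(f"{base}/generate", json=generate_payload)
print(r.json())  # {"generated_text": "..."}

# The Messages API at /v1/chat/completions expects OpenAI-style "messages".
chat_payload = {
    "model": "tgi",  # placeholder; TGI serves a single model
    "messages": [{"role": "user", "content": "What is Deep Learning?"}],
    "max_tokens": 50,
}
r = requests.post(f"{base}/v1/chat/completions", json=chat_payload)
print(r.json()["choices"][0]["message"]["content"])
```

So if the client library ends up sending `messages` to `/generate`, TGI complains about the missing `inputs` field, which is exactly what was happening here.
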