Using the serverless Inference API with the following request, as per the documentation:
curl https://api-inference.huggingface.co/models/microsoft/DialoGPT-large \
  -X POST \
  -d '{
        "inputs": {
          "past_user_inputs": ["Which movie is the best ?"],
          "generated_responses": ["It is Die Hard for sure."],
          "text": "Can you explain why ?"
        }
      }' \
  -H "Authorization: Bearer <API Key>"
I get the following response:
{
  "error": "unknown error",
  "warnings": [
    "There was an inference error: unknown error: can only concatenate str (not \"dict\") to str"
  ]
}
Update: I managed to get a response by backslash-escaping the quotes in the body, however:
curl https://api-inference.huggingface.co/models/microsoft/DialoGPT-large \
-X POST \
-d '{\"inputs\": {\"past_user_inputs\": [\"Which movie is the best ?\"], \"generated_responses\": [\"It is Die Hard for sure.\"], \"text\":\"Can you explain why ?\"}}' \
-H "Authorization: Bearer <API Key>"
The response is:
[{"generated_text":"{\\\"inputs\\\": {\\\"past_user_inputs\\\": [\\\"Which movie is the best ?\\\"], \\\"generated_responses\\\": [\\\"It is Die Hard for sure.\\\"], \\\"text\\\":\\\"Can you explain why ?\\\"}}ourse"}]
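Note what happened here: inside a single-quoted shell string, `\"` is passed through literally, so the API received the backslash-escaped text as a raw prompt and simply echoed it back in `generated_text` (with a few extra characters generated at the end). Building the body programmatically sidesteps shell-escaping entirely; a minimal sketch in Python, using the same payload as the original request:

```python
import json

# The conversational payload from the original curl request
payload = {
    "inputs": {
        "past_user_inputs": ["Which movie is the best ?"],
        "generated_responses": ["It is Die Hard for sure."],
        "text": "Can you explain why ?",
    }
}

# json.dumps handles all quoting, so no manual backslash-escaping is needed
body = json.dumps(payload)
print(body)
```

With the `requests` library, `requests.post(url, headers=headers, json=payload)` does this serialization for you.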
I gave up on using Hugging Face for this use case; the endpoint was unusable, even after reaching their support, who replied:
Thanks for your patience while we looked into this! We made a few changes on our side, where conversational has been changed to raw text-generation instead. We’re in the process of updating this and the documentation
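Given that reply, the endpoint presumably now expects a plain text-generation payload, i.e. "inputs" as a single string rather than the nested conversational object. DialoGPT's model card formats multi-turn dialogue by joining turns with the model's EOS token, so a hedged sketch of building such a body (whether the hosted endpoint applies any further formatting of its own is my assumption to the contrary):

```python
import json

EOS = "<|endoftext|>"  # DialoGPT's model card joins dialogue turns with the EOS token

turns = [
    "Which movie is the best ?",   # past user input
    "It is Die Hard for sure.",    # generated response
    "Can you explain why ?",       # new user message
]

# Flatten the conversation into one raw text-generation prompt
prompt = EOS.join(turns) + EOS
body = json.dumps({"inputs": prompt})
print(body)
```

The resulting body can then be sent with the same curl command (or any HTTP client) in place of the nested conversational JSON.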
So I have concluded that DialoGPT is not usable for conversational use cases via the Hugging Face Inference API, although the model itself clearly still works, since their chat demo does.