[Not working] QA inference API and conv-ai

The Conversational AI and QA inference APIs are not working in my browser. Can anyone please check them out?

The “Loading” message persists and nothing happens after it. This is the first time I am using your web Inference API. Am I missing something?

That might be a fluke; it's working for me. Can you try again?

cc @julien-c

@valhalla the bot is not replying for me. Anyway, I tried running the backend script in Colab here, and it works fine, although in some cases the bot starts questioning and answering itself. Inference time is good in Colab.
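One plausible cause of the bot "questioning and answering itself" is that generation runs past the end-of-turn marker, so the sampled text includes a simulated next user turn. A minimal post-processing sketch (the token names and `truncate_reply` helper are illustrative assumptions, not the actual script's API):

```python
# Hypothetical fix sketch: cut the generated reply at the first end-of-turn
# or speaker-change token, so the bot's output never contains a fake user turn.
STOP_TOKENS = ("<eos>", "<speaker1>", "<speaker2>")  # assumed special tokens

def truncate_reply(generated_tokens, stop_tokens=STOP_TOKENS):
    """Return the generated tokens up to (but excluding) the first stop token."""
    reply = []
    for token in generated_tokens:
        if token in stop_tokens:
            break
        reply.append(token)
    return reply

# Example: the model kept going and started a new "user" turn after <speaker1>.
print(truncate_reply(["i", "like", "tennis", "<speaker1>", "do", "you", "?"]))
# → ['i', 'like', 'tennis']
```

The same effect can usually be achieved at decoding time by passing the end-of-turn token id as the stop/eos token to the generation loop.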

Hi, the QA inference is working again; it must have been a temporary hiccup.

The ConvAI demo, on the other hand, is currently down because the server crashed. What we should do is merge this model and its conversational API into the Inference API (cc @mfuntowicz @Narsil).


Hey @julien-c, just a quick question: can we improve the 2019 conv-ai model using RAG or DialoGPT, which were released very recently?

@thomwolf can chime in, but our 2019 ConvAI model takes a slightly different approach from what DialoGPT did a few months later. We used personality embeddings, etc.; check the paper for details.

If you need to parametrize by personality, our ConvAI model might be better.
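As a rough illustration of how personality conditioning works in that style of model, here is a minimal sketch of building one input sequence from persona facts plus dialogue history, with a parallel "token type" sequence marking who each token belongs to. The special-token names and the `build_inputs` helper are illustrative assumptions; see the actual repo/paper for the real implementation:

```python
# Sketch of TransferTransfo-style input construction (illustrative, not the
# repo's actual code): the persona sentences and alternating dialogue turns
# are concatenated into one sequence, and a parallel token-type sequence
# tells the model which speaker (or the persona) each token came from.
BOS, EOS, SPEAKER1, SPEAKER2 = "<bos>", "<eos>", "<speaker1>", "<speaker2>"

def build_inputs(persona, history, reply):
    """persona: list of persona sentences; history: utterances oldest-first,
    ending with the user's last message; reply: the bot's candidate answer."""
    sequence = [BOS] + " ".join(persona).split()
    token_types = [SPEAKER2] * len(sequence)  # persona tagged as the bot's side
    turns = history + [reply]
    for i, utterance in enumerate(turns):
        # The final turn is the bot's reply, so count speaker parity backward.
        speaker = SPEAKER2 if (len(turns) - 1 - i) % 2 == 0 else SPEAKER1
        tokens = [speaker] + utterance.split()
        sequence += tokens
        token_types += [speaker] * len(tokens)
    sequence.append(EOS)
    token_types.append(SPEAKER2)
    return sequence, token_types

seq, types = build_inputs(["i like tennis ."], ["hello !"], "hi , i like tennis .")
```

Swapping in a different persona changes the prefix of the sequence, which is how the same model can be "parametrized by personality" at inference time.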

Hello @julien-c, is the ConvAI demo still down? I can’t seem to get a response from the AI.
Thank you, sir.

Hi @Confleis, yes, that demo is still down. We’ll merge it into our current Inference infrastructure at some point, but I don’t currently have an ETA.

You should be able to run it locally pretty easily though (including on Colab).


Great! Thank you for the quick reply, @julien-c.