How to implement a no-context Question-Answer model like PhilosopherAI?

Hello,
I would like to build something similar to philosopherai.com - i.e. generating answers to questions without providing any context.

How can I do that?
Thanks


Hi Federico, thanks for opening this topic. I was fascinated by this website, in particular its pricing and the claims in its FAQ!

I personally don’t think that using smaller versions of GPT-3 (or free versions, for that matter) diminishes the “wow-factor” of these text generation models. Here is how I would do it:
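A minimal sketch using the Hugging Face `transformers` text-generation pipeline with the GPT-Neo 1.3B checkpoint (the prompt wording below is illustrative, not the actual PhilosopherAI prompt):

```python
from transformers import pipeline

# Load a text-generation pipeline with an open-source GPT-3 alternative.
# Swap in a smaller checkpoint like "distilgpt2" for quick experiments.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

# PhilosopherAI-style usage: wrap the user's topic in a free-form prompt
topic = "the meaning of life"
prompt = f"Here is a short philosophical essay about {topic}:\n\n"

result = generator(prompt, max_new_tokens=80, do_sample=True, temperature=0.9)
print(result[0]["generated_text"])
```

No context passage is needed - the model generates the answer purely from what it learned during pretraining.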

Well, not quite - once you have the pipeline running you will obviously want to play around with different models, different parameter values, some prompt engineering, etc. As a starting point you could leverage the code from PhilosopherAI itself: philosopherai_demo/philosopher_demo.js at master · mayfer/philosopherai_demo · GitHub
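To illustrate that kind of experimentation, here is a sketch that compares a few sampling settings side by side (the model choice and the specific temperature/top-p values are just examples, and `distilgpt2` is used only to keep the run lightweight):

```python
from transformers import pipeline, set_seed

# Any causal LM checkpoint works here; distilgpt2 keeps the example small.
generator = pipeline("text-generation", model="distilgpt2")
set_seed(42)  # make runs reproducible while comparing settings

prompt = "Write a short philosophical reflection on friendship:\n\n"

# Try a few sampling configurations and eyeball the differences
for temperature, top_p in [(0.7, 0.9), (1.0, 0.95), (1.2, 1.0)]:
    out = generator(
        prompt,
        max_new_tokens=60,
        do_sample=True,
        temperature=temperature,
        top_p=top_p,
    )
    print(f"--- temperature={temperature}, top_p={top_p} ---")
    print(out[0]["generated_text"])
```

Lower temperatures tend to give more conservative, repetitive text; higher values give more surprising (and sometimes incoherent) output, which is part of tuning the "wow-factor".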

That’s pretty much all there is to it 🙂

Below is a screenshot to give you an idea of what output from GPT-Neo 1.3B (a popular open-source alternative to GPT-3) looks like:

Hope that helps - reach out if you have any questions!

Cheers
Heiko


Hello! I am fascinated by your answer.
I am really looking for a model (from Hugging Face) that can take in data from my specific domain, so that I can then ask questions of that model.

The model should answer my questions from the knowledge it learned from that data. I mean, is it possible to “just ask”, without providing a context at question time?

What process should I follow?
I am a bit new to Hugging Face, and I am looking for some kind of “generative question answering” instead of “extractive question answering”.

Thank you!