How to implement a no-context Question-Answer model like PhilosopherAI?

Hello,
I would like to make something similar to philosopherai.com - i.e. generating answers to questions without providing any context.

How can I do that?
Thanks

Hi Federico, thanks for opening this topic. I was fascinated by this website, in particular its pricing and the claims in its FAQ!

I personally don’t think that using smaller versions of GPT-3 (or free versions, for that matter) diminishes the “wow-factor” of these text generation models. Here is how I would do it:
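Here is a minimal sketch of what that pipeline could look like, using the Hugging Face transformers library with the open-source EleutherAI/gpt-neo-1.3B checkpoint. The prompt wording and the sampling parameters are just illustrative assumptions on my part, not what PhilosopherAI actually uses:

```python
# A minimal sketch, assuming the Hugging Face `transformers` library and the
# open-source EleutherAI/gpt-neo-1.3B checkpoint. The prompt wording and the
# sampling parameters below are illustrative guesses, not what PhilosopherAI
# actually uses.
from transformers import pipeline

# Load a text-generation pipeline; the weights (~5 GB) are downloaded on first use.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

# No retrieved context documents: just a short framing plus the user's question.
question = "What is the meaning of life?"
prompt = (
    "The following is a thoughtful, philosophical answer to a question.\n"
    f"Question: {question}\n"
    "Answer:"
)

result = generator(
    prompt,
    max_new_tokens=150,  # length of the generated answer
    do_sample=True,      # sample instead of greedy decoding
    temperature=0.9,     # higher values give more creative answers
    top_p=0.95,          # nucleus sampling
)

print(result[0]["generated_text"])
```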

Well, that's not quite the whole story - once you have the pipeline running you will obviously want to play around with different models, different parameter values, some prompt engineering, etc. (see the sketch below). As a starting point you could also leverage the code from PhilosopherAI itself: philosopherai_demo/philosopher_demo.js at master · mayfer/philosopherai_demo · GitHub
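To give an idea of that kind of experimentation, here is a rough sketch that re-uses the generator and prompt from the snippet above and compares a few sampling temperatures. The values are arbitrary starting points, not recommendations:

```python
# Rough sketch: re-use `generator` and `prompt` from the snippet above and
# compare a few sampling temperatures (arbitrary starting points).
for temperature in (0.5, 0.9, 1.2):
    out = generator(
        prompt,
        max_new_tokens=100,
        do_sample=True,
        temperature=temperature,
        top_p=0.95,
    )
    print(f"--- temperature={temperature} ---")
    print(out[0]["generated_text"])
```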

That’s pretty much all there is to it :slight_smile:

Below is a screenshot to give you an idea of what an output from GPT-Neo 1.3B (a popular open-source alternative to GPT-3) looks like:

Hope that helps, reach out if you have any questions!

Cheers
Heiko