I was wondering whether it is possible to generate an answer template from the question.
The scenario is like this: my system will be given a large text (e.g. one on a topic like MongoDB) and asked a simple question (like “When was MongoDB created?”), and it is expected to provide a somewhat descriptive answer (like “MongoDB was created in 2007” rather than just “2007”).
For that, I am planning to use two different models:
- one for actually finding the answer
- the other for generating the human-readable phrasing
The idea is to:
- generate the answer template from the posed question, with the key slot left blank (like “MongoDB was created in ___”)
- then ask the first model to actually extract the required information from the text provided
- then plug the extracted data into the blank slot of the template
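To make the idea concrete, here is a minimal sketch of the pipeline I have in mind. Everything in it is a placeholder of my own invention: the template is produced by a naive regex rewrite of the question (a real system would likely use a seq2seq model), and the “extractor” just pulls the first four-digit year from the text instead of running an actual extractive QA model.

```python
import re

def question_to_template(question: str) -> str:
    """Naive rule-based rewrite: turns a 'When was X <verb>?' question
    into a declarative template with a blank slot. A real system would
    use a learned model here instead of hand-written rules."""
    q = question.strip().rstrip("?")
    m = re.match(r"(?i)when was (.+) (\w+)$", q)
    if m:
        subject, verb = m.groups()
        return f"{subject} was {verb} in ___"
    return "___"  # fall back to a bare slot

def extract_answer(context: str, question: str) -> str:
    """Stand-in for the extractive QA model: for 'when' questions,
    grab the first four-digit year found in the context."""
    if question.lower().startswith("when"):
        m = re.search(r"\b(\d{4})\b", context)
        if m:
            return m.group(1)
    return ""

def answer(context: str, question: str) -> str:
    # Step 1: template from the question; step 2: extract the span;
    # step 3: plug the span into the blank slot.
    template = question_to_template(question)
    span = extract_answer(context, question)
    return template.replace("___", span)

context = "MongoDB is a document database. Development began in 2007."
print(answer(context, "When was MongoDB created?"))
# -> MongoDB was created in 2007
```

This is only meant to show the data flow between the three steps, not to suggest that regexes are adequate for either sub-task.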
Please let me know whether this is feasible.