Extractive question answering vs. what?


In Chapter 7, "Question answering", there is the following statement: "Time to look at question answering! This task comes in many flavors, but the one we'll focus on in this section is called extractive question answering."

I'm guessing that the alternative is inferential (or generative) question answering, where the result is not necessarily a span defined by start and end tokens, because the answer might not appear in the text at all. For example, calculating the total cost of apples and oranges when only the individual costs are given in the text.
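To make the distinction concrete, here is a toy sketch in plain Python (no model involved, just an illustration of the apples-and-oranges example): an extractive answer is a span of the context, while the inferential answer has to be computed and need not appear in the text at all.

```python
context = "Apples cost 2 dollars each and oranges cost 3 dollars each."

# Extractive QA: the answer is a span of the context, so a model only
# has to predict start and end positions.
def extract_span(context: str, answer: str) -> tuple[int, int]:
    """Return (start, end) character offsets of an answer found verbatim in the context."""
    start = context.index(answer)
    return start, start + len(answer)

start, end = extract_span(context, "2 dollars")
assert context[start:end] == "2 dollars"

# Inferential QA: "What do one apple and one orange cost together?"
# The answer (5) never appears in the text, so no span can point to it;
# it must be computed (or generated) from facts stated in the text.
apple_price, orange_price = 2, 3    # values stated in the context
total = apple_price + orange_price  # 5 is not a substring of the context
assert str(total) not in context
```

The point of the sketch is just that the second answer cannot be produced by any choice of start/end positions, which is why a pure span-prediction head cannot handle it.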

I was wondering where I should look if I want to investigate inferential question answering - causal language modeling seems the closest fit, though summarization or conversational models might also apply. Are there any existing models that do inferential question answering?