Reuse context for multiple questions in question answering

I want to reuse the context in my QA system: I want to answer multiple questions about the same context and avoid loading the context for every answer.
I'm trying to use the code below. Is there a way to reuse the context, i.e. to load the context only once?

from transformers import pipeline

# Load the Italian extractive QA pipeline (model and tokenizer are loaded once here)
nlp_qa = pipeline(
    'question-answering',
    model='mrm8488/bert-italian-finedtuned-squadv1-it-alfa',
    tokenizer='mrm8488/bert-italian-finedtuned-squadv1-it-alfa'
)

# Ask one question; question and context are passed together in a single call
nlp_qa(
    {
        'question': 'Per quale lingua stai lavorando?',
        'context': 'Manuel Romero è colaborando attivamente con HF / trasformatori per il trader del poder de las últimas '
                   'técnicas di procesamiento de lenguaje natural al idioma español'
    }
)

With most implementations, this is not possible. These models concatenate the question and the context into a single input sequence, so the two attend to each other from the first layer onward; the encoded context therefore depends on the question and cannot be computed once and reused.
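
To see the joint encoding, here is a minimal sketch using the standard AutoTokenizer API with the model from your code; the question-answering pipeline builds its input in essentially the same way internally (plus extra handling for long contexts):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('mrm8488/bert-italian-finedtuned-squadv1-it-alfa')

question = 'Per quale lingua stai lavorando?'
context = ('Manuel Romero è colaborando attivamente con HF / trasformatori per il trader del poder de las últimas '
           'técnicas di procesamiento de lenguaje natural al idioma español')

# Question and context are encoded as one joint sequence:
# [CLS] question tokens [SEP] context tokens [SEP]
inputs = tokenizer(question, context, return_tensors='pt')
print(tokenizer.decode(inputs['input_ids'][0]))

Because the context tokens sit in the same sequence as the question tokens, the model has to run a fresh forward pass over both for every new question.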

There may be an implementation that only performs question/context attention as a final step, which would allow the context encoding to be cached, but I am not aware of one.
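
What you can reuse is the loaded pipeline object and the context string themselves: the model weights are loaded once, and each call only repeats the forward pass, which re-encodes the context together with the new question. A minimal sketch of that pattern (the second question is just a hypothetical example):

from transformers import pipeline

nlp_qa = pipeline(
    'question-answering',
    model='mrm8488/bert-italian-finedtuned-squadv1-it-alfa',
    tokenizer='mrm8488/bert-italian-finedtuned-squadv1-it-alfa'
)

context = ('Manuel Romero è colaborando attivamente con HF / trasformatori per il trader del poder de las últimas '
           'técnicas di procesamiento de lenguaje natural al idioma español')

questions = [
    'Per quale lingua stai lavorando?',
    'Chi sta collaborando con HF / trasformatori?',  # hypothetical extra question
]

# The pipeline (model + tokenizer) is created once; each call below still
# re-encodes question + context together in a full forward pass.
for question in questions:
    print(nlp_qa(question=question, context=context))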