Storing context for a question-answering model deployed on Azure

Hello everyone,

I’ve deployed a question-answering model on Azure, specifically deepset-roberta-base-squad2, and I have a list of personalized questions and answers that I want to use as context. According to the documentation, when the API is consumed, the context is passed along with the input question. Is there a way to store the context with the model on Azure so that only the question has to be passed to the endpoint?
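To illustrate what I mean, here is a minimal sketch of the kind of server-side wrapper I have in mind: the context is loaded once when the service starts, and callers send only the question. `StoredContextQA` and `fake_qa` are hypothetical names; `qa_fn` stands in for the real deployed QA pipeline, which I assume takes `question` and `context` keyword arguments.

```python
# Sketch: a wrapper that pins the context server-side, so clients
# only ever send the question. `qa_fn` is a placeholder for the
# deployed QA model (hypothetical interface).

class StoredContextQA:
    def __init__(self, qa_fn, context: str):
        self.qa_fn = qa_fn      # e.g. the deployed QA pipeline
        self.context = context  # personalized Q&A text, loaded at startup

    def answer(self, question: str):
        # Inject the stored context so the caller never has to send it.
        return self.qa_fn(question=question, context=self.context)


# Stub in place of the real model, just to show the call shape.
def fake_qa(question, context):
    return {"answer": context.split(". ")[0]}

faq = "Support hours are 9am-5pm. Refunds take 7 days."
service = StoredContextQA(fake_qa, faq)
print(service.answer("When is support open?"))  # only the question is passed
```

Is something like this possible with the Azure endpoint itself, or does it require wrapping the deployment in custom scoring code?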