Is there a way to convert the Donut model to OpenVINO format?

When running inference with the Donut model on my local system, it takes at least 10 seconds per document, but in my use case it needs to finish in under a second. If there is any other approach besides OpenVINO inference that can get me there, please let me know that as well.
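For reference, this is roughly the kind of conversion I am hoping exists. It is an untested sketch that assumes optimum-intel exposes an `OVModelForVision2Seq` class for VisionEncoderDecoder models like Donut, and the checkpoint, image path, and prompt are just stand-ins:

```python
# Untested sketch: export Donut (a VisionEncoderDecoder model) to OpenVINO IR
# via optimum-intel, assuming the installed version supports OVModelForVision2Seq.
# pip install optimum[openvino]
from optimum.intel import OVModelForVision2Seq
from transformers import DonutProcessor
from PIL import Image

model_id = "naver-clova-ix/donut-base-finetuned-docvqa"  # stand-in checkpoint

processor = DonutProcessor.from_pretrained(model_id)
# export=True would convert the PyTorch weights to OpenVINO IR on the fly
ov_model = OVModelForVision2Seq.from_pretrained(model_id, export=True)
ov_model.save_pretrained("donut-openvino")  # save the IR for later reuse

# Quick sanity-check inference on one document image
image = Image.open("sample_document.png").convert("RGB")
pixel_values = processor(image, return_tensors="pt").pixel_values
task_prompt = "<s_docvqa><s_question>What is the invoice number?</s_question><s_answer>"
decoder_input_ids = processor.tokenizer(
    task_prompt, add_special_tokens=False, return_tensors="pt"
).input_ids

outputs = ov_model.generate(
    pixel_values,
    decoder_input_ids=decoder_input_ids,
    max_length=128,
)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```

If this (or something like it) is the right path, does it actually bring latency down to the sub-second range, or is additional work such as quantization needed?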