Hi all,
I used the code below to export the mBART-50 model (MBartForConditionalGeneration) to ONNX format:
import torch
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
model_path = "./mbart_large_50_model"
model = MBartForConditionalGeneration.from_pretrained(model_path)
tokenizer = MBart50TokenizerFast.from_pretrained(model_path)
input_text = "This is just simple text"
inputs = tokenizer(input_text, return_tensors="pt")
onnx_path = "./mbart_large_50.onnx"
# Export to ONNX format:
torch.onnx.export(
    model,                                   # PyTorch model
    (inputs["input_ids"],),                  # example input (input_ids only)
    onnx_path,                               # path for the resulting ONNX file
    input_names=["input_ids"],               # input tensor names
    output_names=["logits"],                 # output tensor names
    dynamic_axes={"input_ids": {0: "batch", 1: "sequence"}},  # allow variable batch/sequence length
    opset_version=14,
)
This produces mbart_large_50.onnx as expected, but I’m completely stuck on how to actually use the exported file to translate between two languages. Any suggestions would be appreciated.
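For context, this is as far as I got on the inference side: a minimal sketch that loads the exported file with onnxruntime and runs a single forward pass (the variable names are just my own). It only gives me raw logits back, not a translated sentence:

import onnxruntime as ort
from transformers import MBart50TokenizerFast

# Same tokenizer and test sentence as in the export script
tokenizer = MBart50TokenizerFast.from_pretrained("./mbart_large_50_model")
inputs = tokenizer("This is just simple text", return_tensors="np")

# Load the exported graph and run one forward pass
session = ort.InferenceSession("./mbart_large_50.onnx", providers=["CPUExecutionProvider"])
(logits,) = session.run(["logits"], {"input_ids": inputs["input_ids"]})

print(logits.shape)  # (batch, sequence, vocab_size): raw scores, not translated text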