I needed English translations of some Odia texts. First I used the googletrans library, and it worked fine, managing roughly 7-8 translations per second. Then I tried Facebook's distilled 600M NLLB model, but it took about 10 seconds per translation. I ran both on Google Colab. Is this expected, or is something wrong with my setup?
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline

# Load the distilled 600M NLLB model and build an Odia -> English translator
tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-200-distilled-600M")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-200-distilled-600M")
translator_nllb = pipeline("translation", model=model, tokenizer=tokenizer,
                           src_lang="ory_Orya", tgt_lang="eng_Latn", max_length=400)

# `text` holds the source string to translate
translated_text_nllb = translator_nllb(text)[0]["translation_text"]
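For reference, the throughput numbers above can be measured with a small helper like this (time_per_call is my own name, not part of either library; pass in whichever translation call you want to benchmark):

```python
import time

def time_per_call(translate_fn, texts):
    """Return average seconds per call of translate_fn over texts."""
    start = time.perf_counter()
    for t in texts:
        translate_fn(t)
    return (time.perf_counter() - start) / len(texts)

# Example with a dummy function; swap in the googletrans or NLLB call:
avg = time_per_call(lambda t: t.upper(), ["a", "b", "c"])
```

With the NLLB pipeline you would pass something like `lambda t: translator_nllb(t)[0]["translation_text"]` instead of the dummy lambda.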