Hello everyone, I've been trying to fine-tune AraT5 for Arabic-to-English MT, but I couldn't get good results at all. Here's the code (I know the number of epochs is too high; I tried many values and changed many hyperparameters before, but nothing really changed). I hope someone can help.

Version without LoRA: arat5-base arabic2english mt | Kaggle
Version with LoRA: arat5-base-lora arabic2english MT | Kaggle