I want to add L1 regularization on some weights in the transformer using the Trainer.
Is it possible to do so? Or would I have to mess with deeper layers of the API?
Yes, you can override the Trainer's loss computation: Finetuning BART using custom loss - #2 by lewtun
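To make that concrete, here is a minimal sketch of subclassing `Trainer` and overriding `compute_loss` to add an L1 penalty on a subset of parameters. The name filter `"attention"` and the strength `l1_lambda` are placeholder assumptions you would swap for your own choices:

```python
import torch
from transformers import Trainer


def l1_penalty(model, name_filter, l1_lambda):
    # Sum of absolute values over every parameter whose name contains the filter.
    return l1_lambda * sum(
        p.abs().sum() for n, p in model.named_parameters() if name_filter in n
    )


class L1Trainer(Trainer):
    # Hypothetical defaults: regularize any parameter with "attention" in its name.
    def __init__(self, *args, name_filter="attention", l1_lambda=0.01, **kwargs):
        super().__init__(*args, **kwargs)
        self.name_filter = name_filter
        self.l1_lambda = l1_lambda

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        outputs = model(**inputs)
        # Add the L1 term on top of the model's own task loss.
        loss = outputs.loss + l1_penalty(model, self.name_filter, self.l1_lambda)
        return (loss, outputs) if return_outputs else loss
```

You would then construct `L1Trainer` exactly as you would a normal `Trainer`, passing `name_filter` and `l1_lambda` as extra keyword arguments; everything else (training loop, logging, checkpointing) is inherited unchanged.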