I very much liked your post “Hugging Face Transformers BERT fine-tuning using Amazon SageMaker and Training Compiler” (and its promise of saving time and money when training NLP models), and I tried the same approach with a T5 model.
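For reference, this is roughly how I set up the training job. It is a minimal sketch: the entry point, role ARN, instance settings, and hyperparameters are placeholders, and the framework versions are just the combination I happened to use.

```python
from sagemaker.huggingface import HuggingFace, TrainingCompilerConfig

# Placeholder values -- role, script, and hyperparameters are illustrative only.
estimator = HuggingFace(
    entry_point="train.py",          # my fine-tuning script, adapted from BERT to T5
    role="arn:aws:iam::123456789012:role/SageMakerRole",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    transformers_version="4.21",
    pytorch_version="1.11",
    py_version="py38",
    compiler_config=TrainingCompilerConfig(),  # enables SageMaker Training Compiler
    hyperparameters={"model_name_or_path": "t5-base", "epochs": 3},
)
estimator.fit()
```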
However, I got the following error:
“The training job fails due to a missing XLA configuration”
I found an AWS article, “Training Job Fails Due to Missing XLA Configuration,” but following its suggestions did not fix the problem.
Do you have any suggestions? Thank you.