Hi @philschmid,
I really liked your post “Hugging Face Transformers BERT fine-tuning using Amazon SageMaker and Training Compiler” (and its promise of saving time and money when training NLP models), and I tried it with a T5 model.
However, I got the following error: the training job fails due to a missing XLA configuration.
I found an AWS article, “Training Job Fails Due to Missing XLA Configuration”, but it did not fix the problem.
Do you have any suggestions? Thank you.
Hey @pierreguillou,
Could you share how you created your Training Job (Python estimator + hyperparameters) and the training script you use?
In addition, I am not sure T5 is well supported yet. As mentioned in my blog post:
“The Amazon Training Compiler works best with Encoder Type models, like BERT, RoBERTa, ALBERT, DistilBERT.”
In the tests I ran a couple of weeks back, T5 performed worse with the compiler than with the default training.
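For reference, here is a minimal sketch of how the Training Job from the blog post is set up with the SageMaker Python SDK. The script name, role, S3 paths, versions, and hyperparameters are placeholders from the BERT example, not your setup, so adjust them accordingly:

```python
# Sketch of a Training Job with the SageMaker Training Compiler enabled,
# following the blog post setup (script name, role, S3 paths and
# hyperparameters are placeholders).
from sagemaker.huggingface import HuggingFace, TrainingCompilerConfig

hyperparameters = {
    "model_name_or_path": "bert-base-uncased",  # placeholder model id
    "epochs": 3,
    "train_batch_size": 24,
}

huggingface_estimator = HuggingFace(
    entry_point="train.py",                    # your training script
    source_dir="./scripts",                    # directory containing the script
    instance_type="ml.p3.2xlarge",             # compiler-supported GPU instance
    instance_count=1,
    role="<your-sagemaker-execution-role>",    # IAM role placeholder
    transformers_version="4.11.0",
    pytorch_version="1.9.0",
    py_version="py38",
    compiler_config=TrainingCompilerConfig(),  # enables the Training Compiler
    hyperparameters=hyperparameters,
)

# placeholder S3 channels for the processed dataset
huggingface_estimator.fit({"train": "s3://<bucket>/train", "test": "s3://<bucket>/test"})
```

Seeing your version of this (plus the training script) would make it easier to spot what is different for T5.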
Thanks for your answer, @philschmid. Do you plan to work on support for encoder-decoder models like T5?