I’ve recently encountered an unexpected warning related to use_reentrant
while working with the `TrainingArguments` class. According to my understanding and the documentation, this warning should not typically surface, especially on recent versions of the Transformers library. However, I consistently see it in my environment, which runs Transformers 4.39.3 and PyTorch 2.3.
To work around the issue, I am passing `gradient_checkpointing_kwargs={'use_reentrant': True}` explicitly.
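For context, here is a minimal sketch of how that workaround looks in a `TrainingArguments` setup (the `output_dir` value is a placeholder, and the other hyperparameters are illustrative assumptions, not taken from my actual run):

```python
from transformers import TrainingArguments

# Passing use_reentrant explicitly silences the PyTorch warning,
# which is emitted when torch.utils.checkpoint is called without it.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    per_device_train_batch_size=8,   # illustrative value
    gradient_checkpointing=True,
    gradient_checkpointing_kwargs={"use_reentrant": True},
)
```

Note that PyTorch's own warning suggests the non-reentrant variant (`use_reentrant=False`) as the recommended direction going forward, so setting `True` here simply preserves the older behavior while making the choice explicit.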