If I choose INDUCTOR as the dynamo backend (in fact, this is the default config on my machine), it reports the error No module named 'torch._dynamo' when executing the following code:
model, optimizer, data = accelerator.prepare(model, optimizer, data)
I don't know how to solve this problem; it doesn't seem to be caused by a missing module. I also found that the code inside the accelerator.prepare() method that triggers the error is the following:
def prepare_model(self, model: torch.nn.Module, device_placement=None):
    """
    Prepares a PyTorch model for training in any distributed setup. It is recommended to use
    [`Accelerator.prepare`] instead.

    Args:
        model (`torch.nn.Module`):
            A PyTorch model to prepare. You don't need to prepare a model if it is used only for inference without
            any kind of mixed precision
        device_placement (`bool`, *optional*):
            Whether or not to place the model on the proper device. Will default to `self.device_placement`.
    """
    if device_placement is None:
        device_placement = self.device_placement and self.distributed_type != DistributedType.FSDP
    self._models.append(model)
    if device_placement:
        model = model.to(self.device)
    if self.state.dynamo_backend != DynamoBackend.NO:
        import torch._dynamo as dynamo  # <- No module named 'torch._dynamo'
I use Python 3.8, PyTorch 1.12.1, and accelerate 0.15.0.
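Since the traceback points at the import statement itself, one quick sanity check (this probe is my own addition, not part of accelerate) is to ask whether the installed torch build actually ships a `_dynamo` submodule at all:

```python
import importlib.util

# Probe for torch._dynamo without crashing when torch itself is absent.
try:
    # find_spec imports the parent package ("torch") in order to
    # locate the submodule, so guard against torch being missing too.
    spec = importlib.util.find_spec("torch._dynamo")
    available = spec is not None
except ModuleNotFoundError:
    available = False

print("torch._dynamo available:", available)
```

If this prints False, the installed torch simply does not contain the module that accelerate's dynamo path tries to import.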