Using loaded model with accelerate for inference

Thanks for the reply.
I used this snippet for loading the checkpoint, and it fails with the traceback below:

model = load_checkpoint_and_dispatch(model, checkpoint)
Traceback (most recent call last):
  File "/home/ubuntu/trans_test.py", line 13, in <module>
    model = load_checkpoint_and_dispatch(model, checkpoint)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 355, in load_checkpoint_and_dispatch
    if offload_state_dict is None and "disk" in device_map.values():
AttributeError: 'NoneType' object has no attribute 'values'

I know it’s a noobish question, but can you guide me a little bit on how to run the model on the CPU? :slightly_smiling_face:
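
Would something like this be the right direction for CPU-only use? It's just my guess from the docs (the "gpt2" config, the model class, and the checkpoint path are placeholders for my actual setup); the idea is that load_checkpoint_and_dispatch wants an explicit device_map, and {"": "cpu"} is supposed to map the whole model to the CPU:

from accelerate import init_empty_weights, load_checkpoint_and_dispatch
from transformers import AutoConfig, AutoModelForCausalLM

checkpoint = "path/to/checkpoint"  # placeholder for my local checkpoint folder

# Build the model skeleton on the meta device, without allocating real weights
config = AutoConfig.from_pretrained("gpt2")  # placeholder config for my model
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

# Passing an explicit device_map avoids the NoneType error above;
# {"": "cpu"} should place every module on the CPU
model = load_checkpoint_and_dispatch(model, checkpoint, device_map={"": "cpu"})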