Loading PEFT model from checkpoint leads to size mismatch

Hi,

A reply from @smangrul:

Were you resizing the embedding layers before? In the merge-and-unload code too, you need to resize the embedding layers before loading the PEFT adapters. Whenever you resize the embedding layers, they get saved along with the adapters, because the new tokens were initialised randomly and the adapters are tuned with respect to them.
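
To illustrate the ordering, here is a minimal sketch of loading an adapter on top of a resized base model. The model name and adapter path are hypothetical placeholders; the key point is that `resize_token_embeddings` must run before `PeftModel.from_pretrained`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "meta-llama/Llama-2-7b-hf"  # hypothetical base model
adapter_path = "path/to/adapter_checkpoint"   # hypothetical adapter checkpoint

# Load the tokenizer saved with the adapter, so it includes any tokens
# that were added during fine-tuning.
tokenizer = AutoTokenizer.from_pretrained(adapter_path)

# Load the base model, then resize its embedding layers to match the
# tokenizer BEFORE loading the PEFT adapters; otherwise the embedding
# weights saved with the adapter won't match the base model's shape and
# loading fails with a size mismatch.
model = AutoModelForCausalLM.from_pretrained(base_model_name)
model.resize_token_embeddings(len(tokenizer))

# Now the adapter (which carries the resized embedding weights) loads cleanly.
model = PeftModel.from_pretrained(model, adapter_path)

# The same rule applies when merging: resize first, then merge and unload.
merged_model = model.merge_and_unload()
```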