Found what’s wrong: model.save_pretrained somehow automatically freezes all LoRA parameters and only keeps the last one or two fully connected classifier head layers trainable (I didn’t see this behaviour documented anywhere), so we just need to set all LoRA parameters trainable again with:
# re-enable gradients for every LoRA parameter
for name, param in model.named_parameters():
    if 'lora' in name:
        param.requires_grad = True
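
For anyone hitting the same thing, a quick sanity check after the loop (just a sketch, assuming a standard PyTorch model where the LoRA weights have 'lora' in their parameter names) is to count what is actually trainable:

# count trainable vs. total parameters to confirm the LoRA weights are unfrozen again
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable params: {trainable} / {total} ({100 * trainable / total:.2f}%)")

If the trainable count only covers the classifier head, the LoRA parameters are still frozen and the loop above didn’t match their names.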