Saving a fine-tuned model for the Vertex AI Model Registry?

Hi all, newbie here!

I have a fine-tuned Llama 2 model using PEFT (LoRA adapter). I want to register it in the Vertex AI Model Registry. According to the Vertex AI documentation, the model should be in one of the formats defined here: https://cloud.google.com/vertex-ai/docs/training/exporting-model-artifacts (TensorFlow, PyTorch, etc.).
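For context, this is roughly how I expect to register the exported artifacts afterwards. It is only a sketch assuming the google-cloud-aiplatform SDK; the project, bucket, and serving-container URI are placeholders, not values I have verified:

from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

# Register the exported model folder (copied to Cloud Storage) in the Model Registry
model = aiplatform.Model.upload(
    display_name="llama2-finetuned",
    artifact_uri="gs://my-bucket/merged_model",  # folder produced by save_pretrained
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/pytorch-gpu.2-0:latest",  # placeholder prebuilt container
)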

The code snippet for saving the model is as follows:

import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Load the fine-tuned model with its LoRA adapter attached
new_model = AutoPeftModelForCausalLM.from_pretrained(
    args.output_dir,
    low_cpu_mem_usage=True,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map=device_map,
)
tokenizer = AutoTokenizer.from_pretrained(args.output_dir)

# Merge the LoRA weights into the base model
merged_model = new_model.merge_and_unload()

# Save the merged model and tokenizer (safetensors serialization)
merged_model.save_pretrained("merged_model", safe_serialization=True)
tokenizer.save_pretrained("merged_model")

The model folder has this structure:

I assume this is the safetensors format.
My question is: how do I save the model in PyTorch or TensorFlow format so that I can import it into the Vertex AI Model Registry?
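The only workaround I have come across so far is to re-save the merged model without safetensors, which (if I understand save_pretrained correctly) writes a pytorch_model.bin instead, but I am not sure whether that is what Vertex AI actually expects:

# Re-save with classic PyTorch serialization instead of safetensors
# (unverified whether this is the layout the Vertex AI prebuilt containers want)
merged_model.save_pretrained("merged_model_pt", safe_serialization=False)
tokenizer.save_pretrained("merged_model_pt")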

Hey, did you resolve the issue?