How to save model in S3 with Trainer?

When I do
Trainer.save_model('s3://path_to_folder')

I get no error, and the following messages are logged:
Saving model checkpoint to s3://path_to_folder
Configuration saved in s3://path_to_folder/config.json
Model weights saved in s3://path_to_folder/pytorch_model.bin
tokenizer config file saved in s3://path_to_folder/tokenizer_config.json
Special tokens file saved in s3://path_to_folder/special_tokens_map.json

But the corresponding folder in S3 is empty.
What is the correct way to save a model to S3 with Trainer.save_model?


Hey @AndriiP, did you figure this out? I have a similar issue. I'm running from a SageMaker notebook instance, and it seems to be saving the files per the printed log, but the S3 bucket is empty.

Thank you

Hi tyatabe,

I didn't find a proper solution. As far as I can tell, Trainer.save_model only writes to the local filesystem, so passing an s3:// path just creates a local directory with that name instead of uploading anything to the bucket.

My workaround: I zip the checkpoint folder and upload the archive to S3; when needed, I download, unzip, and load it.
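For anyone else landing here, a minimal sketch of that zip-and-upload workaround (bucket and key names below are placeholders; the upload step assumes boto3 is installed and AWS credentials are configured):

```python
# Sketch: save the model locally, zip it, upload the archive to S3.
import shutil


def zip_checkpoint(checkpoint_dir: str, archive_base: str) -> str:
    """Zip a local checkpoint folder; returns the path to the created .zip."""
    return shutil.make_archive(archive_base, "zip", root_dir=checkpoint_dir)


def upload_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload the archive to S3 (requires boto3 and AWS credentials)."""
    import boto3  # imported here so the zip step runs without AWS set up

    boto3.client("s3").upload_file(local_path, bucket, key)


# Usage (placeholders):
# trainer.save_model("./my-model")                     # save locally first
# archive = zip_checkpoint("./my-model", "./my-model") # -> ./my-model.zip
# upload_to_s3(archive, "my-bucket", "models/my-model.zip")
```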

Thank you @AndriiP, I ended up doing something along the same lines. I found comfort in knowing I wasn't the only one with this issue 🙂

I believe you could use a TrainerCallback to upload the checkpoint to S3 in its on_save hook.
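Something along these lines, as a rough sketch. The bucket and prefix are placeholders, and it assumes boto3 is installed with AWS credentials configured; it relies on Trainer naming checkpoint folders `checkpoint-<global_step>`:

```python
# Sketch: a TrainerCallback that uploads each checkpoint folder to S3
# right after Trainer writes it locally.
import os

from transformers import TrainerCallback


class S3CheckpointCallback(TrainerCallback):
    def __init__(self, bucket: str, prefix: str):
        self.bucket = bucket
        self.prefix = prefix.strip("/")

    def on_save(self, args, state, control, **kwargs):
        import boto3  # imported here so the class loads without AWS set up

        # Trainer writes checkpoints to output_dir/checkpoint-<global_step>
        ckpt_dir = os.path.join(args.output_dir, f"checkpoint-{state.global_step}")
        s3 = boto3.client("s3")
        for root, _dirs, files in os.walk(ckpt_dir):
            for name in files:
                local_path = os.path.join(root, name)
                rel = os.path.relpath(local_path, args.output_dir)
                s3.upload_file(local_path, self.bucket, f"{self.prefix}/{rel}")


# Usage (placeholder names):
# trainer = Trainer(..., callbacks=[S3CheckpointCallback("my-bucket", "models")])
```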

To train and use models on SageMaker, you can use a SageMaker Training Job: it trains the model and uploads a model.tar.gz archive to S3 for you. See here for more:
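As a rough sketch of that approach (the role ARN, entry-point script, instance type, and framework versions below are all placeholders that depend on your account and setup; requires the sagemaker SDK):

```python
# Sketch: launch a SageMaker training job via the Hugging Face estimator.
# On completion, SageMaker uploads model.tar.gz to S3 automatically.
def launch_training_job(role_arn: str):
    # sagemaker imported inside so this file loads without the SDK installed
    from sagemaker.huggingface import HuggingFace

    estimator = HuggingFace(
        entry_point="train.py",        # your training script using Trainer
        instance_type="ml.p3.2xlarge", # placeholder instance type
        instance_count=1,
        role=role_arn,
        transformers_version="4.26",   # placeholder versions; pick a
        pytorch_version="1.13",        # supported combination for your setup
        py_version="py39",
    )
    estimator.fit()
    return estimator.model_data  # S3 URI of the uploaded model archive
```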