How to push trained models and save them privately on the Hub?

Hi, I have already trained a few models and now I want to push them all to the Hub but keep them private.

1 Like

Hi,

Pushing a model to a private repo on the Hub is as easy as model.push_to_hub(private=True)
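
For example, here is a minimal sketch; the checkpoint path and repo id are placeholders you would replace with your own:

    from transformers import AutoModelForSequenceClassification

    # Reload a trained checkpoint; the path below is a placeholder.
    model = AutoModelForSequenceClassification.from_pretrained("path/to/my-checkpoint")

    # Creates the repo as private (if it doesn't exist yet) and uploads the weights.
    model.push_to_hub("your-username/my-private-model", private=True)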

1 Like

I have used the same line for pushing the model and tokenizer, but how do I push the training args?

1 Like

same line for pushing model
+1

Sure, here are the docs: Models
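
If it helps, here is a rough sketch of pushing the tokenizer alongside the model to the same private repo (the repo id and paths are placeholders). Note that if you push through the Trainer instead, the training arguments are included automatically, since trainer.save_model() writes them to training_args.bin in the output directory:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "your-username/my-private-model"  # placeholder repo id

    model = AutoModelForCausalLM.from_pretrained("path/to/my-checkpoint")
    tokenizer = AutoTokenizer.from_pretrained("path/to/my-checkpoint")

    # Both calls target the same repo; private=True only matters when the repo is first created.
    model.push_to_hub(repo_id, private=True)
    tokenizer.push_to_hub(repo_id, private=True)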

1 Like

When I try trainer.push_to_hub(private=True), I receive the error:

    trainer.push_to_hub(private=True)
  File "/lfs/ampere8/0/rschaef/miniconda3/envs/reward_modeling_env/lib/python3.10/site-packages/transformers/trainer.py", line 4301, in push_to_hub
    self.create_model_card(model_name=model_name, **kwargs)
TypeError: Trainer.create_model_card() got an unexpected keyword argument 'private'
[rank0]: Traceback (most recent call last):
[rank0]:   File "/lfs/ampere8/0/rschaef/reward_model_switching_mitigates_overoptimization/src/sft/trainer_sft.py", line 464, in <module>
[rank0]:     main()
[rank0]:   File "/lfs/ampere8/0/rschaef/reward_model_switching_mitigates_overoptimization/src/sft/trainer_sft.py", line 459, in main
[rank0]:     trainer.push_to_hub(private=True)
[rank0]:   File "/lfs/ampere8/0/rschaef/miniconda3/envs/reward_modeling_env/lib/python3.10/site-packages/transformers/trainer.py", line 4301, in push_to_hub
[rank0]:     self.create_model_card(model_name=model_name, **kwargs)
[rank0]: TypeError: Trainer.create_model_card() got an unexpected keyword argument 'private'

Hi,

What’s your Transformers version? Perhaps it is outdated.

I hit this problem again. Version ‘4.44.0’

Looking at the method signature, it looks like this method indeed doesn’t support the private keyword argument.

Edit: see answer below 🙂

1 Like

You should provide hub_private_repo=True in TrainingArguments; see here: Trainer.model.push_to_hub() does not allow a private repository flag · Issue #32909 · huggingface/transformers · GitHub
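
A minimal sketch of that approach (assuming you already have a model and train_dataset defined; the output and repo names are placeholders):

    from transformers import Trainer, TrainingArguments

    args = TrainingArguments(
        output_dir="my-private-model",                  # local output dir, also the default repo name
        push_to_hub=True,
        hub_model_id="your-username/my-private-model",  # placeholder repo id
        hub_private_repo=True,                          # the repo is created as a private one
        # ... your usual training arguments ...
    )

    trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
    trainer.train()
    trainer.push_to_hub()  # no private kwarg needed; the repo was already created private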

I’m confused by that GitHub thread for two reasons:

  1. It seems like TrainingArguments’s hub_private_repo=True only applies to the creation of the repo?

  2. Regardless, I think it’d be good to be able to call trainer.push_to_hub(private=True) without needing to know to set it in TrainingArguments.

Should I still open an issue and maybe work on a PR?

@RylanSchaeffer
I have created an issue at add `private` parameter to trainer.push_to_hub · Issue #33492 · huggingface/transformers · GitHub to add this feature to the transformers library.

Although @nicoberk’s method does work, as shown in the source code here, and I highly recommend using it for now, I also agree with you that we should have unified parameters in the push_to_hub method.

1 Like