How to convert a T5X model into a PyTorch model

Hello Hugging Face community!

I currently have a pre-trained T5X model that I would like to convert into a PyTorch model for use with the Transformers library. To start, I tried the conversion script that ships with the library at the following path: transformers/src/transformers/models/t5/convert_t5x_checkpoint_to_pytorch.py.
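
In case it is relevant, my understanding from skimming that file is that it mostly wraps a single conversion function, so it should also be callable directly from Python, roughly like below. The function and argument names are just my reading of the current script and its CLI flags, so please treat them as assumptions:

# Rough sketch of calling the converter from Python instead of the CLI.
# Assumes the t5x package (and its JAX dependencies) is installed, since the
# script loads the checkpoint via t5x; argument names mirror the CLI flags.
from transformers.models.t5.convert_t5x_checkpoint_to_pytorch import (
    convert_t5x_checkpoint_to_pytorch,
)

convert_t5x_checkpoint_to_pytorch(
    t5x_checkpoint_path="gs://<my-bucket>/<t5x_checkpoint_dir>",  # placeholder path
    config_file="/content/config/config.json",
    pytorch_dump_path="/content/pytorch",
    is_encoder_only=False,  # flag exposed by the script for encoder-only checkpoints
)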

However, when I run the script in Colab, all it seems to do is print out the configuration from the config.json file and then hang; I eventually have to interrupt it (the ^C at the end of the output below). The model does not appear to get converted or saved anywhere. Any help would be appreciated.

Here is the command I am running (with the bucket path redacted):

!python3 /usr/local/lib/python3.10/dist-packages/transformers/models/t5/convert_t5x_checkpoint_to_pytorch.py --t5x_checkpoint_path="<gcloud bucket path>" --config_file=/content/config/config.json --pytorch_dump_path=/content/pytorch

And here is the output:

Building PyTorch model from configuration: T5Config {
  "architectures": [
    "T5WithLMHeadModel"
  ],
  "d_ff": 65536,
  "d_kv": 128,
  "d_model": 1024,
  "decoder_start_token_id": 0,
  "dense_act_fn": "relu",
  "dropout_rate": 0.1,
  "eos_token_id": 1,
  "feed_forward_proj": "relu",
  "initializer_factor": 1.0,
  "is_encoder_decoder": true,
  "is_gated_act": false,
  "layer_norm_epsilon": 1e-06,
  "model_type": "t5",
  "n_positions": 512,
  "num_decoder_layers": 24,
  "num_heads": 128,
  "num_layers": 24,
  "output_past": true,
  "pad_token_id": 0,
  "relative_attention_max_distance": 128,
  "relative_attention_num_buckets": 32,
  "task_specific_params": {
    "summarization": {
      "early_stopping": true,
      "length_penalty": 2.0,
      "max_length": 200,
      "min_length": 30,
      "no_repeat_ngram_size": 3,
      "num_beams": 4,
      "prefix": "summarize: "
    },
    "translation_en_to_de": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to German: "
    },
    "translation_en_to_fr": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to French: "
    },
    "translation_en_to_ro": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to Romanian: "
    }
  },
  "transformers_version": "4.30.1",
  "use_cache": true,
  "vocab_size": 32128
}

Generate config GenerationConfig {
  "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.30.1"
}

^C
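
For reference, what I expect to end up with is a folder at /content/pytorch containing the converted weights and config that I can load with the usual Transformers API. This is a minimal sketch of the check I plan to run once the conversion actually writes files; "t5-base" is only a stand-in for whichever tokenizer matches my checkpoint:

# Load the converted dump folder back as a T5 model and run a quick generation.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained("/content/pytorch")

# The converter only produces model weights + config, so the tokenizer has to
# come from elsewhere; "t5-base" here is just a placeholder.
tokenizer = T5Tokenizer.from_pretrained("t5-base")

inputs = tokenizer("translate English to German: Hello world", return_tensors="pt")
outputs = model.generate(**inputs, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))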