How to convert a Hugging Face .h5 model to an original tf-checkpoint?

I’m working with T5/mT5 models and I want to deploy T5 on tf-serving, which uses a saved_model for serving.
I found that the saved_model Hugging Face exports is different from the original tf-t5 saved_model.

  1. I use
    `TFT5ForConditionalGeneration.save_pretrained(path-to-save, saved_model=True)`
    to save a saved_model with Hugging Face (the full snippet I run is shown after this list).
    The saved_model details are shown as:

  2. I use the original T5 export.sh to export a saved_model:
    see the image labeled fig1 below (as a new user I can only post one image inline)
    The saved_model details are shown as:
    see the image labeled fig2 below
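
For reference, this is roughly the snippet behind step 1 (the paths are placeholders, not my real ones):

```python
from transformers import TFT5ForConditionalGeneration

# load my fine-tuned TF weights (the .h5 file previously written by save_pretrained)
model = TFT5ForConditionalGeneration.from_pretrained("path-to-h5-model")

# saved_model=True additionally writes a TF SavedModel
# under path-to-save/saved_model/1, next to the .h5 file
model.save_pretrained("path-to-save", saved_model=True)
```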

So, how can I make the Hugging Face transformers library export a saved_model like the original T5 one?
Or, how can I convert a Hugging Face .h5 model to an original tf-checkpoint?
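
For the checkpoint question, the closest I have gotten is a TF2 object-based checkpoint. This is only a sketch of what I tried (placeholder paths), and the variable names it writes do not match the original T5 / mesh-tensorflow checkpoint layout, which is exactly the gap I’m asking about:

```python
import tensorflow as tf
from transformers import TFT5ForConditionalGeneration

# load the Hugging Face TF model from my .h5 weights
model = TFT5ForConditionalGeneration.from_pretrained("path-to-h5-model")

# write a TF2 object-based checkpoint (ckpt-1.index / ckpt-1.data-*),
# but NOT in the original mesh-tensorflow variable naming scheme
ckpt = tf.train.Checkpoint(model=model)
ckpt.save("path-to-ckpt/ckpt")
```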

Can someone help, please? :slight_smile:

fig1

fig2

Before all of this, I reduced the mT5 vocab size to 35k following this blog: https://towardsdatascience.com/how-to-adapt-a-multilingual-t5-model-for-a-single-language-b9f94f3d9c90.
This question is my next step, but I’m stuck.