I’m working with T5/MT5 models and I want to deploy T5 on TF Serving, which serves models from the SavedModel format.
I found that the SavedModel Hugging Face exports is different from the original tf-t5 SavedModel.
-
In Hugging Face, I use
TFT5ForConditionalGeneration.save_pretrained(path-to-save, saved_model=True)
to export the SavedModel.
The SavedModel details are shown below:
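For context, the full export I run looks roughly like this (a sketch; I'm using the public `t5-small` checkpoint and an `exported_t5` output directory here as stand-ins for my actual model and path):

```python
from transformers import TFT5ForConditionalGeneration

# Load the TF variant of the model (t5-small here as a stand-in for my model).
model = TFT5ForConditionalGeneration.from_pretrained("t5-small")

# saved_model=True writes a TF SavedModel under <path>/saved_model/1
# in addition to the usual tf_model.h5 weights file.
model.save_pretrained("exported_t5", saved_model=True)
```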
-
I use the original T5 repo's export.sh to export a SavedModel:
see fig1 in the linked discussion (as a new user I can only post one image).
The SavedModel details are shown as:
see fig2 in the linked discussion.
So, how can I make Hugging Face Transformers export a SavedModel with the same signatures as the original T5 export?
Or, how can I convert the Hugging Face .h5 model to an original TF checkpoint?
Any help would be appreciated.