Saving the model into a single .h5 file (or TensorFlow Lite)

Hi there, I saved the model using the following code, which loads a pre-trained model, saves it, and then loads/saves it as the TF version:

from transformers import AutoModelForImageClassification
model = AutoModelForImageClassification.from_pretrained('google/vit-base-patch16-224')
model.save_pretrained("my_model")

from transformers import TFAutoModelForImageClassification

tf_model = TFAutoModelForImageClassification.from_pretrained("my_model", from_pt=True)
tf_model.save_pretrained("my_model_tf")

In my_model_tf there are 2 different files:

  1. config.json, which contains the “characteristics” of the model
  2. tf_model.h5, which contains the weights of the model

When I try to load tf_model.h5, I get the following error:

ValueError: No model config found in the file at <tensorflow.python.platform.gfile.GFile object at 0x7f83009fb110>

Is there a way to obtain a SINGLE .h5 file that contains the whole model?

Hi @dgrnd4! There are some complexities here: our models are implemented by subclassing, which gives users a lot of flexibility but means that a full implementation of the model cannot be saved, even as a SavedModel. However, you can save specific concrete functions, which is probably what you want if your goal is a model artifact for inference or TFLite conversion. You can see the documentation here. For this use case, I recommend calling TF/Keras methods like model.save directly, which gives you more low-level control over the exported signatures. You will probably also want to set a signature that matches the exact input tensors and shapes you intend to use for your TFLite model.
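As a very rough, unsupported sketch of what that could look like (the input name pixel_values and the (1, 3, 224, 224) shape are assumptions for the ViT checkpoint you're using, so adjust them to your own model):

import tensorflow as tf
from transformers import TFAutoModelForImageClassification

tf_model = TFAutoModelForImageClassification.from_pretrained("my_model_tf")

# Wrap the forward pass in a tf.function with a fixed input signature.
# "pixel_values" and the (1, 3, 224, 224) shape are assumptions for ViT.
@tf.function(input_signature=[tf.TensorSpec([1, 3, 224, 224], tf.float32, name="pixel_values")])
def serving_fn(pixel_values):
    return {"logits": tf_model(pixel_values).logits}

# Export a SavedModel whose serving signature is exactly that concrete function.
tf.saved_model.save(
    tf_model,
    "my_model_export",
    signatures={"serving_default": serving_fn.get_concrete_function()},
)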

Also, good luck if you're attempting a TFLite conversion! We don't officially support it, but I suspect some of our models will convert okay.
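If you do attempt it, an untested sketch of the standard route would look something like this (the paths are just the ones from this thread, and SELECT_TF_OPS is there as a fallback for ops that have no TFLite builtin):

import tensorflow as tf
from transformers import TFAutoModelForImageClassification

tf_model = TFAutoModelForImageClassification.from_pretrained("my_model_tf")

# Plain Keras-model conversion; allow falling back to full TF ops
# for anything without a TFLite builtin equivalent.
converter = tf.lite.TFLiteConverter.from_keras_model(tf_model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

with open("model.tflite", "wb") as f:
    f.write(converter.convert())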

Hi @Rocketknight1, thanks for the reply.

What I was thinking is to create a Keras model so that, at the end, I can use the model.save("model.h5") function to save everything in one file.

What do you mean by “save specific concrete functions”? The documentation you sent me is about a model that has already been “converted” to TensorFlow, so the model I have right now cannot be used with that function, because it is an object.

I'm PROBABLY wrong, so sorry in advance :sweat_smile:

Ah, it might be my fault! What is the ultimate goal here, exactly? Are you trying to convert your model to TFLite? Or do you just want to save the model and reload it again with from_pretrained()?

Yes, @Rocketknight1, you're right. My goal is to convert my model folder (config.json + tf_model.h5) into a single file in “.h5” or TFLite format.

This is my project if you want to look at it!

So, if you look at the “Load Model” section of my Colab file, I have the model saved in 3 different ways: from there, how can I convert it into a single TFLite or .h5 file?

Hi @Rocketknight1, maybe I'm getting close to the solution!

Do you know how I can “extract” concrete functions from my model?
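From the TF docs on concrete functions, my best guess so far is something like the sketch below (the pixel_values name and the (1, 3, 224, 224) shape are just my assumptions for ViT); is this the right direction?

import tensorflow as tf
from transformers import TFAutoModelForImageClassification

tf_model = TFAutoModelForImageClassification.from_pretrained("my_model_tf")

# Trace one concrete function with a fixed input shape (my guess for ViT).
@tf.function(input_signature=[tf.TensorSpec([1, 3, 224, 224], tf.float32, name="pixel_values")])
def serving_fn(pixel_values):
    return {"logits": tf_model(pixel_values).logits}

concrete_fn = serving_fn.get_concrete_function()

# Convert that single concrete function into one .tflite file.
# (Passing tf_model as the second argument keeps its variables alive;
# older TF versions accept only the list of functions.)
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn], tf_model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

with open("vit_model.tflite", "wb") as f:
    f.write(converter.convert())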