Import error while loading a fine-tuned DistilBERT model

Hi,

While loading huggingface/distilbert-base-uncased-finetuned-mnli I get a ModuleNotFoundError.
Tracing the error back, it bottoms out at a missing module: 'keras.saving.hdf5_format'. I suspect this is due to my recent upgrade of Keras and TensorFlow to the latest versions.

Any recommendations other than rolling back to an older version of Keras?

Below are more details about what happens and the environment I am running on:

Chip (CPU/GPU) Type: Apple M1
OS: macOS Ventura 13.1
Python Version: 3.8.8 | packaged by conda-forge
TensorFlow Version: 2.8.0
TensorFlow Metal Version (plug-in): 0.7.0
Keras Version: 2.11.0
Transformers Version: 4.24.0

The code I am trying to run is:

import transformers
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_name = "huggingface/distilbert-base-uncased-finetuned-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = TFAutoModelForSequenceClassification.from_pretrained(model_name)

Error Trace:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1075         try:
-> 1076             return importlib.import_module("." + module_name, self.__name__)
   1077         except Exception as e:

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/__init__.py in import_module(name, package)
    126             level += 1
--> 127     return _bootstrap._gcd_import(name[level:], package, level)
    128 

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap.py in _gcd_import(name, package, level)

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap.py in _find_and_load(name, import_)

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap.py in _load_unlocked(spec)

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap_external.py in exec_module(self, module)

~/miniforge3/envs/mlm1-engine/lib/python3.8/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/distilbert/modeling_tf_distilbert.py in <module>
     33 )
---> 34 from ...modeling_tf_utils import (
     35     TFMaskedLanguageModelingLoss,

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/modeling_tf_utils.py in <module>
     38 from huggingface_hub import Repository, list_repo_files
---> 39 from keras.saving.hdf5_format import save_attributes_to_hdf5_group
     40 from transformers.utils.hub import convert_file_size_to_int, get_checkpoint_shard_files

ModuleNotFoundError: No module named 'keras.saving.hdf5_format'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
<ipython-input-31-9bf27b263aa0> in <module>
      6 model_name = "huggingface/distilbert-base-uncased-finetuned-mnli"
      7 tokenizer = AutoTokenizer.from_pretrained(model_name)
----> 8 model = TFAutoModelForSequenceClassification.from_pretrained(model_name)

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    460             )
    461         elif type(config) in cls._model_mapping.keys():
--> 462             model_class = _get_model_class(config, cls._model_mapping)
    463             return model_class.from_pretrained(
    464                 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py in _get_model_class(config, model_mapping)
    357 
    358 def _get_model_class(config, model_mapping):
--> 359     supported_models = model_mapping[type(config)]
    360     if not isinstance(supported_models, (list, tuple)):
    361         return supported_models

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py in __getitem__(self, key)
    588         if model_type in self._model_mapping:
    589             model_name = self._model_mapping[model_type]
--> 590             return self._load_attr_from_module(model_type, model_name)
    591 
    592         # Maybe there was several model types associated with this config.

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py in _load_attr_from_module(self, model_type, attr)
    602         if module_name not in self._modules:
    603             self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
--> 604         return getattribute_from_module(self._modules[module_name], attr)
    605 
    606     def keys(self):

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py in getattribute_from_module(module, attr)
    551     if isinstance(attr, tuple):
    552         return tuple(getattribute_from_module(module, a) for a in attr)
--> 553     if hasattr(module, attr):
    554         return getattr(module, attr)
    555     # Some of the mappings have entries model_type -> object of another model type. In that case we try to grab the

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/utils/import_utils.py in __getattr__(self, name)
   1064             value = self._get_module(name)
   1065         elif name in self._class_to_module.keys():
-> 1066             module = self._get_module(self._class_to_module[name])
   1067             value = getattr(module, name)
   1068         else:

~/miniforge3/envs/mlm1-engine/lib/python3.8/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1076             return importlib.import_module("." + module_name, self.__name__)
   1077         except Exception as e:
-> 1078             raise RuntimeError(
   1079                 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1080                 f" traceback):\n{e}"

RuntimeError: Failed to import transformers.models.distilbert.modeling_tf_distilbert because of the following error (look up to see its traceback):
No module named 'keras.saving.hdf5_format'
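For anyone hitting the same error: you can confirm the root cause by checking whether the module that transformers tries to import actually exists in your environment. This is a minimal sketch using only the standard library; the dotted module name is taken from the traceback above.

```python
import importlib.util

def module_exists(name: str) -> bool:
    """Return True if `name` resolves to an importable module, without importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is itself missing
        return False

# The import that transformers 4.24.0 performs at load time:
print(module_exists("keras.saving.hdf5_format"))
# Expect False on Keras 2.11, True on Keras <= 2.10.
```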

I am not sure whether answering my own question is good practice in this community, but sharing the fix might be useful to others.

The latest Keras release (2.11.0 as of this writing) restructured its saving code, so keras.saving.hdf5_format no longer exists at that path, which is why the import inside transformers fails. I rolled back to an older version of Keras and it is working now.
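For reference, the rollback itself is a one-liner. The exact pin below is an assumption based on my environment (TensorFlow 2.8.0); choose the Keras release that matches your installed TensorFlow, since the two are versioned in lockstep.

```shell
# Roll Keras back to a release that still ships keras.saving.hdf5_format.
# 2.8.0 matches TensorFlow 2.8.0; adjust the pin to your TF version.
pip install "keras==2.8.0"

# Verify that the module transformers needs is importable again.
python -c "import keras.saving.hdf5_format; print('ok')"
```

Alternatively, newer transformers releases reportedly adjusted this import, so upgrading transformers instead of downgrading Keras may also resolve it.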