Keras callback error and model config 'NoneType' after training

Hello there! :grinning:
Thanks for enhancing the world of translation with the Transformers library! :earth_americas: :earth_africa: :earth_asia:


  • Context. Fine-tuning t5-small on the opus100 dataset, following this script, which is a slight modification of the TF example.

Training completes correctly, but afterwards I hit this issue:

```
Traceback (most recent call last):
  File "/../The-Lord-of-The-Words-The-two-frameworks/src/models/", line 742, in <module>
  File "/../The-Lord-of-The-Words-The-two-frameworks/src/models/", line 695, in main
    history =, epochs=int(training_args.num_train_epochs), callbacks=callbacks)
  File "/../The-Lord-of-The-Words-The-two-frameworks/.venv3.9/lib/python3.9/site-packages/keras/utils/", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/.../The-Lord-of-The-Words-The-two-frameworks/.venv3.9/lib/python3.9/site-packages/transformers/", line 227, in on_epoch_end
    predictions = self.model.generate(
  File "/../The-Lord-of-The-Words-The-two-frameworks/.venv3.9/lib/python3.9/site-packages/transformers/generation/", line 874, in generate
    and generation_config.min_length > generation_config.max_length
TypeError: '>' not supported between instances of 'int' and 'NoneType'
```

When I visit the line in question, I can see that it compares lengths from the generation config.
Right now my hypothesis is that generation_config is not being created properly. Does anyone know if this is reasonable?
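For what it's worth, the error itself is just Python refusing to order an `int` against `None`. A minimal reproduction outside of transformers (variable names are mine, mirroring the traceback) looks like:

```python
# Minimal reproduction of the failure mode from the traceback:
# min_length has a concrete value, but max_length was never populated,
# so the `min_length > max_length` sanity check blows up.
min_length = 0
max_length = None  # never filled in by the model / generation config

try:
    min_length > max_length
except TypeError as exc:
    print(f"TypeError: {exc}")

# A defensive guard skips the comparison while max_length is unset:
if max_length is not None and min_length > max_length:
    raise ValueError("min_length must not exceed max_length")
```

So any code path that leaves `max_length` as `None` while `min_length` is an `int` will trip this exact `TypeError`.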

I thought the error was related to the model configuration (that it was not being created), so I tried passing the script flag --config_name t5-small, but the error persisted.

Any hints on what's going on at this point, or where I should go from here? :pray::pray:
Thanks in advance!

Managed to solve the issue.
It turned out to be related to max_target_length during prediction.
Setting this parameter to 64 solved it.
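In case it helps anyone hitting the same thing, the fix boils down to making sure a concrete maximum length always reaches generation instead of `None`. A small illustrative helper (my own sketch, not the transformers implementation; the name `resolve_max_length` and the default of 64 are hypothetical, mirroring the max_target_length=64 fix above):

```python
def resolve_max_length(configured_max_length, default=64):
    """Return a usable max generation length.

    If nothing was configured (None), fall back to an explicit default
    instead of letting comparisons against None raise a TypeError later.
    """
    if configured_max_length is None:
        return default
    return configured_max_length


# Unset config falls back to the default; an explicit value wins.
print(resolve_max_length(None))  # 64
print(resolve_max_length(128))   # 128
```

Passing max_target_length explicitly plays the same role: it guarantees the generation config's `max_length` is an `int` before the `min_length > max_length` check runs.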