Can't instantiate DeBERTa model in TensorFlow with 'mixed_float16' global policy

I'm trying to instantiate a DeBERTa model in TensorFlow for mixed-precision training. It instantiates fine under the 'float32' policy, but raises the following InvalidArgumentError when the global policy is set to 'mixed_float16':

Here is my code, where I load the model offline:

import transformers
from tensorflow.keras import mixed_precision

# Enable mixed precision before building the model
mixed_precision.set_global_policy('mixed_float16')

max_length = 768
path = '../input/deberta-v3-large/deberta-v3-large'

# Tokenizer
tokenizer = transformers.AutoTokenizer.from_pretrained(path, use_fast=False)

# Config
config = transformers.AutoConfig.from_pretrained(path, output_hidden_states=True)

# Setting dropouts to 0 for regression task
config.hidden_dropout_prob = 0
config.attention_probs_dropout_prob = 0

# Fails here under the 'mixed_float16' policy
deberta_model = transformers.TFAutoModel.from_pretrained(path, config=config)
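One workaround I've been considering (not verified for DeBERTa specifically, so treat it as a sketch): keep the global policy at 'float32' while loading the pretrained backbone, then switch to 'mixed_float16' before building the task head, so only the layers created afterwards use float16 compute. A minimal example with plain Keras layers, standing in for the backbone and regression head:

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

# Layers created under the float32 policy compute in float32
mixed_precision.set_global_policy('float32')
backbone = tf.keras.layers.Dense(4)  # stand-in for the pretrained model

# Layers created after switching compute in float16, with float32 variables
mixed_precision.set_global_policy('mixed_float16')
head = tf.keras.layers.Dense(1)      # stand-in for the regression head

x = tf.ones((2, 4))
h = backbone(x)   # float32 activations
y = head(h)       # input is cast to float16 by the mixed policy

print(backbone.compute_dtype)  # float32
print(head.compute_dtype)      # float16
print(y.dtype)                 # <dtype: 'float16'>
```

The policy is captured per layer at construction time, so the order of `set_global_policy` calls relative to layer creation is what matters here.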
