You cannot specify both input_ids and inputs_embeds at the same time

I upgraded to the newest version of transformers, and now my pretrained models are broken.

I used to be able to run:

from transformers import TFRobertaForSequenceClassification, RobertaTokenizer

model = TFRobertaForSequenceClassification.from_pretrained('saved_models/RoBERTa_loglikes_critic_2.0')
tokenizer = RobertaTokenizer.from_pretrained('distilroberta-base')
string = "this is a test"
encoded_string = tokenizer.encode(string)
model(encoded_string)

But now I get the error:
ValueError: You cannot specify both input_ids and inputs_embeds at the same time
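
Digging a bit, I noticed that tokenizer.encode returns a plain Python list of token IDs rather than a tensor, so my guess is that the new version is unpacking that list element by element into the model's positional arguments, which would explain why it thinks inputs_embeds was set as well:

encoded_string = tokenizer.encode("this is a test")
print(type(encoded_string))  # <class 'list'>, one int per token including <s> and </s>

Wrapping the IDs in a tensor with an explicit batch dimension seems to avoid the error for this one-off case, though I don't know whether that is the intended usage now:

import tensorflow as tf

model(tf.constant([encoded_string]))  # shape (1, sequence_length), no error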

The same error also breaks several larger Keras models of mine that are set up like this:

import tensorflow as tf
from transformers import TFRobertaForSequenceClassification

tokens_input = tf.keras.Input(shape=(None,), dtype=tf.int64, name='tokens_input')
features_input = tf.keras.Input(shape=(5,), dtype=tf.int64, name='features_input')

model = TFRobertaForSequenceClassification.from_pretrained('saved_models/RoBERTa_loglikes_critic_2.0', output_attentions=True)
model.resize_token_embeddings(len(tokenizer))  # tokenizer from the snippet above
model.roberta.trainable = False

out1 = model(tokens_input)  # this call now raises the same ValueError

dense1 = tf.keras.layers.Dense(10, activation='relu', name='Features_Dense1')(features_input)
dense2 = tf.keras.layers.Dense(6, activation='relu', name='Features_Dense2')(dense1)
dense3 = tf.keras.layers.Dense(2, activation='relu', name='Features_Dense3')(dense2)

concat_layer = tf.keras.layers.Concatenate(name='concatenate')([out1, dense3])
final_layer = tf.keras.layers.Dense(1, name='concatenated_dense2')(concat_layer)
final_model = tf.keras.Model(inputs=[tokens_input, features_input], outputs=final_layer)
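
My best guess so far is to pass the tokens by keyword instead of positionally, and to take the logits from the output object that newer versions return; here is a sketch of what I'm experimenting with (untested, and the .logits attribute is my assumption about the new return type):

out1 = model(input_ids=tokens_input).logits  # keyword argument instead of positional

But I'm not confident this is right, and I'd rather understand what actually changed.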

What can I do to fix this?