Two transformers in one model

Say I want to train a large Keras model with two transformers in it. I would need to feed it two separate token ID tensors and two separate attention masks.

Can I set the expected input names to something other than `input_ids` and `attention_mask` to disambiguate the two sets of inputs?
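
For concreteness, here is a rough sketch of what I have in mind. The checkpoints and the input names `ids_a`, `mask_a`, `ids_b`, `mask_b` are just placeholders I made up, and I'm assuming `TFAutoModel` from the `transformers` library:

```python
import tensorflow as tf
from transformers import TFAutoModel

# Two encoders (placeholder checkpoints; they could also be different models).
encoder_a = TFAutoModel.from_pretrained("bert-base-uncased")
encoder_b = TFAutoModel.from_pretrained("bert-base-uncased")

# Keras Input layers with custom names to tell the two input sets apart.
ids_a = tf.keras.Input(shape=(None,), dtype=tf.int32, name="ids_a")
mask_a = tf.keras.Input(shape=(None,), dtype=tf.int32, name="mask_a")
ids_b = tf.keras.Input(shape=(None,), dtype=tf.int32, name="ids_b")
mask_b = tf.keras.Input(shape=(None,), dtype=tf.int32, name="mask_b")

# Each transformer is called with its own keyword arguments, so the
# Keras input-layer names above are independent of the transformer's
# internal argument names (input_ids, attention_mask).
out_a = encoder_a(input_ids=ids_a, attention_mask=mask_a).last_hidden_state
out_b = encoder_b(input_ids=ids_b, attention_mask=mask_b).last_hidden_state

# Combine the two [CLS] representations, e.g. by concatenation.
merged = tf.keras.layers.Concatenate()([out_a[:, 0], out_b[:, 0]])
logits = tf.keras.layers.Dense(2)(merged)

model = tf.keras.Model(inputs=[ids_a, mask_a, ids_b, mask_b], outputs=logits)
```

If that works, `model.fit` should accept a dict keyed by those custom names (`"ids_a"`, `"mask_a"`, `"ids_b"`, `"mask_b"`), which is exactly the disambiguation I'm after.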