How to create a custom optimizer, and how does the training arguments' optimizer interact with a custom one?

Currently I am trying to add a custom optimizer to the training by passing an AdamW optimizer and an lr_scheduler to the Seq2SeqTrainer. However, I've seen that Seq2SeqTrainer has an optimizers argument and that Seq2SeqTrainingArguments also has a default optimizer. I wonder how they interact?
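
Here is roughly what I have right now (the model name, scheduler choice, and train_dataset are just placeholders for my actual setup):

```python
from torch.optim import AdamW
from transformers import (
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    get_linear_schedule_with_warmup,
)

# "t5-small" and train_dataset stand in for my real model and tokenized data
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

training_args = Seq2SeqTrainingArguments(
    output_dir="./outputs",
    num_train_epochs=3,
    per_device_train_batch_size=8,
)

# My custom optimizer and scheduler
optimizer = AdamW(model.parameters(), lr=5e-5)
lr_scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,
    num_training_steps=1000,  # rough estimate of total training steps
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # placeholder for my dataset
    optimizers=(optimizer, lr_scheduler),  # passing my custom optimizer + scheduler here
)
trainer.train()
```

Does the optimizer passed through `optimizers` simply override whatever `Seq2SeqTrainingArguments` would otherwise create, or do settings like `learning_rate` and `optim` in the training arguments still have an effect?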
Moreover, I would also like the model to iterate over multiple datasets during training. Is that possible? A rough sketch of what I have in mind is below.
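
For context, what I was imagining is something like concatenating the datasets with the datasets library before handing them to the trainer (a toy sketch with made-up data; I'm not sure this is the intended approach):

```python
from datasets import Dataset, concatenate_datasets

# Toy stand-ins for my real tokenized datasets (both share the same columns)
ds_a = Dataset.from_dict({"input_ids": [[1, 2, 3]], "labels": [[4, 5]]})
ds_b = Dataset.from_dict({"input_ids": [[6, 7]], "labels": [[8, 9, 10]]})

# Combine them so the trainer sees a single train_dataset
combined = concatenate_datasets([ds_a, ds_b])
print(len(combined))  # 2
```

Or is there a recommended way to have the Trainer cycle through several datasets (e.g. one per epoch) instead of merging them up front?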