Adapter-transformers vs transformers

I was trying to fine-tune roberta-base for a binary classification task, following the steps in Training An Adapter for ROBERTa Model. But I did not realize that the tutorial uses adapter-transformers instead of transformers, and my model was not trained correctly. When I switched to adapter-transformers, my metrics improved.
Is it possible to achieve the same results using transformers?
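For context, this is roughly what the adapter setup looks like in adapter-transformers (a rough sketch; the adapter/head name `binary_task` is just a placeholder). The key point is that `train_adapter` freezes the base model, which is why plain full fine-tuning with transformers behaves differently:

```python
# adapter-transformers is a fork that replaces the transformers package,
# so the import path is still "transformers"
from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
model.add_adapter("binary_task")                            # add a bottleneck adapter
model.add_classification_head("binary_task", num_labels=2)  # binary classification head
model.train_adapter("binary_task")                          # freeze base weights; train only adapter + head
```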

Maybe using PEFT? cc @smangrul
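If you want to try PEFT on top of plain transformers, here is a minimal sketch with LoRA for binary classification (the rank, alpha, dropout, and target module names are illustrative choices, not tuned values):

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# LoRA configuration; hyperparameters here are assumptions for illustration
config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],  # attention projections in RoBERTa
)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # only LoRA weights + classifier head are trainable
```

Note that LoRA is a different method from the bottleneck adapters in adapter-transformers, so the results won't be identical, but it plays the same parameter-efficient role while staying on the standard transformers library.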