Hope this is the right place to ask about this. The documentation says that the default value for position_biased_input in DeBERTa-v2 is False, but in the source above the default value is True.
Will do. Actually, I have found an additional issue with position_biased_input. I don't know if it is the intended behaviour, but setting it to True adds a position embedding layer and therefore a maximum input length. However, the tokenizer does not change accordingly, which could lead to issues.
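A minimal sketch of the mismatch described above, assuming the transformers library with DebertaV2Config and DebertaV2Model is available (the tiny config values here are arbitrary, chosen only to keep the randomly-initialised model cheap to build):

```python
from transformers import DebertaV2Config, DebertaV2Model

# The code default for position_biased_input is True, even though the
# documentation states False:
cfg = DebertaV2Config(
    vocab_size=100,            # hypothetical tiny config for illustration
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
    max_position_embeddings=16,
)
print(cfg.position_biased_input)  # True (from the source default)

# With position_biased_input=True the embedding module holds an absolute
# position embedding table of size max_position_embeddings (16 here), so
# longer inputs cannot be embedded -- yet the tokenizer is built
# independently and is unaware of that limit.
model = DebertaV2Model(cfg)
print(model.embeddings.position_embeddings)
```

With position_biased_input=False the position embedding table is not used, which is why the limit only appears once the flag is enabled while the tokenizer stays the same.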