Potential error in the documentation relating to Deberta-v2 position_biased_input

Hope this is the right place to ask about this. The documentation says that the default value of `position_biased_input` in DeBERTa-v2 is `False`, but the source linked above says the default is `True`.
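For reference, here is a quick way to check the source default directly (a minimal sketch, assuming a recent `transformers` install):

```python
from transformers import DebertaV2Config

# Instantiate with no arguments to inspect the in-code default,
# which the docs claim is False
config = DebertaV2Config()
print(config.position_biased_input)
```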


I could of course be wrong here, but I thought it was better to raise the issue.

Hi,

That is indeed a bug. Could you open an issue on GitHub?

Will do. Actually, I have found an additional issue with `position_biased_input`. I don't know if it is the intended behaviour, but setting it to `True` adds a position embedding layer and therefore a maximum length for the input. However, the tokenizer is not updated accordingly, which could lead to issues.
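To illustrate the mismatch (a sketch, assuming `transformers` is installed; the concrete values here are hypothetical, chosen only for illustration):

```python
from transformers import DebertaV2Config

# With position_biased_input=True, the model builds a position-embedding
# table with max_position_embeddings rows, so inputs longer than that
# cannot be embedded
cfg = DebertaV2Config(position_biased_input=True, max_position_embeddings=512)

# Nothing here touches the tokenizer: its model_max_length is configured
# separately, so it may still produce sequences longer than 512 tokens
print(cfg.position_biased_input, cfg.max_position_embeddings)
```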