Use transformer without position embeddings being added?

Hi,

I can’t seem to find any option to disable the position embeddings that get added in models like BERT and GPT-2. I’d like to use such a model for a non-NLP task where position is irrelevant. Is it possible to do this in Hugging Face?
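One workaround I’ve been considering (not sure it’s the intended way): zero out and freeze the learned position embedding table, so that it contributes nothing to the input and never updates during training. A minimal sketch with GPT-2, assuming `transformers` and `torch` are installed:

```python
import torch
from transformers import GPT2Config, GPT2Model

# Small config just for illustration; use your own settings in practice.
config = GPT2Config(n_layer=2, n_head=2, n_embd=64)
model = GPT2Model(config)

# Zero the learned position embeddings (wpe) and freeze them so they
# add nothing to the token embeddings and receive no gradient updates.
with torch.no_grad():
    model.wpe.weight.zero_()
model.wpe.weight.requires_grad = False
```

Is something like this reasonable, or is there a cleaner built-in option?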

Thanks