Out of memory when fine-tuning BERT

We are trying to fine-tune a BERT-based model named "AlephBERT" on a local machine with 6 GB of VRAM, and we get a "CUDA out of memory" error.

We found in the docs that we might need to reduce max_seq_length, but we cannot figure out from Hugging Face's documentation how to change that parameter.
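For context, here is our current understanding of why this parameter matters (a rough sketch we put together ourselves, not something confirmed by the docs): self-attention stores score matrices whose size grows quadratically with sequence length, so capping the length should shrink activation memory sharply. We believe the knob is the tokenizer's `truncation`/`max_length` arguments, e.g. something like `tokenizer(texts, truncation=True, max_length=128)`, but please correct us if that is the wrong place. The toy calculation below (our own, with made-up batch/head numbers) illustrates the scaling we expect:

```python
def attention_activation_bytes(batch_size, num_heads, seq_len, bytes_per_elem=4):
    """Rough size of one layer's attention score tensor:
    shape (batch_size, num_heads, seq_len, seq_len), fp32 by default.
    This ignores everything else the model stores; it is only meant to
    show how seq_len dominates."""
    return batch_size * num_heads * seq_len * seq_len * bytes_per_elem

# Example: BERT-base has 12 attention heads; batch size 8 is arbitrary.
at_512 = attention_activation_bytes(8, 12, 512)
at_256 = attention_activation_bytes(8, 12, 256)

# Halving max_seq_length should cut this tensor's memory by 4x.
assert at_512 == 4 * at_256
print(f"seq_len=512: {at_512 / 1e6:.0f} MB per layer (this tensor alone)")
print(f"seq_len=256: {at_256 / 1e6:.0f} MB per layer")
```

If this reasoning is right, lowering max_seq_length (and/or the batch size) should be the first thing to try on a 6 GB card, but we would appreciate confirmation of where exactly to set it.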
Any help or explanation is much appreciated.

Thank you!