Facebook BART Fine-tuning - Transformers CUDA error: CUBLAS_STATUS_NOT_INITIALIZED

hi @LidorPrototype ,

First, check that the token IDs produced from your dataset fit within facebook/bart-base's vocab size.
Second, check that the words in your dataset are covered by the model's vocabulary.
Third, check that your model, dataloader, and tokenizer configs are correct. Many of these errors come down to a wrong config path or a wrong config setting.

As this thread shows, you can hit this error when your dataset contains tokens that fall outside the vocabulary the model supports.
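A quick sanity check for that first point is a minimal sketch like the one below. The helper `find_out_of_range_ids`, the sample token IDs, and the hard-coded 50265 are my own illustration, not from the thread; in practice read the real size from `model.get_input_embeddings().num_embeddings` and feed in your actual encoded dataset.

```python
def find_out_of_range_ids(batches, vocab_size):
    """Return (batch_index, token_id) pairs that fall outside [0, vocab_size)."""
    bad = []
    for i, ids in enumerate(batches):
        for tid in ids:
            if tid < 0 or tid >= vocab_size:
                bad.append((i, tid))
    return bad

# facebook/bart-base has a 50265-row embedding table (check it on your model!)
vocab_size = 50265

# Hypothetical encoded examples; replace with your own dataset's input_ids.
batches = [[0, 31414, 6, 2], [0, 52009, 2]]

print(find_out_of_range_ids(batches, vocab_size))  # prints [] when all IDs are in range
```

Any pair this returns points at an example that will make the embedding lookup fail on GPU.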

If you want to pin down the cause, debug with something simple like the example code from the docs.
Step through it with your own dataset and you will get a more detailed traceback.
(Some data outside the vocab, a mismatched config, etc…)
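One reason the traceback is unhelpful on GPU is that CUDA reports errors asynchronously, so the cuBLAS message is often not the real failure. A common debugging technique (my suggestion, not from the original reply) is to run the suspect batch on CPU, where an out-of-range token ID fails immediately in the embedding lookup with a clear message. Toy sketch:

```python
import torch

# A 10-entry embedding table standing in for the model's input embeddings.
emb = torch.nn.Embedding(num_embeddings=10, embedding_dim=4)

# Token ID 99 is out of range; on CPU this raises a clear IndexError,
# whereas on GPU it can surface later as an opaque cuBLAS/CUDA error.
ids = torch.tensor([[1, 2, 99]])
try:
    emb(ids)
except IndexError as e:
    print("caught:", e)
```

Alternatively, setting the environment variable `CUDA_LAUNCH_BLOCKING=1` makes CUDA kernels run synchronously, so the GPU traceback points at the op that actually failed.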

BART docs page

from transformers import AutoTokenizer, BartModel
import torch

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = BartModel.from_pretrained("facebook/bart-base")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

last_hidden_states = outputs.last_hidden_state