Training llama2-7b-chat: is my model overfitting, or not learning anything? How can I train it better?

[Screenshot: training and eval loss curves]

I am training a llama2-7b-chat with the SFTTrainer, but my losses look like the curves above. I don’t think my model is learning anything…? Any insights on how to improve/fix this?

My config is the following:
```
lora_r = 8
lora_alpha = 8

lr_scheduler_type = "constant"
num_train_epochs = 3
training_data = 900
use_4bit = True
max_seq_length = 2048
collate_data = True
```
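For reference, here's a minimal sketch of how those settings might map onto a `peft`/`trl`-style setup. The `output_dir` and `task_type` values are my own assumptions, not from the post above:

```python
from peft import LoraConfig
from transformers import TrainingArguments

# LoRA hyperparameters as posted
peft_config = LoraConfig(
    r=8,                      # lora_r
    lora_alpha=8,             # lora_alpha
    task_type="CAUSAL_LM",    # assumed for llama2-7b-chat
)

training_args = TrainingArguments(
    output_dir="./results",   # assumed
    num_train_epochs=3,
    lr_scheduler_type="constant",
)

# max_seq_length=2048 and 4-bit loading are typically handled by
# SFTTrainer and the model-loading call, depending on your trl version.
```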

I have also tried other combinations of lora_r and lora_alpha, and I end up with a graph similar to the one attached.

I am using 900 training instances.

Any help will be appreciated, thanks!

Your eval loss decreasing along with your train loss seems like a good sign, but to check for overfitting you should also test your model and see how it responds to inputs it hasn’t seen. I don’t think loss plateauing like that is hugely abnormal, but loss by itself might not give you enough information to detect overfitting.


Large Lemming Movements always overfit, can’t explain themselves, can’t reason — LLMs are “stochastic parrots.”

It seems like your training and eval loss are converging, which is usually a good sign. I do have a few questions, though:
Are you splitting your dataset into train and test subsets? This may be a silly question, but to get an accurate evaluation you need to make sure the model is evaluated on examples it hasn’t seen before, and that it isn’t training on those examples when you run them through it.
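As a sketch of that split (plain Python here for illustration; with a Hugging Face `datasets.Dataset` you’d typically call `train_test_split(test_size=...)` instead):

```python
import random

def split_dataset(examples, test_fraction=0.1, seed=42):
    """Shuffle and split examples into disjoint train/test subsets."""
    rng = random.Random(seed)
    indices = list(range(len(examples)))
    rng.shuffle(indices)
    n_test = int(len(examples) * test_fraction)
    test_idx = set(indices[:n_test])
    train = [ex for i, ex in enumerate(examples) if i not in test_idx]
    test = [ex for i, ex in enumerate(examples) if i in test_idx]
    return train, test

# With the 900 instances mentioned above, a 10% split gives 810 train / 90 test
train, test = split_dataset([f"example_{i}" for i in range(900)])
```

The key point is that the two subsets are disjoint, so eval loss is measured only on examples the model never trained on.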
Another question: have you configured dropout? If you’re concerned about overfitting, a low, non-zero dropout helps prevent it by randomly dropping nodes during training. You can also apply dropout at inference time (Monte Carlo dropout), though small models tend to produce lower-quality results with that technique.
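In a `peft` LoRA setup, dropout on the adapter layers is set via `lora_dropout`; a sketch, where the 0.05 value is just an illustrative assumption, not a recommendation from this thread:

```python
from peft import LoraConfig

# Small non-zero dropout on the LoRA layers to regularize training;
# the 0.05 value here is an assumption, tune it for your data
peft_config = LoraConfig(
    r=8,
    lora_alpha=8,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
```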

Hope that helps!