Hey guys, I'm getting a baffling error:

ValueError: You have set args.eval_strategy to steps but you didn't pass an eval_dataset to Trainer. Either set args.eval_strategy to no or pass an eval_dataset.
My training code is:
training_args = TrainingArguments(
    output_dir="/kaggle/working/twitter-sentiment-analysis-llm",
    report_to="wandb",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
    eval_strategy="steps",
    save_strategy="steps",
    load_best_model_at_end=True,
    eval_steps=500,
    save_steps=500,
    logging_steps=10,
    fp16=True,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_ds['train'],
    eval_dataset=train_ds['test'],
    tokenizer=tokenizer,
    data_collator=data_collator,
    compute_metrics=compute_metrics,
)

trainer.train()
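For context, the check behind this error boils down to roughly the following (a simplified sketch for illustration, not the actual transformers source; the function name here is made up):

```python
def check_eval_args(eval_strategy, eval_dataset):
    """Simplified sketch of the Trainer consistency check that raises the
    ValueError above: any evaluation strategy other than "no" requires an
    eval_dataset to be present."""
    if eval_strategy != "no" and eval_dataset is None:
        raise ValueError(
            "You have set args.eval_strategy to steps but you didn't pass "
            "an eval_dataset to Trainer. Either set args.eval_strategy to "
            "no or pass an eval_dataset."
        )
```

So the error should only fire if the eval_dataset the Trainer sees is actually None at the time of the check.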
I double-checked eval_dataset and passed it exactly as shown above, yet I still get the error after all the training steps finish. Please help!
1 Like
There seem to be several patterns… I hope it’s not a new bug.
I’m trying to do a finetuning without an evaluation dataset.
For that, I’m using the following code:
training_args = TrainingArguments(
    output_dir=resume_from_checkpoint,
    evaluation_strategy="epoch",
    per_device_train_batch_size=1,
)

def compute_metrics(pred: EvalPrediction):
    labels = pred.label_ids
    preds = pred.predictions.argmax(-1)
    f1 = f1_score(labels, preds, average="weighted")
    acc = accuracy_score(labels, preds, average="…
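Aside: the truncated accuracy_score call above passes an average= keyword, but scikit-learn's accuracy_score has no such parameter (only f1_score does). A dependency-free sketch of what both metrics compute, for illustration only (these are not the scikit-learn implementations):

```python
from collections import Counter

def accuracy(labels, preds):
    """Fraction of positions where the prediction matches the label."""
    return sum(l == p for l, p in zip(labels, preds)) / len(labels)

def weighted_f1(labels, preds):
    """Per-class F1, averaged with weights equal to class support in labels
    (the behavior of f1_score(..., average="weighted"))."""
    support = Counter(labels)
    total = 0.0
    for c in set(labels):
        tp = sum(1 for l, p in zip(labels, preds) if l == c and p == c)
        fp = sum(1 for l, p in zip(labels, preds) if l != c and p == c)
        fn = sum(1 for l, p in zip(labels, preds) if l == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += support[c] * f1
    return total / len(labels)
```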
You can avoid this error by not specifying evaluation_strategy in TrainingArguments.
Thanks for your comment.
You’re right, but I’d like to use the evaluation step.
1 Like
+1
Any workarounds yet?
UPD: In my case it was a problem with the dataset split. Check len() on both your train and validation datasets.
1 Like
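The sanity check suggested above can be as simple as the following (train_ds here is a stand-in dict of lists, assuming a DatasetDict-like object with "train"/"test" splits as in the original snippet):

```python
# Verify both splits are non-empty before constructing the Trainer;
# an empty eval split can surface as a confusing evaluation error.
train_ds = {"train": list(range(8)), "test": list(range(2))}

for name in ("train", "test"):
    assert len(train_ds[name]) > 0, f"empty split: {name}"
```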
Should we raise an issue on GitHub for the datasets library?
It’s not clear whether it’s a bug or not, but at the very least the error message is not appropriate.
1 Like
Had the same issue; it was mentioned here and fixed here. pip install transformers==4.47.1 fixed it for me.
2 Likes
system closed December 19, 2024, 9:06am
This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.