I have a well-working classifier and wanted to try the ROC AUC score as a metric. Weirdly, this seems to break the code. Does anyone know why?
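A common cause of this is that `sklearn.metrics.roc_auc_score` expects probabilities (or decision scores), while the Trainer's `compute_metrics` receives raw logits, and for more than two classes it additionally needs the full probability matrix plus a `multi_class` argument. Below is a minimal sketch of a `compute_metrics` function handling both cases; the toy logits and labels at the bottom are made up for illustration and stand in for what the Trainer would pass:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # roc_auc_score wants probabilities, not raw logits,
    # so apply a (numerically stable) softmax first.
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    if probs.shape[1] == 2:
        # Binary case: pass only the positive-class probability.
        auc = roc_auc_score(labels, probs[:, 1])
    else:
        # Multiclass case: pass the full matrix and choose a scheme.
        auc = roc_auc_score(labels, probs, multi_class="ovr")
    return {"roc_auc": auc}

# Toy example: fake binary logits and labels.
logits = np.array([[2.0, -1.0], [0.5, 1.5], [-1.0, 2.0], [1.0, 0.0]])
labels = np.array([0, 1, 1, 0])
print(compute_metrics((logits, labels)))
```

If you pass the logits straight through (or hand a 2-column array to the binary scorer), `roc_auc_score` raises an error, which would explain why switching metrics "breaks" otherwise working code.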