When an error happens during training with a Seq2SeqTrainer object, the stack trace is wrapped at 80 characters and rendered with this kind of fancy formatting:
╭───────────────────── Traceback (most recent call last) ──────────────────────╮
│ /lustrefs/disk/home/nhongcha/hf-caption/train.py:96 in <module> │
│ │
│ 93 │ with open(config_path, "r", encoding="utf8") as f: │
│ 94 │ │ config = json.load(f) │
│ 95 │ │
│ ❱ 96 │ model = CachedFeatureDecoderModel( │
│ 97 │ │ DINOPretrained(), │
│ 98 │ │ AutoModelForCausalLM.from_pretrained(text_decode_model) │
│ 99 │ ) │
│ │
│ /project/lt200060-capgen/palm/conda_envs/.conda/envs/palm_caption/lib/python │
│ 3.8/site-packages/transformers/models/vision_encoder_decoder/modeling_vision │
│ _encoder_decoder.py:175 in __init__ │
│ │
│ 172 │ │ │ config = VisionEncoderDecoderConfig.from_encoder_decoder_c │
│ 173 │ │ else: │
│ 174 │ │ │ if not isinstance(config, self.config_class): │
│ ❱ 175 │ │ │ │ raise ValueError(f"Config: {config} has to be of type │
│ 176 │ │ │
│ 177 │ │ if config.decoder.cross_attention_hidden_size is not None: │
│ 178 │ │ │ if config.decoder.cross_attention_hidden_size != config.en │
╰──────────────────────────────────────────────────────────────────────────────╯
Which looks cool but is absolutely useless when you try to debug the issue and the code is 5-6 indent levels deep, especially on a monitor bigger than 640x480.
I've already looked at this python - How to make typer traceback look normal - Stack Overflow, but it doesn't help. Specifically, neither export _TYPER_STANDARD_TRACEBACK=1 nor export TYPER_STANDARD_TRACEBACK=1 changes anything.
My hunch is that it comes from Hugging Face transformers, but it could also be something else like datasets or evaluate. I can't find anything useful so far.
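In case it helps narrow things down, here is a minimal sketch of what I've been poking at. My assumption (not confirmed) is that some imported library installs a custom sys.excepthook, which is how rich draws these boxed tracebacks, so inspecting and restoring the hook should reveal and undo the culprit:

```python
import sys

# After importing the suspect libraries (transformers, datasets, evaluate, ...),
# check which module owns the current exception hook. The interpreter default
# lives in the sys module; a value like "rich.traceback" would point at rich.
print(sys.excepthook.__module__)

# Blunt workaround: restore the plain-text interpreter default.
sys.excepthook = sys.__excepthook__
```

This only resets the hook for the current process, and it has to run after whatever import installs the fancy handler, so it's a diagnostic rather than a real fix.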
How can I make the stack trace print everything it needs to again?
Edit
Specifically, I want it to look like this:
Traceback (most recent call last):
File "<input>", line 1, in <module>
Exception
instead of:
╭───────────────────── Traceback (most recent call last) ──────────────────────╮
╰──────────────────────────────────────────────────────────────────────────────╯