Location of the script for "Training an Abstractive Summarization Model"

Hi, I have a silly question.

I want to try the scripts from "Training an Abstractive Summarization Model", but somehow I could not find the GitHub URL for them.

Specifically, I want to run the script below from that page.

python \
--mode abstractive \
--model_name_or_path bert-base-uncased \
--decoder_model_name_or_path bert-base-uncased \
--cache_file_path data \
--max_epochs 4 \
--do_train --do_test \
--batch_size 4 \
--weights_save_path model_weights \
--no_wandb_logger_log_model \
--accumulate_grad_batches 5 \
--use_scheduler linear \
--warmup_steps 8000 \
--gradient_clip_val 1.0 \
--custom_checkpoint_every_n 300
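(As a side note on the flags above: with gradient accumulation, the optimizer only steps every `accumulate_grad_batches` batches, so the effective batch size is `batch_size * accumulate_grad_batches`. A quick sanity check, assuming the values from the command:)

```python
# Effective batch size implied by the flags above:
# each optimizer step sees batch_size * accumulate_grad_batches examples.
batch_size = 4              # --batch_size 4
accumulate_grad_batches = 5  # --accumulate_grad_batches 5

effective_batch_size = batch_size * accumulate_grad_batches
print(effective_batch_size)  # 20
```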

Could someone give me the GitHub URL?

Thanks in advance.

Hi @kouohhashi, as far as I know the summarisation scripts have been migrated to the seq2seq examples here: transformers/examples/seq2seq at master · huggingface/transformers · GitHub.

There you can find BART, T5, and Pegasus, although I suggest starting with Pegasus since it produces decent summaries and is relatively light at ~500M params.
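(If you just want to try summarisation quickly before fine-tuning anything, a minimal sketch using the transformers `pipeline` API with a Pegasus checkpoint looks like this. The model name `google/pegasus-xsum` is one example checkpoint; the first run downloads the weights, which are sizeable.)

```python
from transformers import pipeline

# Load a pretrained Pegasus summarisation pipeline.
# "google/pegasus-xsum" is one publicly available checkpoint;
# swap in a BART or T5 checkpoint the same way.
summarizer = pipeline("summarization", model="google/pegasus-xsum")

text = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and the tallest structure in Paris. Its base is square, "
    "measuring 125 metres on each side."
)

# Generate a short abstractive summary.
result = summarizer(text, max_length=60, min_length=10)
print(result[0]["summary_text"])
```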

Note that the page you linked to belongs to a different library from transformers; if you have questions about that library, I suggest you open an issue in its repo or forum.

Thank you.