Hey Niels, thanks for the help.
Obviously, I am not the owner of those 20+ repositories; I just want to load existing models from the Hub. Are the models corrupted beyond repair? If so, how do we remove them from the Hub and avoid corrupted models being uploaded in the future? If not, how can they be loaded (e.g. if only the config is missing, can't we assume it is just the default t5-small config)?
Also, supplying a config doesn't seem to change anything (but I might be doing it the wrong way).
Tried:

```python
from transformers import AutoModelForSeq2SeqLM

AutoModelForSeq2SeqLM.from_pretrained("SvPolina/t5-small-finetuned-CANARD", config="t5-small")
```
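In case passing the config as a string is the problem, here is a hypothetical helper (a sketch, untested against these exact repos) that loads the default `t5-small` config explicitly and passes the config *object* to `from_pretrained`:

```python
def load_with_base_config(model_id, base_config="t5-small"):
    """Hypothetical helper: substitute the default t5-small config.

    Only helps when config.json is the missing piece; it cannot
    repair a corrupted or absent weights file.
    """
    # Import lazily so the helper can be defined without transformers installed.
    from transformers import AutoConfig, AutoModelForSeq2SeqLM

    config = AutoConfig.from_pretrained(base_config)
    return AutoModelForSeq2SeqLM.from_pretrained(model_id, config=config)
```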
For future reference, here are the models that simply fail to load from PyTorch (with and without from_tf). I assume the JAX ones could be loaded somehow eventually; the others seem genuinely problematic.
```python
jax_models = [
    "aqj213/t5-small-pisa-state-only-finetuned",
    "shivam12/t5_small_pubmed",
]

not_working = [
    "SvPolina/t5-small-finetuned-CANARD", "Edwardlzy/t5-small-finetuned-xsum", "Teepika/t5-small-finetuned-xsum",
    "HuggingLeg/t5-small-finetuned-xsum", "V3RX2000/t5-small-finetuned-xsum", "Teepika/t5-small-finetuned-xsum-glcoud",
    "VenkateshE/t5-small-finetuned-xsum", "Wusgnob/t5-small-finetuned-xsum", "HugoZhu/t5-small-finetuned-xsum",
    "Zazik/t5-small-finetuned-xsum", "Paramveer/t5-small-finetuned-xsum", "arkosark/t5-small-finetuned-xsum",
    "RamadasK7/t5-small-finetuned-squad", "bochaowei/t5-small-finetuned-cnn-wei2", "Kyaw/t5-small-finetuned-xsum",
    "ggosline/t5-small-herblables", "Luckyseeker/t5-small-finetuned-xsum", "umarayub/t5-small-finetuned-xsum",
    "yougang/t5-small-finetuned-xsum", "xikoto/t5-small-finetuned-xsum", "vhvk99/t5-small-finetuned-xsum",
    "tsosea/t5-small-finetuned-xsum", "tharik/t5-small-finetuned-xsum",
    "malay-huggingface/t5-small-abstractive-summarization-bahasa-cased", "Alifarsi/t5-small-finetuned-xsum",
    "heejun/t5-small-finetuned-xsum", "MHJ/t5-small-finetuned-xsum", "kroshan/t5-small-finetuned-xsum",
    "E312/t5-small-finetuned-xsum", "knkarthick/t5-small-finetuned-xsum",
]
```
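For the Flax-only repos, something like the following might work (a sketch, untested against these exact repos; the helper name is mine):

```python
def load_flax_as_pytorch(model_id):
    """Attempt to load a Flax-only checkpoint into a PyTorch model.

    from_flax=True asks from_pretrained to convert the Flax weights
    to PyTorch on the fly; both torch and flax must be installed.
    """
    # Import lazily so the helper can be defined without transformers installed.
    from transformers import AutoModelForSeq2SeqLM

    return AutoModelForSeq2SeqLM.from_pretrained(model_id, from_flax=True)
```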