Loading tied weights from safetensors appears to be broken

I trained a model in which certain weights are tied and pushed it to the Hub. Because the tied tensors share storage, only one copy of each gets saved into the safetensors file. However, when I load the model with AutoModel.from_pretrained, the weights are no longer equal and no longer tied. What is the best practice here?
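For reference, this is roughly how I'm checking it. The two parameter names below are placeholders, not the actual Caduceus parameter names; substitute the pair that was tied at training time:

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "kuleshov-group/caduceus-ps_seqlen-131k_d_model-256_n_layer-16",
    trust_remote_code=True,
)

params = dict(model.named_parameters())
# Placeholder names: use the two parameters that were tied during training.
a = params["submodule_a.weight"]
b = params["submodule_b.weight"]

print(torch.equal(a, b))             # comes back False, though they were tied
print(a.data_ptr() == b.data_ptr())  # also False: they no longer share storage
```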

The model in question is this one: kuleshov-group/caduceus-ps_seqlen-131k_d_model-256_n_layer-16 · Hugging Face
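In case it helps, my understanding is that transformers is supposed to restore tying by calling tie_weights() after loading the state dict, and that _tied_weights_keys tells from_pretrained which keys are intentionally absent from the checkpoint. A minimal sketch of that pattern as I understand it (the class, module names, and tied key are all placeholders, not the actual Caduceus remote code):

```python
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel


class TiedConfig(PretrainedConfig):
    model_type = "tied-demo"  # placeholder model type


class TiedModel(PreTrainedModel):
    config_class = TiedConfig
    # Checkpoint keys that are tied duplicates; from_pretrained treats
    # them as expected-missing instead of warning about them.
    _tied_weights_keys = ["decoder.weight"]

    def __init__(self, config):
        super().__init__(config)
        self.encoder = nn.Linear(16, 16, bias=False)
        self.decoder = nn.Linear(16, 16, bias=False)
        self.post_init()  # runs weight init and tie_weights()

    def tie_weights(self):
        # Called again by from_pretrained after the state dict is loaded;
        # re-points decoder.weight at the same Parameter as encoder.weight.
        self.decoder.weight = self.encoder.weight

    def forward(self, x):
        return self.decoder(self.encoder(x))
```

With that in place, save_pretrained should drop the duplicate key from the safetensors file and from_pretrained should re-tie on load, so both names point at one tensor again. My question is whether the model's remote code is missing one of these pieces, or whether there is a different recommended approach.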
