If I ask Hugging Face to add a transformer model, what will they do?

I have some ideas to speed up inference for many transformer architectures.

But I am not a machine learning researcher.

If I ask a Hugging Face employee to add some models, what will they do?

(I HAVE ALREADY EMAILED THE AUTHORS OF THE CODE, BUT THEY DON'T ANSWER.)

I am talking about the models:

  1. CompuBERT:

Article: “Ensembling Ten Math Information Retrieval Systems”
(http://ceur-ws.org/Vol-2936/paper-06.pdf)

Code: ARQMath solution by Michal Štefánik (0.264 nDCG') - Google Drive

  2. ALBERT (for math):

Article: “TU_DBS in the ARQMath Lab 2021, CLEF”

  3. ELBERT:

Article: “ELBERT: Fast ALBERT with Confidence-Window Based Early Exit”

  4. LISA (Linear-Time Self Attention):

Article: “Linear-Time Self Attention with Codeword Histogram for Efficient Recommendation”

Note: the code for models 2, 3, and 4 is available on GitHub.
(This site does not allow more than two links per post.)

What should I do to get a Hugging Face employee to add at least two of the models I need?
(For me, $9 is expensive, so I can't afford much more than that…)

Thanks in advance.