Calling Inference API for text embedding

Up until recently I was able to call the Inference API (e.g. https://api-inference.huggingface.co/models/intfloat/e5-large-v2) on embedding models listed on the MTEB Leaderboard (a Hugging Face Space by mteb).

However, it now seems those models have all been re-tagged as “Sentence Similarity” models, so the API no longer accepts a plain “inputs” query; it expects a “source_sentence” and “sentences” instead.
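
For reference, this is roughly the difference in request shapes I'm seeing (Python with `requests`; the token is a placeholder and the exact nesting of the new payload is my guess from the error fields):

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/intfloat/e5-large-v2"
headers = {"Authorization": "Bearer hf_xxx"}  # placeholder token

# What used to work: a plain "inputs" query returning embedding vectors
old_payload = {"inputs": ["query: how do I get text embeddings?"]}

# What the sentence-similarity pipeline seems to expect now (exact nesting
# is an assumption; it may need to sit under an "inputs" key like this)
new_payload = {
    "inputs": {
        "source_sentence": "query: how do I get text embeddings?",
        "sentences": [
            "passage: embeddings are dense vectors.",
            "passage: something unrelated.",
        ],
    }
}

resp = requests.post(API_URL, headers=headers, json=new_payload)
print(resp.json())  # similarity scores, not embeddings
```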

Any idea why this change happened, and how I can start using the Inference API for text embedding again?

Resolved! See Can one get an embeddings from an inference API that computes Sentence Similarity? - #5 by osanseviero.
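
For anyone landing here later: as I understand the linked answer, the workaround is to call the feature-extraction pipeline endpoint directly, which still returns embeddings even though the model is now tagged as Sentence Similarity. A minimal sketch, assuming the `pipeline/feature-extraction` route and a placeholder token:

```python
import requests

# Route through the feature-extraction pipeline explicitly instead of the
# model's default sentence-similarity pipeline (endpoint taken from the
# linked answer; treat it as an assumption).
API_URL = (
    "https://api-inference.huggingface.co/pipeline/feature-extraction/"
    "intfloat/e5-large-v2"
)
headers = {"Authorization": "Bearer hf_xxx"}  # placeholder token

payload = {"inputs": ["query: how do I get text embeddings?"]}
resp = requests.post(API_URL, headers=headers, json=payload)

embeddings = resp.json()  # one embedding vector per input string
print(len(embeddings), len(embeddings[0]))
```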