Calling Inference API for text embedding

Until recently I was able to call the Inference API (e.g. /static-proxy?url=https%3A%2F%2Fapi-inference.huggingface.co%2Fmodels%2Fintfloat%2Fe5-large-v2) on embedding models listed on the MTEB Leaderboard (a Hugging Face Space by mteb).

However, it now seems all those models have been retagged as “Sentence Similarity” models, so the API no longer accepts a plain `inputs` string. It expects a `source_sentence` and a list of `sentences`.
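To make the difference concrete, here is a sketch of the two request shapes as I understand them; the model name and example sentences are illustrative, and the actual HTTP call is shown commented out since it needs an API token:

```python
import json

MODEL = "intfloat/e5-large-v2"  # example model from the MTEB leaderboard

# Old feature-extraction-style payload: a plain "inputs" string.
embedding_payload = {"inputs": "query: how do I get text embeddings?"}

# New sentence-similarity-style payload: the pipeline expects one
# source_sentence plus a list of sentences to score it against.
similarity_payload = {
    "inputs": {
        "source_sentence": "How do I get text embeddings?",
        "sentences": [
            "Embeddings map text to vectors.",
            "The weather is nice today.",
        ],
    }
}

# A real call would look roughly like this (API_TOKEN is a placeholder):
# import requests
# r = requests.post(
#     f"/static-proxy?url=https%3A%2F%2Fapi-inference.huggingface.co%2Fmodels%2F{MODEL}",
#     headers={"Authorization": f"Bearer {API_TOKEN}"},
#     json=similarity_payload,
# )

print(json.dumps(similarity_payload, indent=2))
```

With the similarity payload the API returns similarity scores, one per entry in `sentences`, rather than the embedding vectors themselves.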

Any idea why this change happened? And how can I start using the Inference API for text embedding again?

Resolved! See “Can one get an embeddings from an inference API that computes Sentence Similarity?”, post #5 by osanseviero.