Hi! I’m creating a Scrimba tutorial on how students can find models that work with the free Inference API tier.
This is my current function:
async function isModelInferenceEnabled(modelName) {
  const response = await fetch(`https://api-inference.huggingface.co/status/${modelName}`);
  // Guard against non-2xx responses (e.g. an unknown model) before parsing the body.
  if (!response.ok) return false;
  const data = await response.json();
  return data.state === "Loadable";
}
This correctly filters out models for which inference is unavailable or turned off. However, I still ran into a case where this function returned true, but the model required a Pro subscription anyway.
Example where inference works:
https://api-inference.huggingface.co/status/espnet/kan-bayashi_ljspeech_vits
{"loaded":false,"state":"Loadable","compute_type":"cpu","framework":"espnet"}
Example where inference doesn’t work:
https://api-inference.huggingface.co/status/suno/bark
{"loaded":false,"state":"Loadable","compute_type":"gpu","framework":"transformers"}
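One workaround I’m considering, since the status endpoint doesn’t seem to expose this: actually send a tiny request to the inference endpoint and look at the HTTP status code. This is a sketch built on unconfirmed assumptions — that 200 means the model ran on my (free) token, 503 means it’s merely still loading, and a 4xx (e.g. 403) signals restricted access such as a Pro requirement. The helper names and the "test" payload are mine, not from any docs:

```javascript
// Hypothetical mapping from HTTP status to "usable on the free tier".
// ASSUMPTION: 200 = ran, 503 = still loading (but will run), other = restricted.
function looksFreeTierUsable(status) {
  return status === 200 || status === 503;
}

const API_TOKEN = "hf_..."; // your free-tier token (placeholder)

async function probeModel(modelName) {
  // Send a minimal request and inspect the status rather than the body.
  const response = await fetch(
    `https://api-inference.huggingface.co/models/${modelName}`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${API_TOKEN}` },
      body: JSON.stringify({ inputs: "test" }),
    }
  );
  return looksFreeTierUsable(response.status);
}
```

This costs one real inference call per model, though, so it’s slow and wasteful for scanning many models — which is why a metadata-based check would be much nicer.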
Is there any way to determine if a model requires a Pro subscription?
Thank you!