I'm getting this message even from the model page:
Failed to perform inference: an occurred while streaming the response: Model doesn’t exist: google/medgemma-4b-it
What should I do to solve this problem?
I reported it to Discord for now.
I'm having the same problem. I thought it was just me, but I just realized it seems to be a general problem. Is there any news about this error?
Not yet.
Sorry, I'm new here. Is there another way to report this error?
Email is the official channel, but there are multiple email addresses and I can't confirm which one is for bug reports. Additionally, the issue section of the hub_docs GitHub repository can also be used for general Hub error reports. website@huggingface.co, feedback@huggingface.co, …
Thanks for reporting! We’re taking a look and I’ll update soon.
Thanks always, Megan!
Hi everyone, thanks for waiting! This should now be working as expected, though let us know if that's not the case. Have a great day!
Thank you!