Embedding Spaces with auth for public models counter-intuitive?

I think I must be missing something. Can someone tell me what?

  1. Embedding Spaces is a great option - but it's only available for public Spaces.
  2. The Gradio instructions even show how to add authentication - awesome!
  3. But since your Space is now public (because of 1), you have to remember to put your auth credentials in a secret; otherwise anyone can simply read them in your code (see the sketch after this list).
  4. Also, you have to make your Space public to use the Embed option, even if you weren't ready to do so.
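For point 3, here is a minimal sketch of keeping the credentials out of the public code, assuming two hypothetical Space secrets named GRADIO_USERNAME and GRADIO_PASSWORD configured in the Space settings:

```python
import os

import gradio as gr

def greet(name):
    return f"Hello, {name}!"

demo = gr.Interface(fn=greet, inputs="text", outputs="text")

# Pull the credentials from Space secrets instead of hard-coding them,
# so they never appear in the public repository. The secret names are
# hypothetical; use whatever names you set in the Space settings.
demo.launch(auth=(os.environ["GRADIO_USERNAME"], os.environ["GRADIO_PASSWORD"]))
```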

My take: one of the clear use cases for "Embed your space" is to share the app without sharing the code (e.g., a prototype for something that is in stealth), but by only allowing that option for public Spaces, this use case won't fly.

Am I missing something? Is there a better way to do this?


For anyone else reading this: I couldn't find a good solution to this for my situation, so I ended up running my Gradio application on one of the well-known PaaS providers. It works quite well with GitHub integration; I just had to set a few environment variables.

I do want to stress that I'm still a huge fan of Hugging Face, and it may not make sense for the platform to support the "stealth" scenario I describe above.


Hello @EtheE, thank you for your feedback. We have received similar requests in the past and might implement a solution in the future. In the meantime, here are the workarounds we have previously suggested, depending on the specific use case:

  • To hide sensitive source code in a Space, you can create a private Space and a second, public Space that only loads the private one via the `gr.Interface.load()` method. This method can load a Space even if it is private; see the Gradio Docs for more (a sketch follows after this list).

  • To hide specific files only, you can create a private dataset/model repo and use the `huggingface_hub` client to load it with a Hub token set via the Space settings (see the second sketch after this list).

  • You can also use our Inference Endpoints PaaS service, which is fully integrated with HF tools and our Hub. This service provides a way to deploy your model behind a REST API (a third sketch follows below). You can learn more about it here: Inference Endpoints - Hugging Face
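To illustrate the first workaround, here is a minimal sketch of a public wrapper Space that loads a private Space. The repo name and the HF_TOKEN secret are hypothetical, and the keyword for passing the token has varied across Gradio versions (api_key on older gr.Interface.load, hf_token on the newer gr.load), so check the docs for your version:

```python
import os

import gradio as gr

# Load the UI and inference logic from the private Space; this wrapper Space
# stays public (and embeddable) while the real code stays private.
# "your-username/private-space" and the HF_TOKEN secret are placeholders.
demo = gr.Interface.load(
    "spaces/your-username/private-space",
    api_key=os.environ["HF_TOKEN"],  # token with read access to the private Space
)

demo.launch()
```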
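For the second workaround, here is a sketch of fetching a file from a private repo at startup, assuming a hypothetical HF_TOKEN secret and placeholder repo/file names (older huggingface_hub releases name the argument use_auth_token instead of token):

```python
import os

from huggingface_hub import hf_hub_download

# Download a file from a private model repo using the token stored as a
# Space secret; nothing sensitive appears in the public Space code.
weights_path = hf_hub_download(
    repo_id="your-username/private-weights",  # placeholder private repo
    filename="model.safetensors",             # placeholder file name
    token=os.environ["HF_TOKEN"],             # hypothetical secret name
)
print(weights_path)
```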
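And for the third option, once an Inference Endpoint is running, querying it is just an authenticated HTTP request. The endpoint URL and the HF_TOKEN variable below are placeholders, and the exact payload depends on the task the endpoint serves:

```python
import os

import requests

# Hypothetical endpoint URL; copy the real one from the endpoint's overview page.
API_URL = "https://your-endpoint.endpoints.huggingface.cloud"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(API_URL, headers=headers, json={"inputs": "Hello!"})
print(response.json())
```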

Thanks @radames! Even though I searched, I hadn't come across that first answer. While it introduces a bit of overhead, it's definitely a good option and something I'll give a try in the near future.


Thanks for the feedback; I'll update our Spaces docs.