Unable to use Hugging Face Inference API for freshcodestech/LingoSpace (widget active)

Hi Hugging Face Support / Team,

I’m unable to run inference against my model freshcodestech/LingoSpace using the Inference API. The model repo and its widget are active in my account, but requests return an error (see details below). I created this model from the base model stabilityai/stable-diffusion-xl-base-1.0.

Thanks in advance — I appreciate any guidance to get this model serving properly.

I can confirm this. I’m also getting a 500 error. @michellehbn

from huggingface_hub import InferenceClient

HF_TOKEN = "hf_***my_read_token***"  # read-scoped access token

# Route requests through the fal-ai provider
client = InferenceClient(
    provider="fal-ai",
    api_key=HF_TOKEN,
)

# This call fails with a 500 from the router
image = client.text_to_image(
    "Astronaut riding a horse",
    model="freshcodestech/LingoSpace",
)
# huggingface_hub.errors.HfHubHTTPError: 500 Server Error: Internal Server Error for url: https://router.huggingface.co/fal-ai/fal-ai/fast-sdxl (Request ID: Root=1-68b81183-661eb210474151ba1d4ab09b;43ffed08-93ce-43cc-afd7-7fa1240c65b6)
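To help narrow this down, here is a minimal sketch (my assumption, not an official debugging step) that sends the identical request to the base model stabilityai/stable-diffusion-xl-base-1.0 with the same token and provider. If this call succeeds, the token and fal-ai routing are working and the 500 looks specific to freshcodestech/LingoSpace; if it also fails, the problem is more likely on the provider side. The output filename is just for illustration.

from huggingface_hub import InferenceClient

HF_TOKEN = "hf_***my_read_token***"  # same read token as above

client = InferenceClient(
    provider="fal-ai",
    api_key=HF_TOKEN,
)

# Same prompt, but against the base model the custom repo was built from
image = client.text_to_image(
    "Astronaut riding a horse",
    model="stabilityai/stable-diffusion-xl-base-1.0",
)
image.save("sdxl_base_check.png")  # text_to_image returns a PIL image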