Cannot upload to huggingface space

I’m encountering a problem creating a Space after uploading my code and requirements.txt. To load the model I’m using this import: from ultralyticsplus import YOLO, render_result. (I use Gradio to build the website.)

My goal in uploading to the Space is to get HTML code I can embed in a Google Sites page.

model_path = '(my model path on huggingface)'
model = YOLO(model_path)

If I load the model another way instead of using ultralyticsplus’s YOLO, could that fix this error?

The error said:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/ultralyticsplus/", line 59, in __init__
    self._load_from_hf_hub(model, hf_token=hf_token)
  File "/usr/local/lib/python3.10/site-packages/ultralyticsplus/", line 91, in _load_from_hf_hub
    ) = self._assign_ops_from_task()
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'YOLO' object has no attribute '_assign_ops_from_task'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/", line 95, in <module>
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 518, in __init__
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 851, in render_examples
    self.examples_handler = Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 71, in create_examples
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 298, in create
  File "/usr/local/lib/python3.10/site-packages/gradio_client/", line 889, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/fsspec/", line 103, in sync
    raise return_result
  File "/usr/local/lib/python3.10/site-packages/fsspec/", line 56, in _runner
    result[0] = await coro
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 360, in cache
    prediction = await Context.root_block.process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 1695, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 1235, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/", line 2144, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/", line 851, in run
    result =, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/", line 692, in wrapper
    response = f(*args, **kwargs)
  File "/home/user/app/", line 24, in detect_objects
    model = YOLO(model_path)
  File "/usr/local/lib/python3.10/site-packages/ultralyticsplus/", line 65, in __init__
    raise NotImplementedError(
NotImplementedError: Unable to load model='MvitHYF/v8mvitcocoaseed2024'. As an example try model='' or model='yolov8n.yaml'

I also ran the app in VS Code, and everything worked perfectly on localhost. However, I hit this error when trying to create the Space. I tried adding it to both the model repo and the Space, but nothing changed, even though at first I thought that might fix the error.

Thank you for your help.

For anyone facing the same problem: I found the solution. If you want to deploy your ML model to a Hugging Face Space, you need to upload a requirements.txt. Yes, requirements.txt was the problem. You don’t need to pin the version of each library; for example, instead of gradio==4.22.0 you can just write gradio.
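As a concrete sketch, a minimal unpinned requirements.txt for an app like this might look as follows (the exact package list is an assumption; list whatever your app actually imports):

```text
gradio
ultralyticsplus
ultralytics
```

With no version pins, the Space build resolves compatible versions on its own instead of failing on a conflicting pinned version.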

Hope this helps!