Unable to use ZeroGPU/T4 in Spaces with a TensorFlow model

I am able to run inference on Spaces using the CPU. However, when I try ZeroGPU/T4, the GPU is not detected and I get CUDA initialization errors. How do I fix this so that I can use the GPU?

import spaces  # Import this at the very top of the file, before anything that touches CUDA.

@spaces.GPU(duration=60)
def infer():
    inference()  # your actual inference call

Specifying the decorator in this way grants GPU access only from within the decorated function.
The key is to keep the time spent inside this function as short as possible.
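For example, a minimal sketch of that pattern (using PyTorch here; torch.nn.Linear and the names are just illustrative stand-ins for a real model): do the heavy, one-time setup at module level, and keep only the short forward pass inside the decorated function.

import spaces
import torch

# One-time setup stays at module level and runs on the CPU at startup.
model = torch.nn.Linear(4, 2)  # stand-in for a real model

@spaces.GPU(duration=60)
def infer(x):
    # The GPU is only attached while this function runs, so keep it brief:
    # move to the device, run the forward pass, bring the result back.
    model.to('cuda')
    with torch.no_grad():
        return model(x.to('cuda')).cpu()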

However, ZeroGPU currently has several technical problems. Some of them have been fixed, but simply doing things the usual way may still not work.
In such cases, you can open a Discussion in the following community for support.

Hi John,
I've added the decorator as per your answer, but it didn't really work. I will take a look at the community in your link. Aside from that, I've also tried running the inference on the T4 GPU, but it still could not detect the GPU when I run tf.config.list_physical_devices('GPU').
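For reference, a minimal sketch of the check I mean (check_gpu is just an illustrative name; the real app would call the model here):

import spaces
import tensorflow as tf

@spaces.GPU(duration=60)
def check_gpu():
    # Query TensorFlow's visible devices while the GPU should be attached.
    print(tf.config.list_physical_devices('GPU'))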

Hi.
I don't know whether this is a bug or intended behavior, since I am actually new to HF, but detection of CUDA devices by torch.cuda.is_available() only returns True in the global scope or in functions with the @spaces.GPU decorator.
On the other hand, it rarely fails to return True under those conditions, and if it did, I think that would be a rare bug indeed.
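A minimal sketch of what I mean, with the check in both places (the print labels are just for illustration):

import spaces
import torch

# Checked at global scope, when the Space starts up.
print('global scope:', torch.cuda.is_available())

@spaces.GPU(duration=60)
def check():
    # Checked inside a @spaces.GPU function, where the GPU is attached.
    print('decorated function:', torch.cuda.is_available())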

Oh, TensorFlow. I've never used it directly, so I don't know all the details of how it works.