Getting an error in AutoTrain

Is anyone else getting this error? I'm new to this and set everything up in PyCharm on Windows 11. I've already installed TensorFlow 2.0, transformers, and torch using pip. I downloaded Llama from Hugging Face, and I get this error when I try to run.

The folder path in my code is set this way: bot = Llama3("C:/Users/RAH/PycharmProjects/LLM/meta-llama/")

Could this be a folder path issue or a CUDA issue? Inside the folder "C:/Users/RAH/PycharmProjects/LLM/meta-llama" I have two subfolders: Llama-3.3-70B-Instruct and Meta-Llama-3-8B-Instruct. Should I be pointing to one of those folders in the code rather than the higher-level folder?

*** Error Message Below ***
File "C:\Users\RAH\PycharmProjects\LLM.venv\Lib\site-packages\transformers\integrations\bitsandbytes.py", line 537, in _validate_bnb_cuda_backend_availability
raise RuntimeError(log_msg)
RuntimeError: CUDA is required but not available for bitsandbytes. Please consider installing the multi-platform enabled version of bitsandbytes, which is currently a work in progress. Please check currently supported platforms and installation instructions at Installation Guide
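For what it's worth, the message suggests bitsandbytes can't find a CUDA device, so I also ran a small diagnostic to see whether torch itself can see a GPU (this is just a sketch; the function name is mine):

```python
import importlib.util

def cuda_status():
    """Report whether torch is installed and whether it sees a CUDA GPU.
    The default bitsandbytes build needs torch.cuda.is_available() to
    return True, which is what this error seems to be complaining about."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch  # imported lazily so the check works even without torch
    return "CUDA available" if torch.cuda.is_available() else "CPU only"

print(cuda_status())
```

On my machine this prints "CPU only", which makes me suspect the CUDA side rather than the folder path.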