I’m a newbie to this and a newbie to Python as well. I have no idea what CUDA is; I tried looking it up but couldn’t find a solution.
I tried setting this up in PyCharm on Windows 11. I have already installed TensorFlow 2.0, transformers, and torch using pip, and I have also installed bitsandbytes through pip. I downloaded Llama from Hugging Face and I get this error when I try to run.
The folder path in my code is set this way: bot = Llama3("C:/Users/RAH/PycharmProjects/LLM/meta-llama/")
Could this be a folder path issue or an issue with CUDA? Within the folder "C:/Users/RAH/PycharmProjects/LLM/meta-llama" I have two subfolders: Llama-3.3-70B-Instruct and Meta-Llama-3-8B-Instruct. Should I be pointing to one of those folders in the code instead of the higher-level folder?
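For reference, this is roughly what I think the loading should look like. Llama3 is just my own wrapper class, so as a sketch I'm using plain transformers calls here, pointed at one of the subfolders (my understanding is that from_pretrained wants the folder that directly contains config.json and the weight files, not the parent "meta-llama" folder):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Path to the folder that directly contains config.json and the weights,
# not the parent "meta-llama" folder (this path is just my local layout).
model_dir = "C:/Users/RAH/PycharmProjects/LLM/meta-llama/Meta-Llama-3-8B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_dir)

# No bitsandbytes / quantization arguments here, so this should not need CUDA,
# though loading an 8B model on CPU will use a lot of RAM.
model = AutoModelForCausalLM.from_pretrained(model_dir)
```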
*** Error Message Below ***
File "C:\Users\RAH\PycharmProjects\LLM.venv\Lib\site-packages\transformers\integrations\bitsandbytes.py", line 537, in _validate_bnb_cuda_backend_availability
raise RuntimeError(log_msg)
RuntimeError: CUDA is required but not available for bitsandbytes. Please consider installing the multi-platform enabled version of bitsandbytes, which is currently a work in progress. Please check currently supported platforms and installation instructions at Installation Guide
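In case it helps, this is the quick check I was planning to run to see whether PyTorch can detect a GPU at all (just the standard torch calls, as far as I understand them):

```python
import torch

# bitsandbytes' CUDA backend needs this to be True; on a machine without an
# NVIDIA GPU, or with a CPU-only PyTorch build, it prints False, which would
# explain the RuntimeError above.
print("CUDA available:", torch.cuda.is_available())
print("Torch version:", torch.__version__)
```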