Two errors (Xformers not installed correctly; 'RWForCausalLM' not supported) on my first attempt

I followed the installation guide successfully yesterday and got the sentiment-analysis test to work. Then I set myself the objective of following the “How to Get Started” code on this model card (tiiuae/falcon-7b-instruct).
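For context, the snippet I was running is roughly the one from the model card (reproduced from memory, so the card itself is authoritative; the prompt string is my own):

```python
MODEL = "tiiuae/falcon-7b-instruct"

def run_falcon_demo():
    # Imports kept inside the function so the heavy dependencies are only
    # needed when the demo actually runs.
    from transformers import AutoTokenizer
    import transformers
    import torch

    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    pipe = transformers.pipeline(
        "text-generation",
        model=MODEL,
        tokenizer=tokenizer,
        torch_dtype=torch.bfloat16,
        # Needed because RWForCausalLM is custom modeling code shipped
        # in the model repo, not a class built into transformers.
        trust_remote_code=True,
        device_map="auto",
    )
    sequences = pipe(
        "Write a short poem about giraffes.",
        max_length=200,
        do_sample=True,
        top_k=10,
        num_return_sequences=1,
        eos_token_id=tokenizer.eos_token_id,
    )
    for seq in sequences:
        print(seq["generated_text"])

if __name__ == "__main__":
    run_falcon_demo()  # downloads the model shards on first run
```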

It all went fine as far as downloading the shards, but then I got two errors:

Xformers is not installed correctly. If you want to use memory_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.
The model 'RWForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', 'BertGenerationDecoder', 'BigBirdForCausalLM', 'BigBirdPegasusForCausalLM', 'BioGptForCausalLM', 'BlenderbotForCausalLM', 'BlenderbotSmallForCausalLM', 'BloomForCausalLM', 'CamembertForCausalLM', 'CodeGenForCausalLM', 'CpmAntForCausalLM', 'CTRLLMHeadModel', 'Data2VecTextForCausalLM', 'ElectraForCausalLM', 'ErnieForCausalLM', 'GitForCausalLM', 'GPT2LMHeadModel', 'GPT2LMHeadModel', 'GPTBigCodeForCausalLM', 'GPTNeoForCausalLM', 'GPTNeoXForCausalLM', 'GPTNeoXJapaneseForCausalLM', 'GPTJForCausalLM', 'LlamaForCausalLM', 'MarianForCausalLM', 'MBartForCausalLM', 'MegaForCausalLM', 'MegatronBertForCausalLM', 'MvpForCausalLM', 'OpenLlamaForCausalLM', 'OpenAIGPTLMHeadModel', 'OPTForCausalLM', 'PegasusForCausalLM', 'PLBartForCausalLM', 'ProphetNetForCausalLM', 'QDQBertLMHeadModel', 'ReformerModelWithLMHead', 'RemBertForCausalLM', 'RobertaForCausalLM', 'RobertaPreLayerNormForCausalLM', 'RoCBertForCausalLM', 'RoFormerForCausalLM', 'RwkvForCausalLM', 'Speech2Text2ForCausalLM', 'TransfoXLLMHeadModel', 'TrOCRForCausalLM', 'XGLMForCausalLM', 'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM'].

And when I tried pip install xformers, that also failed:

(.env) (base) ?130 hugface % pip install xformers                                                                                                         21:38:28
Collecting xformers
  Downloading xformers-0.0.20.tar.gz (7.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.6/7.6 MB 7.7 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "/Users/jstrout/Data/PyTorch/hugface/.env/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/Users/jstrout/Data/PyTorch/hugface/.env/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/Users/jstrout/Data/PyTorch/hugface/.env/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/private/var/folders/_m/4n2mqlw50pq6wrv48pm2m47w0000gn/T/pip-build-env-d141ztpr/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/private/var/folders/_m/4n2mqlw50pq6wrv48pm2m47w0000gn/T/pip-build-env-d141ztpr/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/private/var/folders/_m/4n2mqlw50pq6wrv48pm2m47w0000gn/T/pip-build-env-d141ztpr/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 487, in run_setup
          super(_BuildMetaLegacyBackend,
        File "/private/var/folders/_m/4n2mqlw50pq6wrv48pm2m47w0000gn/T/pip-build-env-d141ztpr/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

In case it helps, here’s the output of transformers-cli env:

- `transformers` version: 4.29.2
- Platform: macOS-10.15.7-x86_64-i386-64bit
- Python version: 3.8.8
- Huggingface_hub version: 0.14.1
- Safetensors version: not installed
- PyTorch version (GPU?): 2.0.1 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: <not AFAIK>
- Using distributed or parallel set-up in script?: <highly doubt it>

I’m quite new at all this and don’t know what to try next. Any suggestions will be greatly appreciated.


For the xformers issue you can simply run pip install xformers, but I’m more curious about the other error, about RWForCausalLM. I wonder how to fix that.

@penthoy

For the xformers issue you can simply run pip install xformers

No, he/we can’t. Did you see his message? pip install xformers is exactly the command he ran, and running it is what produces the error. :man_shrugging:

Found this thread because I’m facing the same error when I run pip install xformers. I haven’t found a solution yet. Someone suggested installing from the GitHub source,

pip install git+https://github.com/facebookresearch/xformers.git@main#egg=xformers

…but I still get the No module named ‘torch’ error, even though torch is installed and Python can import it.
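One possible explanation (an assumption on my part, not a confirmed fix): pip builds xformers from source in an isolated build environment (PEP 517), and that isolated environment does not see the torch already installed in your venv, which is why the build reports No module named ‘torch’ even though your Python can import it. A workaround that is sometimes suggested is to disable build isolation so the build uses your existing environment:

```shell
# Make sure torch is installed in the active environment first.
pip install torch

# Build xformers without the isolated build environment, so setup.py
# can import the torch you already have.
pip install --no-build-isolation xformers
```

Note that even if this gets the build started, compiling xformers from source on a CPU-only Mac may still fail or provide no speedup, since the package is primarily aimed at CUDA GPUs.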


Did you find a solution to this problem? It started showing up suddenly for me today.
