gpt-neo-2.7B isn't working with pipeline

I’m getting a basic error when I try to access GPT-Neo-2.7B via a pipeline. Working from my local machine, I follow the exact steps shown on the relevant model card page, namely:

from transformers import pipeline
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')

However, when I do so I get the following error:

OSError: Can't load config for 'EleutherAI/gpt-neo-2.7B'. Make sure that:

- 'EleutherAI/gpt-neo-2.7B' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'EleutherAI/gpt-neo-2.7B' is the correct path to a directory containing a config.json file

Can anyone advise what’s happening here? Thanks.

I think I had the same problem. I went to the Hugging Face site, copied the identifier straight from the model card, and pasted it into my notebook, and then it worked. Even though the spelling looked correct, copy-pasting fixed it; I’m not sure why, but my guess is that a stray invisible character or a curly quote had crept into the string I originally typed.
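
If anyone wants to rule that out explicitly, here's a rough sketch of the kind of check I mean. The identifier is the one from the model card above; the rest is just illustrative diagnostics I'd try, not anything prescribed by the transformers docs.

from transformers import AutoConfig

model_id = 'EleutherAI/gpt-neo-2.7B'  # paste this straight from the model card

# Print the raw characters: curly quotes, non-breaking hyphens, or stray
# whitespace show up here even when the string looks correct on screen.
print(repr(model_id))
print([c for c in model_id if ord(c) > 127])  # should be an empty list

# Try resolving just the config; if this succeeds, the identifier (and your
# connection to the Hub) are fine, and the pipeline call should work too.
config = AutoConfig.from_pretrained(model_id)
print(config.model_type)

If the config loads here but the pipeline still fails, the problem is probably elsewhere (e.g. an outdated transformers version that doesn't yet know about GPT-Neo), but in my case the copied identifier was all it took.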