Text Generation using GPT-2

You can use the code below for text generation. Make sure you have torch and transformers installed:

  • pip install torch
  • pip install transformers
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load the pretrained GPT-2 tokenizer and language model
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode the prompt into input token IDs
input_ids = tokenizer.encode("The Manhattan bridge", return_tensors="pt")

# Generate a continuation; do_sample=True is needed so that
# temperature, top_k, and top_p actually take effect
output_sequences = model.generate(
    input_ids=input_ids,
    max_length=150,
    num_return_sequences=1,
    no_repeat_ngram_size=2,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode the generated token IDs back into text
text = tokenizer.decode(output_sequences[0], skip_special_tokens=True)
print(text)

Note: You can fine-tune the generation parameters, especially temperature, to get the best output.
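For example, here is a minimal sketch of a temperature sweep, reusing the model, tokenizer, and input_ids from the code above (the temperature values are just illustrative, pick your own): lower values give more conservative text, higher values give more varied text.

# Minimal sketch: compare a few temperature settings, assuming the
# model, tokenizer, and input_ids defined in the code above.
for temperature in (0.5, 0.7, 1.0):  # example values, adjust to taste
    output = model.generate(
        input_ids=input_ids,
        max_length=60,
        do_sample=True,          # sampling is required for temperature to matter
        temperature=temperature,
        top_k=50,
        top_p=0.95,
        no_repeat_ngram_size=2,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(f"--- temperature={temperature} ---")
    print(tokenizer.decode(output[0], skip_special_tokens=True))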
Hope you at least got an answer, even after 3 years :smile:
I just saw your post and thought I would answer it.