Looking for help with GPT-2 code

Hi, I’m not sure what’s going on here… can anyone help me?

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

text = "The man worked as a"
inputs = tokenizer(text, return_tensors="pt")
prediction = model(**inputs)
# Take the highest-scoring token at each position of the sequence
predicted_token_ids = prediction["logits"].argmax(dim=-1)

# Decode the token IDs into text
predicted_text = tokenizer.decode(predicted_token_ids[0])

print("Predicted text:")
print(f"{text} {predicted_text}")

My output is:

Predicted text:
The man worked as a
who for a security
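I was expecting a continuation of the prompt, so I don’t understand where these words come from or why they don’t read as one sentence. To see what the argmax is actually selecting, I was going to line up each input token with the top-scoring token at that position, something like the loop below (just a sketch reusing the variables from the script above, I haven’t run it yet). Is that the right way to read the logits?

# Pair each input token with the highest-scoring token predicted at that position
for inp_id, pred_id in zip(inputs["input_ids"][0].tolist(), predicted_token_ids[0].tolist()):
    print(repr(tokenizer.decode([inp_id])), "->", repr(tokenizer.decode([pred_id])))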
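What I ultimately want is for the model to continue the sentence. I assume the usual way is generate, roughly like the sketch below (untested, and I’m guessing at max_new_tokens and pad_token_id from the docs), but I’d still like to understand why the argmax version above prints what it does.

# Greedy continuation of the prompt (what I thought my script would give me)
output_ids = model.generate(
    **inputs,
    max_new_tokens=10,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token, so reuse EOS here
)
print(tokenizer.decode(output_ids[0]))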