"attention_mask" + `pad_token_id

Hi guys! Can you tell me how to get rid of the warnings below in this code?

I've spent about two days on this already without any success. Even a working example of any conversational pipeline would be welcome.

  1. The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results.
  2. Setting pad_token_id to eos_token_id:50256 for open-end generation.
  3. A decoder-only architecture is being used, but right-padding was detected! For correct generation results, please set padding_side='left' when initializing the tokenizer.

```python
from transformers import pipeline, Conversation

# model and tokenizer are defined earlier (not shown here)
chatbot = pipeline(
    "conversational", model=model, tokenizer=tokenizer,
    max_new_tokens=150
)

conversation = Conversation(text="Going to the movies tonight - any suggestions?")
print(conversation)

print(chatbot(conversation))
```
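
In case it helps, here is a minimal, self-contained sketch of what I understand the warnings to be asking for: a tokenizer initialized with padding_side='left' and an explicit pad_token_id. I'm assuming a GPT-2-style checkpoint (the eos_token_id 50256 in the warning points to a GPT-2 tokenizer) and a transformers version that still ships the conversational pipeline; microsoft/DialoGPT-medium is only a placeholder for my real model. Is this the right direction?

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline, Conversation

# Placeholder checkpoint, used here only so the example is self-contained.
checkpoint = "microsoft/DialoGPT-medium"

# Warning 3 asks for left padding on decoder-only models.
tokenizer = AutoTokenizer.from_pretrained(checkpoint, padding_side="left")
# GPT-2-style tokenizers ship without a pad token, so reuse the EOS token (id 50256).
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(checkpoint)

chatbot = pipeline(
    "conversational",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=150,
    # Forwarded to generate(), like max_new_tokens above; should stop
    # the "Setting pad_token_id to eos_token_id" message.
    pad_token_id=tokenizer.eos_token_id,
)

conversation = Conversation("Going to the movies tonight - any suggestions?")
print(chatbot(conversation))
```

If that is not how the pipeline is meant to be configured, a pointer to a known-good conversational example would already help a lot.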