How to change the batch size of a pretrained model?

Hey friends! Just a quick question:

Given a pretrained model, can I change the batch size?


from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

Can't find any documentation on how to do this, thanks!
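As far as I know, the batch size isn't stored in a pretrained checkpoint at all; it's determined by the shape of the input tensors you pass to the model, so you can pick any batch size at call time. A minimal sketch, assuming the same gpt2 snippet above (and reusing the EOS token for padding, since gpt2 ships without a pad token):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# gpt2 has no pad token by default; reuse EOS for padding (an assumption for this sketch)
tokenizer.pad_token = tokenizer.eos_token

# A batch of 3 prompts -> input_ids has shape (3, seq_len);
# the leading dimension IS the batch size, chosen here, not by the checkpoint
batch = tokenizer(["hello", "how are you", "hi there"], return_tensors="pt", padding=True)
outputs = model(**batch)

print(outputs.logits.shape)  # (batch_size, seq_len, vocab_size)
```

The same model instance accepts a batch of 1, 3, or 64 without any reconfiguration; only memory limits the batch size you choose.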

Hi @alpyne,

Did you find an answer for this?