How to set the batch size for inference

Hi @allenwang37,
I'm not sure if this answers your question:

If you have multiple GPUs:
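
A minimal sketch, assuming the question is about the transformers `pipeline` API; the task, model name, and `batch_size` value are illustrative. `batch_size` controls how many inputs go through the model per forward pass, and `device_map="auto"` (which requires accelerate to be installed) spreads the model weights across the available GPUs:

```python
from transformers import pipeline

# batch_size sets how many inputs are processed together per forward pass;
# device_map="auto" shards the model across all visible GPUs (needs accelerate).
pipe = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # illustrative model
    device_map="auto",
    batch_size=8,  # illustrative value; tune for your GPU memory
)

texts = ["I love this!", "This is terrible.", "Not bad at all."]
results = pipe(texts)
print(results)
```

Larger batch sizes generally improve throughput until you run out of GPU memory, so it's worth tuning the value for your hardware and input lengths.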
