Pipeline inference with multiple GPUs

Hello, with the pipeline object, is it possible to run inference on my 2 GPUs at the same time?

What I would like is something like:

    out = pipe(
        input,
        batch_size=batch_size,
        n_gpus=2,  # <- Is there an equivalent to this argument?
    )
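For context, the closest thing I have found is a manual workaround along these lines (a rough, untested sketch; the task name and inputs are just placeholders): one pipeline per device, with the inputs split by hand. It also still processes the two halves one after the other unless I add my own multiprocessing, which is what I would like to avoid.

    from transformers import pipeline

    # Placeholder inputs and batch size, just for illustration.
    inputs = ["first example sentence", "second example sentence",
              "third example sentence", "fourth example sentence"]
    batch_size = 2

    # One pipeline per GPU: device 0 and device 1.
    pipe_0 = pipeline("text-classification", device=0)
    pipe_1 = pipeline("text-classification", device=1)

    # Split the inputs in half and run each half on its own GPU
    # (sequentially here, unless wrapped in separate processes).
    half = len(inputs) // 2
    out_0 = pipe_0(inputs[:half], batch_size=batch_size)
    out_1 = pipe_1(inputs[half:], batch_size=batch_size)
    out = out_0 + out_1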