Parallel HPO when using `trainer.hyperparameter_search()`

One of the features of optuna is its support for asynchronous parallelization of trials across multiple devices (see its documentation). But in my experience with trainer.hyperparameter_search(), the trials seem to be executed one after another, so even though I have access to multiple devices, I cannot leverage them for parallel HPO.
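For reference, this is roughly how I am invoking the search; the model name, training arguments, datasets, and search space below are placeholders for my actual setup:

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

def model_init(trial):
    # A fresh model is instantiated for every trial.
    return AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

def optuna_hp_space(trial):
    # Search space handed to hyperparameter_search via hp_space.
    return {
        "learning_rate": trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True),
        "per_device_train_batch_size": trial.suggest_categorical(
            "per_device_train_batch_size", [16, 32, 64]
        ),
    }

training_args = TrainingArguments(output_dir="hpo_output", evaluation_strategy="epoch")

trainer = Trainer(
    model_init=model_init,        # model_init (not model) is required for HPO
    args=training_args,
    train_dataset=train_dataset,  # placeholder: my tokenized train split
    eval_dataset=eval_dataset,    # placeholder: my tokenized eval split
)

# As far as I can tell, these trials run one after another on a single device.
best_trial = trainer.hyperparameter_search(
    hp_space=optuna_hp_space,
    backend="optuna",
    direction="minimize",
    n_trials=20,
)
```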

I am wondering whether it is possible to use optuna's parallelization feature in this setting. Any input is appreciated.
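For context, this is the kind of multi-process pattern I mean, based on my reading of the optuna docs: each worker process attaches to the same study through a shared storage backend and claims trials asynchronously. The sqlite path, study name, and toy objective are just placeholders:

```python
import optuna

def objective(trial):
    # Toy objective; in my case this would wrap a full Trainer training run.
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

# Each worker process runs this same script; load_if_exists lets them all
# attach to one shared study, so trials are distributed across processes.
study = optuna.create_study(
    study_name="parallel-hpo",
    storage="sqlite:///hpo.db",
    load_if_exists=True,
    direction="minimize",
)
study.optimize(objective, n_trials=25)
```

Launching that script from several processes (for example, one per GPU) runs trials concurrently, and that is the behavior I would like to get when the objective is driven by trainer.hyperparameter_search().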