How do I disable gradient syncing between workers in distributed training when using the Trainer?

I am exploring async training and want customized behavior where gradients are not synced automatically between workers. Is this possible under the Trainer infra?
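For reference, here is a minimal sketch (not Trainer code) of the behavior I'm after, using plain PyTorch DDP: the `no_sync()` context manager makes `backward()` skip the gradient all-reduce, so each worker keeps purely local gradients. Names like `ddp_model`, `optimizer`, and `batches` are placeholders.

```python
import torch
from torch.nn.parallel import DistributedDataParallel as DDP

def local_grad_step(ddp_model: DDP, optimizer: torch.optim.Optimizer, batches):
    # Run every backward pass inside no_sync() so gradients are never
    # all-reduced; each worker accumulates only its own gradients.
    for batch in batches:
        with ddp_model.no_sync():           # disable gradient all-reduce
            loss = ddp_model(batch).sum()   # placeholder loss computation
            loss.backward()                 # grads stay on this worker
    optimizer.step()                        # update from local grads only
    optimizer.zero_grad()
```

What I can't figure out is whether the Trainer exposes a hook where I could apply something like this, or whether I'd have to override its training step entirely.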