When I want to use tensor parallelism during model inference, I find that parallelism is only supported for training. How can I customize tensor parallelism for inference?
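For context, what I mean by tensor parallelism is sharding a layer's weight matrix across devices and combining the partial results. A minimal sketch of the math, simulated on CPU with numpy (no real GPUs or framework APIs involved, just the column- and row-parallel matmul identities):

```python
import numpy as np

# Simulate tensor parallelism on CPU: shard a linear layer's weight
# matrix across two "devices" and check the combined result matches
# the unsharded matmul.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))      # batch of activations
W = rng.standard_normal((8, 16))     # full weight matrix

# Column-parallel split: each device holds half the output columns;
# the shard outputs are concatenated.
W0, W1 = np.split(W, 2, axis=1)
y_col = np.concatenate([x @ W0, x @ W1], axis=1)

# Row-parallel split: each device holds half the input features;
# the partial products are summed (an all-reduce on real hardware).
x0, x1 = np.split(x, 2, axis=1)
R0, R1 = np.split(W, 2, axis=0)
y_row = x0 @ R0 + x1 @ R1

y_full = x @ W
assert np.allclose(y_col, y_full)
assert np.allclose(y_row, y_full)
print("sharded outputs match full matmul")
```

This is what I would like to apply per-layer at inference time, with each shard living on a different GPU.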