LLAMA-2 Multi-Node

Will LLAMA-2 benefit from using multiple nodes (each with one GPU) for inference?
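For context on why this is uncertain: splitting the model across nodes (tensor-parallel style) means a cross-node all-reduce after roughly every attention and MLP block, so the benefit likely hinges on the interconnect. The sketch below (not official LLAMA-2 code; the script name, hostnames, and ports are placeholders) shows one way to measure that per-call latency with plain `torch.distributed`, assuming it is launched with `torchrun --nnodes=2 --nproc_per_node=1` on each node:

```python
# bench_allreduce.py -- hypothetical benchmark, not part of the LLAMA-2 repo.
# Launch on every node, e.g.:
#   torchrun --nnodes=2 --nproc_per_node=1 --node_rank=<0|1> \
#            --rdzv_backend=c10d --rdzv_endpoint=<node0-host>:29500 bench_allreduce.py
import time

import torch
import torch.distributed as dist


def main() -> None:
    # torchrun sets RANK/WORLD_SIZE/MASTER_ADDR, so env:// init works out of the box.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    torch.cuda.set_device(0)  # one GPU per node in this scenario

    hidden = 4096             # LLaMA-2-7B hidden size
    batch, seq = 1, 1         # a single decode step: one token at a time
    x = torch.randn(batch, seq, hidden, device="cuda", dtype=torch.float16)

    # Warm up, then time the collective that tensor-parallel inference
    # would issue roughly twice per transformer layer (attention + MLP).
    for _ in range(10):
        dist.all_reduce(x)
    torch.cuda.synchronize()

    iters = 100
    start = time.perf_counter()
    for _ in range(iters):
        dist.all_reduce(x)
    torch.cuda.synchronize()
    per_call_ms = (time.perf_counter() - start) / iters * 1e3

    if rank == 0:
        layers = 32           # LLaMA-2-7B layer count
        print(f"all_reduce latency: {per_call_ms:.3f} ms")
        print(f"estimated comm overhead per generated token: "
              f"{per_call_ms * 2 * layers:.1f} ms")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

As a rough sanity check: with 32 layers and two all-reduces per layer, even 1 ms of inter-node latency would add around 64 ms per generated token, so whether multiple single-GPU nodes help presumably depends on how fast the network between them is.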

Are there any examples of running LLAMA-2 inference across multiple nodes?