I am facing the same issue, but with LLMs: I am running inference on a Llama 2 model using DataLoaders, and the process gets stuck in a 2-GPU setup.
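To make clearer what kind of setup I mean, here is a minimal sketch (assuming a torchrun/DDP-style launch on 2 GPUs and a Hugging Face Llama 2 checkpoint; the model name, batch size, and dummy prompts below are placeholders, not my actual script):

```python
import os
import torch
import torch.distributed as dist
from torch.utils.data import DataLoader, DistributedSampler
from transformers import AutoModelForCausalLM, AutoTokenizer


def main():
    # torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.float16
    ).to(local_rank).eval()

    prompts = [f"Question {i}: explain beam search." for i in range(10)]  # dummy data

    # DistributedSampler splits the prompts across the 2 ranks. drop_last=True keeps
    # the number of batches equal on every rank; uneven batch counts are a common
    # cause of hangs when any collective call is involved.
    sampler = DistributedSampler(prompts, shuffle=False, drop_last=True)
    loader = DataLoader(prompts, batch_size=2, sampler=sampler)

    with torch.no_grad():
        for batch in loader:
            inputs = tokenizer(list(batch), return_tensors="pt", padding=True).to(local_rank)
            outputs = model.generate(**inputs, max_new_tokens=32)
            if local_rank == 0:
                print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

    dist.barrier()
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=2 infer.py`, the hang shows up partway through the loop over the DataLoader.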