Distributed GPU training not working (multi-node)

If you are running multi-node training, each node needs its own config file: the file on the main node sets rank 0, the file on the second node sets rank 1, and so on for additional nodes. All nodes must also agree on the address and port of the main (rank-0) process so they can rendezvous.
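As a concrete sketch, here is what the per-node files might look like assuming a Hugging Face Accelerate-style config (the original does not name the framework, so treat the keys and values as an assumption — adapt them to whatever launcher you actually use). Only `machine_rank` differs between the two files; everything else is identical on both nodes:

```yaml
# Hypothetical Accelerate config for node 0 (the main node).
# On node 1, change machine_rank to 1 and keep all other values the same.
compute_environment: LOCAL_MACHINE
distributed_type: MULTI_GPU
machine_rank: 0              # 0 on the main node, 1 on the second node
main_process_ip: 10.0.0.1    # placeholder: reachable IP of the rank-0 node
main_process_port: 29500     # placeholder: any free port, same on all nodes
num_machines: 2
num_processes: 4             # total processes across ALL nodes, not per node
```

A common failure mode is giving both nodes rank 0 (they each wait to be the main process) or setting `num_processes` to the per-node GPU count instead of the cluster-wide total, so it is worth double-checking those two fields first.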