New: Distributed GPU Platform

  1. usually RunPod, Lambda, or whoever else

  2. cost

  3. i train large-scale open-source (and closed-source) models for general performance in generative tasks, usually LLMs

  4. i am the drain by which the compute falls, the final destroyer of water (doing my best to fix that though!)

  5. if you mean trainers here, we have our own that we've built (Axolotl, OpenChat)

  6. we make it ourselves, usually, and store it on Hugging Face!
