Easily build your own MoE LLM!

With mergoo, you can easily build your own MoE LLM by combining the knowledge of multiple open-source expert LLMs, as sketched below.
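The merge step composes several expert checkpoints into a single MoE-style checkpoint. Here is a minimal sketch of that step using `ComposeExperts`; the config values, expert model IDs, and output path are illustrative assumptions, so check the mergoo repository for the exact, up-to-date API.

```python
import torch
from mergoo.compose_experts import ComposeExperts

# Illustrative config: the base and expert checkpoints below are placeholders
# that you would swap for your own domain experts.
config = {
    "model_type": "mistral",
    "num_experts_per_tok": 2,
    "experts": [
        {"expert_name": "base_expert", "model_id": "mistralai/Mistral-7B-v0.1"},
        {"expert_name": "expert_math", "model_id": "meta-math/MetaMath-Mistral-7B"},
        {"expert_name": "expert_code", "model_id": "ajibawa-2023/Code-Mistral-7B"},
    ],
    # Layers converted into routed (MoE) layers; this choice is an assumption.
    "router_layers": ["gate_proj", "up_proj", "down_proj"],
}

# Merge the expert weights and write out one MoE-style checkpoint.
merger = ComposeExperts(config, torch_dtype=torch.float16)
merger.compose()
merger.save_checkpoint("data/mistral_moe")
```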

:rocket: In mergoo:

  • Supports Mixture-of-Experts, Mixture-of-Adapters (new feature), and layer-wise merging
  • Efficiently train your MoE-style merged LLM without starting from scratch (see the training sketch after this list)
  • Compatible with Hugging Face :hugs: models and Trainers
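For the training step, a common recipe is to load the merged checkpoint, fine-tune only the newly added routing (gating) parameters while keeping the expert weights frozen, and hand the model to a standard Hugging Face Trainer. The sketch below assumes mergoo exposes the merged model through a class in `mergoo.models`; the `".gate."` parameter-name filter and the dataset are placeholders, not confirmed library details.

```python
from transformers import Trainer, TrainingArguments
from mergoo.models.modeling_mistral import MistralForCausalLM

# Load the MoE-style checkpoint produced by the merge step above.
model = MistralForCausalLM.from_pretrained("data/mistral_moe")

# Freeze the expert weights and train only the router (gating) parameters.
# The ".gate." substring is an assumption about mergoo's parameter naming.
for name, param in model.named_parameters():
    param.requires_grad = ".gate." in name

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="checkpoints", per_device_train_batch_size=1),
    train_dataset=train_dataset,  # placeholder: your tokenized dataset
)
trainer.train()
```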

Check out our Hugging Face blog: Mergoo: Efficiently Merge, then Fine-tune (MoE, Mixture of Adapters)
mergoo: GitHub - Leeroo-AI/mergoo: A library for easily merging multiple LLM experts and efficiently training the merged LLM.