Flexible and Effective Mixture of Large Language Models with Domain-Specialized Experts
Enabling rapid and low-cost creation of Mixture-of-Domain-Experts (MOE) language models by mixing a source model with pre-trained, domain-specialized expert models.
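To illustrate the general idea (not this project's actual API), below is a minimal conceptual sketch in PyTorch of how feed-forward blocks drawn from a source model and several domain-specialized expert models could be combined behind a lightweight router into a Mixture-of-Domain-Experts layer. All class, parameter, and variable names here are illustrative assumptions.

```python
# Conceptual sketch only: names and structure are assumptions, not the toolkit's API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeedForward(nn.Module):
    """Stand-in for an FFN block taken from a source or expert model."""

    def __init__(self, hidden_size: int, intermediate_size: int):
        super().__init__()
        self.up = nn.Linear(hidden_size, intermediate_size)
        self.down = nn.Linear(intermediate_size, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.gelu(self.up(x)))


class MixtureOfDomainExperts(nn.Module):
    """Routes each token to its top-k experts (source FFN plus domain FFNs)."""

    def __init__(self, experts: list[nn.Module], hidden_size: int, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(experts)               # pre-trained FFNs
        self.router = nn.Linear(hidden_size, len(experts))  # lightweight gate
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_size)
        gate_logits = self.router(x)                          # (B, T, n_experts)
        weights, indices = gate_logits.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)                     # renormalize over top-k

        # Compute every expert's output once, then gather the routed ones.
        expert_outs = torch.stack([e(x) for e in self.experts], dim=-2)  # (B, T, E, H)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            chosen = indices[..., slot]                       # (B, T) expert ids
            gathered = torch.gather(
                expert_outs, -2,
                chosen[..., None, None].expand(*chosen.shape, 1, x.size(-1)),
            ).squeeze(-2)                                     # (B, T, H)
            out = out + weights[..., slot:slot + 1] * gathered
        return out


if __name__ == "__main__":
    hidden, inter = 64, 256
    # One FFN standing in for the source model plus two domain-expert FFNs.
    experts = [FeedForward(hidden, inter) for _ in range(3)]
    moe = MixtureOfDomainExperts(experts, hidden_size=hidden, top_k=2)
    tokens = torch.randn(2, 8, hidden)
    print(moe(tokens).shape)  # torch.Size([2, 8, 64])
```

Because the experts arrive pre-trained, only the small router need be learned (or even set heuristically), which is what keeps creation of such a model rapid and low-cost.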