LibMoE: A Comprehensive Benchmarking Library for Mixture of Experts in Large Language Models
LibMoE is a library designed to streamline research and development of Mixture of Experts (MoE) algorithms in Large Language Models (LLMs). It provides a standardized, accessible framework for training, evaluating, and analyzing MoE-based models.
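
To make the core idea concrete, below is a minimal, self-contained sketch of a sparse top-k MoE layer of the kind such libraries build on. This is an illustrative example only, not LibMoE's API; the class name `SimpleMoE` and parameters `n_experts` and `top_k` are assumptions for the sketch.

```python
# Minimal top-k Mixture of Experts layer (illustrative sketch, NOT LibMoE's API).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        logits = self.router(x)                             # (B, T, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)                # renormalize routing weights
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e                 # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: route a batch of token embeddings through the MoE layer.
moe = SimpleMoE(d_model=64)
tokens = torch.randn(2, 10, 64)
print(moe(tokens).shape)  # torch.Size([2, 10, 64])
```

Production MoE implementations differ mainly in how they parallelize this dispatch and how they regularize the router (e.g., load-balancing losses); the sketch only shows the routing-plus-weighted-combination pattern shared by most variants.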