Key Concepts
Apollo provides state-of-the-art multilingual medical LLMs serving a population of 6.1 billion people.
Key Statistics
The released Apollo models, at various relatively small sizes (0.5B, 1.8B, 2B, 6B, and 7B), achieve the best performance among models of equivalent size.
In particular, Apollo-7B is the state-of-the-art multilingual medical LLM among models up to 70B.
The Apollo models are open-source, with the training corpora, code, model weights, and evaluation benchmark all publicly released.
Quotes
"Despite the vast repository of global medical knowledge predominantly being in English, local languages are crucial for delivering tailored healthcare services."
"Apollo-7B is the state-of-the-art multilingual medical LLMs up to 70B."