Poro 34B: A Multilingual Language Model Advancing State-of-the-Art for Finnish and Excelling in Translation and Code Generation
Poro 34B, a 34-billion-parameter multilingual language model trained on 1 trillion tokens of Finnish, English, and programming languages, substantially advances the state of the art for Finnish while performing competitively in English and code generation and demonstrating strong translation capabilities.