Enhancing Japanese Language Capabilities of Large Language Models through Continual Pre-Training
Continual pre-training of large language models initially trained on English corpora can effectively enhance their Japanese language capabilities; the resulting models outperform Japanese language models trained from scratch.
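The core idea, continuing to train an already pre-trained model on new-language data rather than starting from random weights, can be illustrated with a toy, stdlib-only sketch. Note that the `BigramLM` class, the sample corpora, and the two-stage training loop below are illustrative inventions for this summary, not the paper's actual models or data: a character-bigram model is first "pre-trained" on English text, then continually trained on Japanese text, and its perplexity on Japanese text drops relative to the English-only stage.

```python
import math
from collections import defaultdict

class BigramLM:
    """Toy character-bigram language model with add-one smoothing.

    Calls to train() accumulate counts, so a second call on new data
    acts as continual training on top of the earlier stage.
    """
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def train(self, text):
        for a, b in zip(text, text[1:]):
            self.counts[a][b] += 1
            self.vocab.update((a, b))

    def perplexity(self, text):
        # Add-one smoothing over the current vocabulary plus one unseen slot.
        V = len(self.vocab) + 1
        log_prob, n = 0.0, 0
        for a, b in zip(text, text[1:]):
            total = sum(self.counts[a].values())
            p = (self.counts[a][b] + 1) / (total + V)
            log_prob += math.log(p)
            n += 1
        return math.exp(-log_prob / n)

# Illustrative corpora standing in for English pre-training data and
# a Japanese continual pre-training corpus.
english = "the quick brown fox jumps over the lazy dog " * 20
japanese = "吾輩は猫である名前はまだ無い " * 20

lm = BigramLM()
lm.train(english)                  # stage 1: pre-training on English
ppl_before = lm.perplexity(japanese)
lm.train(japanese)                 # stage 2: continual pre-training on Japanese
ppl_after = lm.perplexity(japanese)

# Continual training sharply lowers perplexity on Japanese text.
print(f"before: {ppl_before:.1f}  after: {ppl_after:.1f}")
```

In practice the same two-stage pattern is applied to a transformer with a causal language-modeling objective; the sketch only shows why reusing English-stage parameters gives the Japanese stage a head start over training from scratch.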