MathCoder2: Enhancing Mathematical Reasoning in Large Language Models by Continued Pretraining on Model-Translated Mathematical Code
This paper introduces MathCoder2, a family of large language models (LLMs) with enhanced mathematical reasoning abilities. The gains come from a novel continued pretraining method built on model-translated mathematical code paired with the corresponding natural language reasoning steps.
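To make the pairing concrete, the sketch below shows what one such training sample might look like: a natural language reasoning step alongside code that carries out the same computation. This is a hypothetical illustration, not an example drawn from the paper's actual corpus; the sample format and the `quadratic_roots` helper are assumptions for demonstration only.

```python
# Hypothetical pretraining sample: a natural language reasoning step
# paired with a code translation of that step.

reasoning = (
    "To solve x^2 - 5x + 6 = 0, factor the quadratic as "
    "(x - 2)(x - 3) = 0, giving the roots x = 2 and x = 3."
)

def quadratic_roots(a, b, c):
    """Return the real roots of a*x^2 + b*x + c = 0 via the quadratic formula."""
    disc = b * b - 4 * a * c
    assert disc >= 0, "this sketch handles only real roots"
    sqrt_disc = disc ** 0.5
    return sorted([(-b - sqrt_disc) / (2 * a), (-b + sqrt_disc) / (2 * a)])

# The code confirms the roots claimed in the reasoning text.
roots = quadratic_roots(1, -5, 6)
print(roots)  # → [2.0, 3.0]
```

Pairing the two modalities lets a model learn that each prose step corresponds to an executable, checkable computation.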