Quality-Guided Contrastive Rationale Distillation for Enhancing Reasoning Capabilities of Smaller Language Models
Quality-Guided Contrastive Rationale Distillation (QCRD) is a novel framework that enhances the reasoning capabilities of smaller language models by distilling both positive and negative rationale knowledge from large language models through contrastive learning.
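The core idea of contrasting positive and negative rationales under quality weights can be sketched as a loss function. This is a minimal illustrative sketch, not the paper's actual objective: the function name, the margin-based contrastive form, and the per-rationale quality weights are all assumptions introduced here for exposition.

```python
import numpy as np

def qcrd_style_loss(pos_logprob, neg_logprob, pos_quality, neg_quality, margin=1.0):
    """Hypothetical quality-weighted contrastive distillation loss.

    pos_logprob / neg_logprob: student log-likelihoods of teacher-generated
    positive / negative rationales (arrays, paired per training example).
    pos_quality / neg_quality: quality scores in [0, 1] gating each rationale's
    contribution, standing in for QCRD's quality guidance.
    """
    # Positive term: push the student toward high-quality positive rationales.
    pos_term = pos_quality * (-pos_logprob)
    # Contrastive term: keep each negative rationale at least `margin`
    # log-likelihood below its paired positive rationale.
    gap = pos_logprob - neg_logprob
    neg_term = neg_quality * np.maximum(0.0, margin - gap)
    return float(np.mean(pos_term + neg_term))

# Example: the negative is already far less likely than the positive,
# so only the weighted negative-log-likelihood term remains.
loss = qcrd_style_loss(
    pos_logprob=np.array([-0.1]),
    neg_logprob=np.array([-5.0]),
    pos_quality=np.array([1.0]),
    neg_quality=np.array([1.0]),
)
```

In this sketch, low-quality rationales (quality near 0) contribute little to either term, which mirrors the framework's goal of letting quality guide which teacher outputs the student learns from or is pushed away from.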