Distilling Large-Scale Comparative Knowledge from Language Models
We introduce NeuroComparatives, a novel framework for distilling large-scale, high-quality comparative knowledge from language models at different scales. The resulting corpus contains up to 8.8 million comparisons over 1.74 million entity pairs, roughly 10x larger and 30% more diverse than existing comparative knowledge resources.
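To make concrete what such a corpus contains, here is a minimal sketch of comparative knowledge as tuples over entity pairs. The data structure and the example entries are illustrative assumptions, not actual NeuroComparatives records or the framework's real schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Comparison:
    # One comparative assertion: entity_a is more/less <aspect> than entity_b.
    entity_a: str
    entity_b: str
    aspect: str      # the dimension being compared, e.g. "speed"
    direction: str   # "more" or "less"

# Hypothetical entries for illustration only.
corpus = [
    Comparison("cheetah", "housecat", "speed", "more"),
    Comparison("cheetah", "housecat", "weight", "more"),
    Comparison("housecat", "cheetah", "speed", "less"),
]

# A corpus holds many comparisons per entity pair, which is why the
# comparison count (8.8M) exceeds the entity-pair count (1.74M).
unique_pairs = {frozenset((c.entity_a, c.entity_b)) for c in corpus}
print(len(corpus), len(unique_pairs))  # 3 comparisons over 1 unique pair
```

Counting unordered pairs with `frozenset` treats "cheetah vs. housecat" and "housecat vs. cheetah" as the same pair, matching how pair coverage is usually reported.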