The authors propose CoT-BERT, a two-stage approach to sentence representation that adapts Chain-of-Thought reasoning to unsupervised sentence embedding. By introducing an extended InfoNCE loss and a refined template-denoising method, CoT-BERT achieves state-of-the-art performance without relying on external components or models.
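For orientation, the sketch below shows a vanilla SimCSE-style InfoNCE objective over in-batch negatives in PyTorch; it is a baseline illustration, not the paper's extended variant, and the function name, temperature value, and plain formulation here are assumptions rather than CoT-BERT's exact loss.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """Baseline InfoNCE over in-batch negatives (not CoT-BERT's extended loss).

    z1, z2: (batch, dim) embeddings of two views of the same sentences,
    where (z1[i], z2[i]) is a positive pair.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Cosine similarity between every z1[i] and every z2[j].
    sim = z1 @ z2.T / temperature                      # (batch, batch)
    labels = torch.arange(sim.size(0), device=sim.device)
    # Diagonal entries are positives; off-diagonal entries act as negatives.
    return F.cross_entropy(sim, labels)
```

CoT-BERT's contribution lies in extending this standard objective (and in its template denoising), so the sketch is only the starting point the paper builds on.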