Improving Listwise Ranking in Large Language Models through Permutation Self-Consistency
Permutation self-consistency, a novel decoding technique, can improve the quality, consistency, and position invariance of listwise rankings produced by black-box large language models.
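
To make the idea concrete, here is a minimal sketch of the general scheme: shuffle the input list several times, obtain one listwise ranking per shuffle from the model, and aggregate the results into a single output ranking. The function names (`permutation_self_consistency`, `rank_fn`) are hypothetical, and the Borda-style scoring used for aggregation is an illustrative stand-in, not necessarily the aggregation used in the paper.

```python
import random
from collections import defaultdict
from typing import Callable, List, Sequence


def permutation_self_consistency(
    items: Sequence[str],
    rank_fn: Callable[[Sequence[str]], List[str]],
    num_permutations: int = 8,
    seed: int = 0,
) -> List[str]:
    """Aggregate rankings of randomly shuffled inputs into one output ranking.

    `rank_fn` stands in for a black-box LLM call that takes the items in some
    order and returns them reordered by relevance; it is a hypothetical
    interface, not an API from any released codebase.
    """
    rng = random.Random(seed)
    scores = defaultdict(float)

    for _ in range(num_permutations):
        shuffled = list(items)
        rng.shuffle(shuffled)          # break any positional bias in the prompt
        ranking = rank_fn(shuffled)    # one listwise ranking from the model
        for position, item in enumerate(ranking):
            # Borda-style scoring: earlier positions earn more points.
            scores[item] += len(items) - position

    # Final ranking: items sorted by aggregate score, ties broken by input order.
    return sorted(items, key=lambda it: (-scores[it], items.index(it)))


if __name__ == "__main__":
    # Toy stand-in for an LLM ranker (ranks by length) just to exercise the
    # aggregation; in practice this would be a prompt to a black-box model.
    def toy_rank_fn(batch: Sequence[str]) -> List[str]:
        return sorted(batch, key=len, reverse=True)

    docs = ["a short doc", "a much longer candidate document", "tiny", "medium-length text"]
    print(permutation_self_consistency(docs, toy_rank_fn, num_permutations=4))
```

Because each shuffle presents the same items at different positions, averaging over the resulting rankings reduces the influence of where an item happened to appear in any single prompt.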