Core Concepts
RNNs struggle with in-context retrieval, which limits their performance relative to Transformers.
Statistics
RNNs with O(log n)-bit memory cannot efficiently solve the Index, AR, c-gram retrieval, or Counting tasks.
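To make the tasks concrete, here is a minimal Python sketch of two of them, Index and AR (read here as associative recall). The input format and vocabulary are illustrative assumptions, not the paper's formal definitions; the point is that the correct answer depends on a token that must be retrieved from earlier in the prompt, which a model restricted to O(log n) bits of state cannot keep around as the context grows.

```python
import random

# Illustrative instances (assumed format, not the paper's exact formalization)
# of two in-context retrieval tasks. Answering either requires looking back
# into the prompt rather than summarizing it into a tiny state.

def index_task(n: int, vocab: str = "abcdefgh"):
    """Index: given a sequence and a position i, return the i-th token."""
    seq = [random.choice(vocab) for _ in range(n)]
    i = random.randrange(n)
    prompt = f"sequence: {''.join(seq)} query: position {i}"
    answer = seq[i]
    return prompt, answer

def associative_recall_task(n_pairs: int, vocab: str = "abcdefgh"):
    """AR: given key->value pairs and a query key, return its value."""
    keys = random.sample(vocab, n_pairs)
    pairs = {k: random.choice(vocab) for k in keys}
    query = random.choice(keys)
    prompt = " ".join(f"{k}->{v}" for k, v in pairs.items()) + f" query: {query}"
    answer = pairs[query]
    return prompt, answer

if __name__ == "__main__":
    random.seed(0)
    print(index_task(8))               # answer is the token at the queried position
    print(associative_recall_task(4))  # answer is the value paired with the queried key
```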
Quotes
RNNs with Chain-of-Thought (CoT) cannot solve tasks that require in-context retrieval, whereas Transformers excel at such tasks.