This paper proposes a Retrieval-Augmented Transformer (RAT) model that captures fine-grained intra- and cross-sample feature interactions to improve click-through rate (CTR) prediction.
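A minimal sketch of the cross-sample idea, not the paper's exact architecture: the target sample's field embeddings are concatenated with those of top-k retrieved samples, and a transformer encoder attends over all feature tokens so self-attention covers both intra-sample (field-to-field) and cross-sample (target-to-retrieved) interactions. All class names, hyperparameters, and the pooling/retrieval details below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RetrievalAugmentedCTR(nn.Module):
    """Illustrative retrieval-augmented transformer for CTR prediction."""
    def __init__(self, num_features: int, num_fields: int, k_retrieved: int,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(num_features, d_model)  # shared field embeddings
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, target_ids: torch.Tensor, retrieved_ids: torch.Tensor) -> torch.Tensor:
        # target_ids:    (B, num_fields)              feature ids of the target sample
        # retrieved_ids: (B, k_retrieved, num_fields) feature ids of top-k retrieved samples
        B = target_ids.size(0)
        tokens = torch.cat([target_ids.unsqueeze(1), retrieved_ids], dim=1)  # (B, 1+k, F)
        x = self.embed(tokens.reshape(B, -1))          # (B, (1+k)*F, d_model)
        h = self.encoder(x)                            # attention over all feature tokens
        logit = self.head(h.mean(dim=1)).squeeze(-1)   # mean-pool, then score
        return torch.sigmoid(logit)                    # predicted click probability

# Illustrative usage with random ids standing in for hashed categorical features.
model = RetrievalAugmentedCTR(num_features=1000, num_fields=8, k_retrieved=4)
target = torch.randint(0, 1000, (2, 8))
retrieved = torch.randint(0, 1000, (2, 4, 8))
print(model(target, retrieved).shape)  # torch.Size([2])
```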
BAHE, a novel hierarchical encoding approach, decouples the encoding of atomic user behaviors from the learning of inter-behavior interactions, significantly improving the efficiency and effectiveness of LLM-based CTR prediction on long user behavior sequences.
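A minimal sketch of this decoupling idea, not BAHE's exact design: each atomic behavior is encoded independently by a shared per-behavior encoder (whose outputs could therefore be precomputed and cached), and a separate, lighter module then models interactions over the much shorter sequence of behavior-level vectors. The module names, pooling choices, and sizes below are assumptions for illustration.

```python
import torch
import torch.nn as nn

class HierarchicalBehaviorModel(nn.Module):
    """Illustrative two-stage encoder: per-behavior encoding, then behavior interactions."""
    def __init__(self, vocab_size: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Stage 1: per-behavior encoder; it sees one behavior's tokens at a time,
        # so its outputs are independent of the rest of the sequence and cacheable.
        self.token_embed = nn.Embedding(vocab_size, d_model)
        self.behavior_encoder = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # Stage 2: interaction module over behavior-level vectors only.
        self.interaction = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def encode_behaviors(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (B, num_behaviors, tokens_per_behavior) -> one vector per behavior
        B, N, T = token_ids.shape
        x = self.token_embed(token_ids.reshape(B * N, T))   # (B*N, T, d_model)
        h = self.behavior_encoder(x).mean(dim=1)            # pool tokens -> (B*N, d_model)
        return h.reshape(B, N, -1)                          # (B, N, d_model)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        behavior_vecs = self.encode_behaviors(token_ids)    # could be served from a cache
        h = self.interaction(behavior_vecs).mean(dim=1)     # cross-behavior interactions
        return torch.sigmoid(self.head(h)).squeeze(-1)      # predicted click probability

# Illustrative usage: 2 users, 16 behaviors each, 12 tokens per behavior.
model = HierarchicalBehaviorModel(vocab_size=5000)
ids = torch.randint(0, 5000, (2, 16, 12))
print(model(ids).shape)  # torch.Size([2])
```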