Bibliographic Information: Liu, L., Cai, L., Zhang, C., Zhao, X., Gao, J., Wang, W., ... & Li, Q. (2023). LinRec: Linear Attention Mechanism for Long-term Sequential Recommender Systems. In Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’23) (pp. 1-11).
Research Objective: This paper addresses the quadratic computational cost of standard dot-product attention in Transformer-based sequential recommender systems (SRSs) when processing long user interaction sequences. The authors aim to develop a more efficient attention mechanism that retains the accuracy of dot-product attention on long-term sequential recommendation tasks.
Methodology: The authors propose LinRec, an L2-normalized linear attention mechanism that reduces the complexity of attention from O(N^2) to O(N), where N is the sequence length. LinRec modifies standard dot-product attention in three ways: it reverses the dot-product order (so the Key-Value product is computed before multiplication with the Query matrix), applies row-wise L2 normalization to the Query matrix and column-wise L2 normalization to the Key matrix, and adds an ELU activation layer. A theoretical analysis shows that LinRec preserves the essential properties of attention mechanisms while substantially reducing computational cost.
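To make the mechanism concrete, below is a minimal PyTorch sketch of an L2-normalized linear attention along these lines. It is illustrative only, not the authors' implementation: the function name linrec_attention is invented here, single-head shapes are assumed, and the exact placement of the ELU relative to the normalizations follows one plausible reading of the description above.

```python
import torch
import torch.nn.functional as F

def linrec_attention(Q, K, V, eps=1e-8):
    """Illustrative L2-normalized linear attention (not the paper's code).

    Q, K, V: tensors of shape (batch, N, d), where N is the sequence length.
    Cost is O(N * d^2) rather than the O(N^2 * d) of softmax attention,
    i.e. linear in N.
    """
    # ELU activation, as described in the paper, keeps the transformed
    # scores well-behaved in place of softmax's exponentiation.
    Q = F.elu(Q)
    K = F.elu(K)

    # Row-wise L2 normalization of Q: each query vector gets unit norm,
    # playing the role of softmax's per-row normalization.
    Q = Q / (Q.norm(dim=-1, keepdim=True) + eps)

    # Column-wise L2 normalization of K: normalize over the sequence axis.
    K = K / (K.norm(dim=1, keepdim=True) + eps)

    # Reordered product: K^T V is a small (d x d) summary, so time and
    # memory no longer grow quadratically with the sequence length N.
    KV = torch.einsum('bnd,bne->bde', K, V)      # (batch, d, d)
    return torch.einsum('bnd,bde->bne', Q, KV)   # (batch, N, d)
```

Because K^T V is only d-by-d, the sequence length N never appears squared in either compute or memory, which is what makes sequences of thousands of interactions tractable.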
Key Findings: Extensive experiments on two public benchmark datasets (ML-1M and Gowalla) show that, when integrated into various Transformer-based recommender models, LinRec achieves performance comparable to, and in some cases better than, state-of-the-art methods while substantially reducing time and memory consumption.
Main Conclusions: LinRec offers a practical and effective solution for enhancing the efficiency of Transformer-based SRSs, particularly for long-term sequential recommendation tasks. Its linear complexity and ability to maintain high accuracy make it a promising approach for real-world applications where long user interaction sequences are prevalent.
Significance: This research contributes to sequential recommendation by addressing the computational bottleneck of traditional attention mechanisms on long sequences. LinRec's efficiency and effectiveness pave the way for more scalable and accurate SRSs across a range of applications.
Limitations and Future Research: While LinRec demonstrates promising results, further investigation into its performance on even larger datasets and with different Transformer architectures is warranted. Exploring the potential of combining LinRec with other efficiency-enhancing techniques could further improve the scalability and accuracy of long-term SRSs.
Source: Langming Liu et al., arXiv preprint (2024-11-05), https://arxiv.org/pdf/2411.01537.pdf