Core Concepts
This paper proposes PrepRec, a novel pre-trained sequential recommendation framework that achieves zero-shot cross-domain and cross-application transfer without any auxiliary information.
Abstract
The paper presents a novel pre-trained sequential recommendation framework called PrepRec that aims to address the challenge of zero-shot cross-domain and cross-application sequential recommendation without any auxiliary information.
Key highlights:
PrepRec learns universal item and sequence representations by modeling item popularity dynamics, rather than learning ID-specific item representations or relying on application-dependent auxiliary information.
The universal item representations are learned based on the changes in item popularity over different time horizons (long-term and short-term).
A popularity dynamics-aware transformer architecture is proposed to learn universal sequence representations.
PrepRec can achieve zero-shot cross-domain and cross-application transfer without any auxiliary information, setting a baseline for pre-trained sequential recommenders.
Extensive experiments show that, with a simple post-hoc interpolation, PrepRec can outperform state-of-the-art sequential recommender models, especially on long-tail items.
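To make the item-representation idea concrete, here is a minimal sketch (not the paper's exact method; the window lengths and the drift feature are illustrative assumptions) of describing each item by its popularity over a long and a short horizon instead of by an ID embedding:

```python
from collections import Counter

def popularity_features(interactions, t_now, long_win=30.0, short_win=7.0):
    """Illustrative sketch: represent each item by its interaction rate over
    a long and a short trailing window, plus the relative drift between them.
    `interactions` is a list of (item, timestamp) pairs; window lengths are
    hypothetical, not values from the paper."""
    long_counts, short_counts = Counter(), Counter()
    for item, ts in interactions:
        if t_now - long_win <= ts <= t_now:
            long_counts[item] += 1
        if t_now - short_win <= ts <= t_now:
            short_counts[item] += 1
    feats = {}
    for item in long_counts:
        long_rate = long_counts[item] / long_win
        short_rate = short_counts.get(item, 0) / short_win
        # relative popularity drift: > 0 means the item is heating up,
        # < 0 means it is cooling off
        drift = (short_rate - long_rate) / (long_rate + 1e-9)
        feats[item] = (long_rate, short_rate, drift)
    return feats
```

Because these features depend only on interaction timestamps, not on item IDs or content metadata, they are computable in any new domain, which is what makes zero-shot transfer possible.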
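The post-hoc interpolation mentioned above can be sketched as a convex blend of PrepRec's scores with an in-domain recommender's scores (the weight `alpha` here is a hypothetical tuning knob, not a value from the paper):

```python
def interpolate_scores(preprec_scores, base_scores, alpha=0.5):
    """Minimal sketch of post-hoc score interpolation: blend per-item scores
    from PrepRec with those of an in-domain sequential recommender.
    Missing items default to a score of 0."""
    items = preprec_scores.keys() | base_scores.keys()
    return {
        i: alpha * preprec_scores.get(i, 0.0)
           + (1 - alpha) * base_scores.get(i, 0.0)
        for i in items
    }
```

Intuitively, the in-domain model dominates on popular items it has seen often, while PrepRec's popularity-dynamics signal contributes most on long-tail items with sparse ID-level evidence.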
Stats
The marginal distribution of user and item activities follows a similar heavy-tailed shape across datasets, suggesting a universal structure.
The popularity dynamics of items are crucial for predicting user behavior.
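A toy illustration of what "heavy-tailed" means here (synthetic Zipf-distributed counts, not the paper's datasets): a tiny head of items captures a disproportionate share of all interactions.

```python
def tail_share(n_items=10_000, top_frac=0.01, s=1.0):
    """With item interaction counts following a Zipf law (count of rank r
    proportional to 1 / r**s), return the fraction of all interactions
    captured by the top `top_frac` of items."""
    counts = [1.0 / (r ** s) for r in range(1, n_items + 1)]
    top_k = int(n_items * top_frac)
    return sum(counts[:top_k]) / sum(counts)
```

Under these assumed parameters, the top 1% of items accounts for over half of all interactions, which is why long-tail items are hard for ID-based recommenders and why popularity dynamics are a useful transferable signal.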