UBM, a contrastive pre-training approach, enhances user behavior understanding in e-commerce sessions. The model effectively captures intra-item semantic relations, inter-item connections, and inter-interaction dependencies. Extensive experiments show superior performance over baselines on purchase intention prediction, remaining length prediction, and next item prediction.
Session data in e-commerce is rich but semi-structured, combining textual product information with structured interaction sequences. Existing methods often overlook this complexity. UBM addresses it by pre-training on large-scale session data with contrastive learning objectives.
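To make "semi-structured" concrete, here is a minimal sketch of what such a session record might look like. The field names (`item_title`, `action`, `timestamp`) are illustrative assumptions, not the paper's actual schema:

```python
# Hedged sketch of a semi-structured e-commerce session record: each
# interaction pairs free-text product information with structured fields.
# All names here are assumptions for illustration, not UBM's schema.
from dataclasses import dataclass, field


@dataclass
class Interaction:
    item_title: str   # textual product information
    action: str       # structured action type, e.g. "click", "add_to_cart"
    timestamp: int    # position of the event in the session


@dataclass
class Session:
    session_id: str
    interactions: list = field(default_factory=list)


session = Session("s1", [
    Interaction("wireless ergonomic mouse", "click", 1),
    Interaction("mechanical keyboard", "add_to_cart", 2),
])
num_actions = len(session.interactions)
```

A pre-training pipeline would then derive both text inputs (the titles) and sequence inputs (the ordered actions) from the same record.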
The two-stage pre-training scheme encourages self-supervised learning from varied augmentations, applying contrastive objectives at different granularity levels of session data. This lets the model pick up subtle cues within sessions for deeper user behavior understanding.
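The contrastive mechanism behind such objectives can be sketched with a standard InfoNCE-style loss over two augmented views of a batch of session embeddings. This is a generic illustration of in-batch contrastive learning, not UBM's actual loss or augmentation strategy:

```python
# Minimal InfoNCE-style contrastive loss over two augmented "views" of
# session embeddings, using in-batch negatives. A generic sketch only;
# UBM's actual objectives and granularity levels are not reproduced here.
import numpy as np


def info_nce_loss(view_a, view_b, temperature=0.1):
    """Cross-entropy of matching row i of view_a to row i of view_b,
    treating all other rows in the batch as negatives."""
    # L2-normalize so dot products become cosine similarities.
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / temperature                 # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal; minimize their negative log-likelihood.
    return float(-np.mean(np.diag(log_probs)))


rng = np.random.default_rng(0)
base = rng.normal(size=(8, 16))                    # 8 toy session embeddings
# Two views of the same sessions (tiny perturbation) vs. mismatched pairs.
aligned = info_nce_loss(base, base + 0.01 * rng.normal(size=(8, 16)))
mismatched = info_nce_loss(base, np.roll(base, 1, axis=0))
```

Training pulls augmented views of the same session (or item, or interaction, depending on granularity) together while pushing apart views of different ones, so the aligned pairing incurs a lower loss than the mismatched one.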
UBM outperforms general-domain language models and e-commerce pre-trained models across all downstream tasks. The results highlight the effectiveness of leveraging both textual information and interaction sequences for deep session data understanding in e-commerce.
Key insights distilled from the source by Zixuan Li, Li... at arxiv.org, 03-06-2024.
https://arxiv.org/pdf/2403.02825.pdf