Core Concepts
Challenging the traditional two-backpropagation strategy in two-tower recommendation models with a novel one-backpropagation approach for improved performance.
Abstract
The content discusses the challenges in traditional two-tower recommendation models and introduces a new approach called One Backpropagation (OneBP). It challenges the assumption that users and items should be treated equally during model training and instead proposes a moving-aggregation strategy for updating user encodings in each training epoch. The paper outlines the structure of two-tower recommendation models in four stages: user-item encoding, negative sampling, loss computation, and backpropagation updating. Experiments on four public datasets validate the effectiveness and efficiency of the OneBP model.
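The moving-aggregation idea mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the mixing coefficient `gamma` and the use of a simple mean over the user's interacted item encodings are assumptions introduced here for clarity.

```python
import numpy as np

def update_user_encoding(user_vec, interacted_item_vecs, gamma=0.9):
    """Moving-aggregation update (sketch): instead of backpropagating
    gradients through the user tower, blend the current user encoding
    with an aggregate of the item encodings the user interacted with
    in this epoch. `gamma` is a hypothetical mixing coefficient."""
    aggregate = interacted_item_vecs.mean(axis=0)  # illustrative aggregation
    return gamma * user_vec + (1.0 - gamma) * aggregate
```

Under this kind of scheme only the item encoder receives gradient updates, which is one plausible reading of why the approach is called "one backpropagation."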
Stats
"Experiments on four public datasets validate the effectiveness and efficiency of our model."
"Results indicate better recommendation performance of our OneBP than that of peer algorithms."
Quotes
"We propose a moving-aggregation updating strategy to update a user encoding in each training epoch."
"Our OneBP outperforms all kinds of the state-of-the-art competitors."