Core Concepts
Contrastive learning can improve the performance of federated recommender systems, but it also makes them more vulnerable to model poisoning attacks. A popularity-based contrastive regularizer is proposed to maintain the distance between item embeddings at different popularity levels, enhancing both the recommendation effectiveness and the robustness of federated recommender systems.
Abstract
The paper introduces CL4FedRec, a contrastive learning framework tailored to federated recommender systems. CL4FedRec constructs positive and negative user samples without compromising user privacy and augments item views based on the updated user representations.
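The paper's exact training objective is not reproduced here, but the following is a minimal PyTorch sketch of the InfoNCE-style loss that contrastive frameworks of this kind typically apply to two augmented views of the same items. The names `view_a`, `view_b`, and the temperature value are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def info_nce(view_a: torch.Tensor, view_b: torch.Tensor,
             temperature: float = 0.2) -> torch.Tensor:
    """InfoNCE loss over two augmented views of the same item batch.

    view_a, view_b: (batch, dim) embeddings of the same items under two
    augmentations; row i of each view forms a positive pair, and all
    other rows in the batch serve as negatives.
    """
    a = F.normalize(view_a, dim=1)
    b = F.normalize(view_b, dim=1)
    logits = a @ b.t() / temperature                    # (batch, batch) similarities
    labels = torch.arange(a.size(0), device=a.device)   # diagonal entries are positives
    return F.cross_entropy(logits, labels)
```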
Experiments show that CL4FedRec significantly improves the recommendation performance of federated recommender systems. However, the authors find that contrastive learning also exacerbates their vulnerability to model poisoning attacks. This is attributed to the enhanced uniformity of the embedding distribution, which makes it easier for adversaries to manipulate target item embeddings so that they mimic popular items.
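One common way to quantify this effect is the uniformity metric of Wang and Isola (2020), sketched below for a batch of item embeddings; whether the paper measures uniformity this exact way is an assumption.

```python
import torch
import torch.nn.functional as F

def uniformity(embeddings: torch.Tensor, t: float = 2.0) -> torch.Tensor:
    """Wang & Isola (2020) uniformity: log of the mean Gaussian potential
    over all pairs. Lower (more negative) values indicate embeddings that
    are spread more evenly over the unit hypersphere."""
    x = F.normalize(embeddings, dim=1)
    sq_dists = torch.pdist(x, p=2).pow(2)   # condensed pairwise squared distances
    return sq_dists.mul(-t).exp().mean().log()
```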
To address this issue, the authors propose rCL4FedRec, a robust variant of CL4FedRec that introduces a popularity-based contrastive regularizer. The regularizer maintains the distance between item embeddings of different popularity levels, preventing adversaries from easily boosting the exposure of target items. Extensive experiments demonstrate that rCL4FedRec significantly improves both the recommendation effectiveness and the robustness of federated recommender systems against state-of-the-art model poisoning attacks, as the sketch below illustrates.
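The precise form of the regularizer is not reproduced here. The sketch below shows one plausible instantiation under stated assumptions: items are bucketed by popularity (`pop_level`, a hypothetical integer tensor), and cross-bucket pairs whose cosine similarity exceeds a margin are penalized, which keeps unpopular target items from drifting toward the popular-item region.

```python
import torch
import torch.nn.functional as F

def popularity_regularizer(item_emb: torch.Tensor,
                           pop_level: torch.Tensor,
                           margin: float = 0.5) -> torch.Tensor:
    """Hypothetical sketch of a popularity-based contrastive regularizer.

    item_emb:  (batch, dim) item embeddings.
    pop_level: (batch,) integer popularity bucket per item.
    Penalizes pairs of items from *different* popularity levels whose
    cosine similarity exceeds `margin`, pushing the levels apart.
    """
    z = F.normalize(item_emb, dim=1)
    sim = z @ z.t()                                          # (batch, batch) cosine similarities
    diff_level = pop_level.unsqueeze(0) != pop_level.unsqueeze(1)
    penalty = F.relu(sim - margin) * diff_level.float()      # hinge on cross-level pairs
    return penalty.sum() / diff_level.float().sum().clamp_min(1.0)
```

In a setup like this, the regularizer would be added to the recommendation and contrastive losses with a weighting coefficient, so that uniformity is preserved within each popularity level while separation across levels is enforced.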
Stats
The MovieLens-1M dataset contains 1,000,208 interactions between 6,040 users and 3,706 movies.
The Amazon-Phone dataset contains 103,593 interactions between 13,174 users and 5,970 cell phones.
The Amazon-Video dataset contains 63,836 interactions between 8,072 users and 11,830 items.
The QB-Article dataset contains 335,663 reading records from 10,981 users on 6,493 articles.