The author argues that by leveraging social connections, a novel Social-aware Clustered Federated Learning scheme can enhance model utility without sacrificing privacy. This approach strikes a balance between data privacy and efficiency in federated learning.
The author proposes a coupled tensor train decomposition approach for privacy-preserving federated learning, achieving data confidentiality and computational efficiency.
The authors aim to achieve near-optimal utility in privacy-preserving federated learning through data generation and parameter distortion.
The authors propose Coupled Tensor Train (CTT) decomposition, a novel approach for privacy-preserving federated learning networks.
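As background for the two CTT summaries above: the core building block is the standard tensor-train (TT) decomposition, which factors a multi-way parameter tensor into a chain of small cores so that clients can exchange compact factors rather than raw weights. The sketch below is a minimal TT-SVD in NumPy for illustration only; the coupled, privacy-preserving variant described in the paper is not reproduced here, and the function names are my own.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Factor a d-way tensor into TT cores via sequential truncated SVD."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(len(shape) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))           # truncate to bound core size
        cores.append(u[:, :r_new].reshape(rank, shape[k], r_new))
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([out.ndim - 1], [0]))
    return out.reshape(out.shape[1:-1])
```

With `max_rank` large enough the reconstruction is exact; smaller ranks trade accuracy for the compression that makes TT-based parameter exchange cheap.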
The ALI-DPFL algorithm improves differentially private federated learning performance in resource-constrained scenarios through adaptive local iterations.
The authors propose AerisAI, a framework for secure decentralized AI collaboration that combines differential privacy with homomorphic encryption.
Collaborative federated learning protocols must balance privacy guarantees and model accuracy to be mutually beneficial for all participants.
Federated learning and differential privacy can be combined to enable large-scale machine learning over distributed datasets while providing rigorous privacy guarantees.
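The combination described above is commonly realized by clipping each client's update to bound its sensitivity and adding calibrated Gaussian noise before (or during) aggregation. The sketch below shows this generic pattern, not any one paper's protocol; the function names, clipping rule, and noise placement are illustrative assumptions.

```python
import numpy as np

def dp_client_update(global_w, local_grad, lr, clip_norm, noise_mult, rng):
    """One client's differentially private SGD step: clip, then add Gaussian noise."""
    norm = np.linalg.norm(local_grad)
    clipped = local_grad * min(1.0, clip_norm / (norm + 1e-12))  # bound sensitivity
    noisy = clipped + rng.normal(0.0, noise_mult * clip_norm, size=clipped.shape)
    return global_w - lr * noisy

def federated_round(global_w, client_grads, lr, clip_norm, noise_mult, rng):
    """Server averages the clients' noisy local updates (FedAvg-style)."""
    updates = [dp_client_update(global_w, g, lr, clip_norm, noise_mult, rng)
               for g in client_grads]
    return np.mean(updates, axis=0)
```

The privacy/utility trade-off lives in `clip_norm` and `noise_mult`: tighter clipping and more noise strengthen the differential privacy guarantee at some cost in model accuracy.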
A privacy-preserving federated learning framework is proposed that uses random coding and system immersion tools to protect the privacy of local and global models without compromising model performance or system efficiency.
Upcycled-FL, a novel federated learning strategy that applies a first-order approximation at every even round of model updates, can significantly reduce information leakage and computational cost while maintaining model performance.
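One plausible reading of the even-round trick is that the server extrapolates from the previous update direction instead of querying client data, so only odd rounds touch private data. The sketch below illustrates that reading under stated assumptions: the extrapolation coefficient `lam`, the round schedule, and the function names are my own, not the paper's specification.

```python
import numpy as np

def upcycled_training(w0, client_step, rounds, lam=0.5):
    """Alternate real FL rounds (odd) with first-order extrapolation (even)."""
    w_prev, w = w0, client_step(w0)  # round 1: a normal round using client data
    for t in range(2, rounds + 1):
        if t % 2 == 0:
            # even round: reuse the last update direction; no client data accessed
            w_next = w + lam * (w - w_prev)
        else:
            w_next = client_step(w)  # odd round: standard aggregation step
        w_prev, w = w, w_next
    return w
```

Because only half the rounds run the (noisy, expensive) client computation, both the per-round privacy leakage and the total compute are roughly halved relative to running every round normally.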