Federated Learning with Differentially Private Loss Approximations
FedLAP-DP proposes a federated learning approach in which clients share synthetic samples that approximate their local loss landscapes, allowing the server to perform unbiased global optimization without ever receiving raw data. This method outperforms traditional gradient-sharing schemes, particularly under tight privacy budgets and highly skewed (non-IID) data distributions.
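A minimal sketch of the core idea, not the authors' implementation: each client constructs synthetic labeled samples whose model gradient matches a noised real-data gradient at the server's current weights, and the server optimizes on the pooled synthetic data alone. All names here are illustrative; Gaussian noise stands in for the differential-privacy mechanism, and gradient clipping and privacy accounting are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_mse(w, X, y):
    """Gradient of 0.5 * mean squared error for a linear model X @ w."""
    return X.T @ (X @ w - y) / len(y)

def client_synthesize(X_real, y_real, w_server, n_syn, noise_scale=0.05):
    """Build synthetic samples whose gradient at w_server matches the
    client's noised real-data gradient (a stand-in for the loss-landscape
    approximation; real DP would also clip and account for the budget)."""
    d = X_real.shape[1]
    g_real = grad_mse(w_server, X_real, y_real)
    g_real = g_real + rng.normal(scale=noise_scale, size=d)  # DP-style noise
    X_syn = rng.normal(size=(n_syn, d))
    # Choose y_syn so that grad_mse(w_server, X_syn, y_syn) == g_real:
    # solve X_syn.T @ y_syn = X_syn.T @ X_syn @ w_server - n_syn * g_real.
    rhs = X_syn.T @ (X_syn @ w_server) - n_syn * g_real
    y_syn, *_ = np.linalg.lstsq(X_syn.T, rhs, rcond=None)
    return X_syn, y_syn

# Toy federated setup: 4 clients with noisy linear data.
d, n_clients = 3, 4
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(50, d))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=50)))

w = np.zeros(d)
for _ in range(30):  # server rounds
    syn = [client_synthesize(X, y, w, n_syn=8) for X, y in clients]
    # Server sees only synthetic samples, never raw data or raw gradients.
    g = np.mean([grad_mse(w, Xs, ys) for Xs, ys in syn], axis=0)
    w -= 0.5 * g
```

After 30 rounds the server's weights land close to `w_true` despite training only on synthetic samples, illustrating why a loss-landscape surrogate can substitute for direct gradient sharing. In the actual method the synthetic set is optimized to match the loss over a region of weight space, not just at a single point as in this sketch.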