The article proposes L2GDV, a novel federated learning algorithm that uses a varying step size in local stochastic gradient descent to efficiently optimize regularized empirical risk minimization problems while reducing communication costs.
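A minimal sketch of the core idea — local SGD on an l2-regularized logistic loss with a step size that decays across iterations, plus one averaging step per round to limit communication. The decay schedule and function names here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def local_sgd_varying_step(clients, w0, rounds=50, local_steps=10,
                           gamma0=0.5, lam=0.01, seed=0):
    """Sketch: local SGD for l2-regularized logistic regression with a
    decaying (varying) step size and one communication step per round."""
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for r in range(rounds):
        local_models = []
        for X, y in clients:                       # each client holds (X, y), y in {-1, +1}
            w_k = w.copy()
            for s in range(local_steps):
                i = rng.integers(len(y))           # one stochastic sample
                margin = y[i] * (X[i] @ w_k)
                grad = -y[i] * X[i] / (1.0 + np.exp(margin)) + lam * w_k
                # varying step size (assumed 1/t-style schedule)
                gamma = gamma0 / (1.0 + 0.1 * (r * local_steps + s))
                w_k -= gamma * grad
            local_models.append(w_k)
        w = np.mean(local_models, axis=0)          # single communication per round
    return w
```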
Local Superior Soups (LSS) is a novel method that leverages model interpolation and regularization to adapt pre-trained models efficiently in federated learning, particularly under data heterogeneity and limited communication rounds.
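The interpolation step can be pictured as a greedy "soup" over candidate weight vectors fine-tuned from the same pre-trained initialization. The acceptance rule below is an assumption borrowed from the generic model-soup recipe, not LSS's exact selection or regularization procedure.

```python
import numpy as np

def soup_interpolate(models, alphas):
    """Convex combination ('soup') of flattened model weight vectors."""
    alphas = np.asarray(alphas, dtype=float) / np.sum(alphas)
    return sum(a * w for a, w in zip(alphas, models))

def greedy_soup(pretrained, finetune, val_loss, n_members=4):
    """Illustrative sketch (not the paper's exact procedure): fine-tune several
    candidates from the same pre-trained weights, then greedily keep only
    members whose addition does not increase validation loss."""
    members = [finetune(pretrained) for _ in range(n_members)]
    soup, best = [members[0]], val_loss(members[0])
    for m in members[1:]:
        candidate = soup_interpolate(soup + [m], np.ones(len(soup) + 1))
        loss = val_loss(candidate)
        if loss <= best:                 # keep the member only if the soup improves
            soup.append(m)
            best = loss
    return soup_interpolate(soup, np.ones(len(soup)))
```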
Direct weight aggregation, particularly when combined with the GaLore optimizer for local training, outperforms LoRA-based methods in federated fine-tuning of large language and vision transformer models, offering superior performance, stability, and generalization capabilities.
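Direct weight aggregation itself is straightforward: the server averages full model state dicts rather than merging low-rank LoRA adapters. A minimal PyTorch sketch (the GaLore local-training step is not reproduced here):

```python
import torch

def direct_weight_aggregate(state_dicts, weights=None):
    """Sketch of direct weight aggregation: a (weighted) average of the
    clients' full state dicts, key by key."""
    n = len(state_dicts)
    if weights is None:
        weights = [1.0 / n] * n          # uniform weighting by default
    agg = {}
    for key in state_dicts[0]:
        agg[key] = sum(w * sd[key].float() for w, sd in zip(weights, state_dicts))
    return agg
```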
ProgFed is a novel federated learning framework that reduces communication and computation costs by progressively training increasingly complex models, achieving comparable or superior performance to traditional methods.
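A sketch of the progressive idea, assuming a model split into sequential stages: early rounds train and communicate only a prefix of the stages plus a lightweight head, and the active prefix grows on a fixed schedule. In the paper each sub-model has its own head; a single shared head stands in here for brevity.

```python
import torch.nn as nn

def progressive_model(stages, head, active):
    """Only the first `active` stages (plus a small head) are trained and
    communicated this round; later rounds grow the model."""
    return nn.Sequential(*stages[:active], head)

def active_stages(round_idx, n_stages, total_rounds, warmup_frac=0.8):
    """Illustrative growth schedule (an assumption): reach the full model
    after `warmup_frac` of the training rounds."""
    grow_every = max(1, int(warmup_frac * total_rounds) // n_stages)
    return min(n_stages, 1 + round_idx // grow_every)
```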
FedAda2, a novel approach to federated learning, achieves efficient joint server- and client-side adaptive optimization by initializing local preconditioners from zero and employing memory-efficient client optimizers, thereby mitigating communication bottlenecks and client resource constraints without sacrificing performance.
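A rough sketch of the division of labor, with Adagrad-style preconditioners standing in for the adaptive methods on both sides (an assumption): each client's preconditioner starts from zero every round, so no optimizer state has to be downloaded, and the server preconditions the averaged pseudo-gradient.

```python
import numpy as np

class ServerAdagrad:
    """Server-side adaptive optimization on the averaged client update."""
    def __init__(self, dim, lr=0.1, eps=1e-8):
        self.v = np.zeros(dim)           # server-side preconditioner state
        self.lr, self.eps = lr, eps

    def step(self, w, avg_delta):
        """Apply the mean of the clients' pseudo-gradients."""
        self.v += avg_delta ** 2
        return w + self.lr * avg_delta / (np.sqrt(self.v) + self.eps)

def client_update(w, grad_fn, local_steps=10, lr=0.01, eps=1e-8):
    """Client-side adaptive steps with a preconditioner (re)initialized from
    zero each round, avoiding any download of server optimizer state."""
    v = np.zeros_like(w)                 # zero-initialized local preconditioner
    w_k = w.copy()
    for _ in range(local_steps):
        g = grad_fn(w_k)                 # stochastic gradient at the current iterate
        v += g ** 2
        w_k -= lr * g / (np.sqrt(v) + eps)
    return w_k - w                       # pseudo-gradient sent to the server
```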
Leveraging mixed-precision quantization and over-the-air aggregation in federated learning significantly improves both performance and energy efficiency, particularly in resource-constrained edge computing environments.
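A toy sketch of the two ingredients, with uniform quantization and an idealized Gaussian-noise channel as stand-ins for the paper's quantizer and channel model:

```python
import numpy as np

def quantize(x, bits):
    """Uniform symmetric quantization of an update to a given bit-width (>= 2)."""
    scale = np.max(np.abs(x)) + 1e-12
    levels = 2 ** (bits - 1) - 1
    return np.round(x / scale * levels) / levels * scale

def ota_aggregate(updates, bit_widths, noise_std=0.01, seed=0):
    """Clients quantize at mixed precisions (per device capability); the
    channel superposes the analog signals plus Gaussian noise."""
    rng = np.random.default_rng(seed)
    signals = [quantize(u, b) for u, b in zip(updates, bit_widths)]
    received = np.sum(signals, axis=0) + rng.normal(0, noise_std, signals[0].shape)
    return received / len(updates)       # server normalizes the superposed signal
```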
Partial network updates in federated learning, as opposed to full network updates, can mitigate layer mismatch issues, leading to faster convergence, improved accuracy, and reduced communication and computational overhead.
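In code, a partial update amounts to freezing all parameters outside a chosen subset and aggregating only that subset. The sketch below selects by name prefix, which is an illustrative stand-in for the paper's layer-selection rule; a call like `select_partial_update(model, ["layer4", "fc"])` would confine a round's update to the last block and the classifier.

```python
import torch.nn as nn

def select_partial_update(model: nn.Module, trainable_prefixes):
    """Sketch of a partial network update: only parameters whose names match
    the selected prefixes are trained and communicated this round."""
    update_keys = []
    for name, p in model.named_parameters():
        p.requires_grad = any(name.startswith(pre) for pre in trainable_prefixes)
        if p.requires_grad:
            update_keys.append(name)
    return update_keys                   # the server aggregates only these entries
```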
This paper proposes a novel online control scheme for client scheduling and resource allocation in Federated Learning (FL) over mobile edge networks, aiming to minimize training latency and enhance model accuracy under resource constraints and uncertainty.
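As an illustration only (not the paper's control scheme), a greedy scheduler that ranks clients by utility per unit of estimated latency under a per-round budget, with an exponentially weighted average keeping the latency estimates current under uncertainty:

```python
import numpy as np

def schedule_clients(util, latency_est, budget):
    """Greedy online scheduler: pick clients by utility per unit of estimated
    latency until the round's latency budget is exhausted."""
    order = np.argsort(-(util / latency_est))    # best utility/latency first
    chosen, used = [], 0.0
    for k in order:
        if used + latency_est[k] <= budget:
            chosen.append(int(k))
            used += latency_est[k]
    return chosen

def update_latency_estimate(est, k, observed, alpha=0.3):
    """EWMA update of client k's latency estimate from the observed round."""
    est[k] = (1 - alpha) * est[k] + alpha * observed
    return est
```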
This paper presents a compute-optimized implementation of the FedNL algorithm family for federated learning, demonstrating significant speedups over the original implementation and existing solutions for logistic regression.
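Stripped of FedNL's Hessian compression and the implementation-level optimizations, one federated Newton round for l2-regularized logistic regression reduces to averaging client gradients and Hessians and solving a linear system:

```python
import numpy as np

def logreg_grad_hess(w, X, y, lam=1e-3):
    """Gradient and Hessian of l2-regularized logistic loss, labels in {-1, +1}."""
    p = 1.0 / (1.0 + np.exp(y * (X @ w)))        # sigmoid(-y * Xw)
    grad = -(X.T @ (y * p)) / len(y) + lam * w
    D = p * (1 - p)
    hess = (X.T * D) @ X / len(y) + lam * np.eye(len(w))
    return grad, hess

def newton_round(w, clients):
    """One simplified federated Newton step: average client gradients and
    Hessians, then solve. FedNL's compressed Hessian updates are omitted."""
    grads, hessians = zip(*(logreg_grad_hess(w, X, y) for X, y in clients))
    g = np.mean(grads, axis=0)
    H = np.mean(hessians, axis=0)
    return w - np.linalg.solve(H, g)
```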
The article introduces OTA Fed-Sophia, a novel second-order federated learning algorithm that leverages sparse Hessian estimation and over-the-air aggregation to achieve faster convergence with reduced communication costs and enhanced privacy for large-scale models.
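A sketch combining the two pieces with assumed stand-ins: a Sophia-style clipped second-order update (the paper's sparse Hessian estimator is abstracted into `hess_diag_est`) and over-the-air aggregation modeled as a noisy superposition.

```python
import numpy as np

def sophia_step(w, m, h, grad, hess_diag_est, lr=0.05, beta1=0.9, beta2=0.99,
                rho=0.05, eps=1e-12):
    """Sophia-style update: momentum divided by an EMA of a diagonal Hessian
    estimate, with element-wise clipping of the preconditioned direction."""
    m = beta1 * m + (1 - beta1) * grad
    h = beta2 * h + (1 - beta2) * hess_diag_est
    update = np.clip(m / np.maximum(h, eps), -rho, rho)
    return w - lr * update, m, h

def ota_average(quantities, noise_std=0.01, seed=0):
    """Over-the-air aggregation: the channel superposes the client signals;
    the server receives their noisy sum and normalizes."""
    rng = np.random.default_rng(seed)
    total = np.sum(quantities, axis=0)
    return (total + rng.normal(0, noise_std, total.shape)) / len(quantities)
```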