Core Concepts
Pruning-enabled Hierarchical Federated Learning optimizes convergence in wireless networks.
Abstract
The paper studies Hierarchical Federated Learning (HFL) in wireless networks, using model pruning to mitigate bandwidth scarcity and system heterogeneity. It proposes a Pruning-enabled Hierarchical Federated Learning (PHFL) framework that improves the convergence bound by jointly configuring wireless resources and system parameters, such as the pruning ratio, under practical constraints. Theoretical analysis and extensive simulations validate PHFL in terms of test accuracy, training time, energy consumption, and bandwidth requirements.
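To make the hierarchical structure concrete, here is a minimal sketch of one HFL round: clients in each cluster aggregate at an edge server, and the cloud then averages the edge models. All function and variable names are illustrative, not from the paper.

```python
import numpy as np

def average(weights_list):
    """Element-wise average of a list of model weight vectors."""
    return np.mean(weights_list, axis=0)

def hierarchical_round(client_weights, clusters):
    """One simplified HFL round (hypothetical helper, equal weighting).
    client_weights: {client_id: np.ndarray of model weights}
    clusters: list of client-id lists, one list per edge server."""
    # Edge aggregation: each edge server averages its own clients.
    edge_models = [average([client_weights[c] for c in cluster])
                   for cluster in clusters]
    # Cloud aggregation: average the edge models into a global model.
    return average(edge_models)

# Toy usage: 4 clients split across 2 edge servers.
weights = {i: np.full(3, float(i)) for i in range(4)}
global_model = hierarchical_round(weights, [[0, 1], [2, 3]])
```

In practice each client would upload a pruned (sparse) model, which is what lets PHFL cut per-round bandwidth; the sketch omits local training, weighting by dataset size, and the wireless channel.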
Structure:
Introduction to Federated Learning in Wireless Networks
Proposed Pruning-enabled Hierarchical Federated Learning Framework
Convergence Analysis and Optimization Strategies
Simulation Results and Validation
Highlights:
Practical constraints in wireless networks necessitate model pruning for efficient learning.
The PHFL algorithm improves convergence by jointly tuning wireless resources and system parameters under practical delay and energy constraints.
Extensive simulations confirm PHFL's effectiveness across various metrics.
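The pruning idea behind the first highlight can be sketched with simple magnitude-based pruning, where the smallest-magnitude weights are zeroed so only the rest need to be transmitted. This is a generic illustration under that assumption, not the paper's exact pruning rule.

```python
import numpy as np

def magnitude_prune(weights, pruning_ratio):
    """Zero out roughly the smallest `pruning_ratio` fraction of
    weights by absolute value (illustrative helper)."""
    k = int(np.ceil(pruning_ratio * weights.size))
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = np.array([0.05, -0.8, 0.3, -0.02, 1.2, 0.4])
pruned = magnitude_prune(w, 0.5)  # keeps the 3 largest magnitudes
```

A sparse pruned model compresses well, which is how pruning trades a small accuracy loss for lower bandwidth, energy, and upload time on constrained wireless links.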
Stats
"Owing to these practical constraints and system models, this paper leverages model pruning."
"Through extensive simulation, we validate the effectiveness of our proposed PHFL algorithm."