Overcoming Memory Constraints for Heterogeneous Federated Learning through Progressive Training
ProFL is a progressive training framework that breaks the memory wall of federated learning: by training the model step-wise rather than end-to-end, it enables memory-constrained devices to participate and achieves superior model performance.
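The memory saving of step-wise training can be illustrated with a toy sketch (hypothetical block sizes and function names; not ProFL's actual implementation). If a model is split into blocks and only one block is trainable per stage, a device only needs gradient and optimizer memory for the active block, rather than for the whole model as in end-to-end training:

```python
# Toy sketch of progressive, step-wise training (hypothetical; not
# ProFL's actual implementation). The model is split into blocks; at
# stage k, earlier blocks are frozen and only block k holds trainable
# state, so peak trainable memory is bounded by the largest block.

BLOCK_PARAMS = [4_000, 8_000, 16_000, 32_000]  # toy per-block parameter counts

def trainable_params_at_stage(stage: int) -> int:
    """Parameters needing gradient/optimizer memory at a given stage."""
    return BLOCK_PARAMS[stage]

def full_model_params() -> int:
    """End-to-end training needs gradient memory for every block at once."""
    return sum(BLOCK_PARAMS)

def peak_trainable_params() -> int:
    """Progressive training's peak is the single largest block."""
    return max(trainable_params_at_stage(s) for s in range(len(BLOCK_PARAMS)))
```

With these toy sizes, progressive training peaks at 32,000 trainable parameters per stage versus 60,000 for end-to-end training, which is why devices that cannot fit full-model training can still participate.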