The paper addresses distributed learning (DL) in the presence of stragglers and proposes a novel DL method based on 1-bit gradient coding (1-bit GC-DL) to tolerate stragglers while reducing communication burden. The method distributes training data redundantly across workers, computes gradients locally, quantizes them into 1-bit vectors, and transmits them to peers. Theoretical convergence guarantees are provided for both convex and non-convex loss functions, and empirical results show improved performance over baseline methods.
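The paper's exact quantizer and coding scheme are not detailed in this summary, but the local compute-quantize-transmit step can be sketched with a generic sign-based 1-bit quantizer plus a single scale factor; the function names and the averaging-based aggregation below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def one_bit_quantize(grad):
    """Compress a gradient to 1 bit per entry (signs) plus one scale float.

    Hypothetical sketch: the paper's actual quantizer may differ.
    """
    scale = np.mean(np.abs(grad))      # single float sent alongside the bits
    bits = grad >= 0.0                 # boolean vector: 1 bit per entry
    return bits, scale

def dequantize(bits, scale):
    """Reconstruct an approximate gradient from the bits and scale."""
    return scale * np.where(bits, 1.0, -1.0)

# Each worker quantizes its local gradient and sends it to peers; a peer
# averages whatever dequantized copies arrive, so a straggler's missing
# message simply drops out of the average (illustrative, not the paper's
# exact gradient-coding decoder).
rng = np.random.default_rng(0)
local_grads = [rng.standard_normal(8) for _ in range(4)]
received = [dequantize(*one_bit_quantize(g)) for g in local_grads[:3]]
aggregated = np.mean(received, axis=0)
```

Sending only signs cuts per-entry traffic from 32 or 64 bits to 1 bit, which is the communication saving the method targets; the data redundancy across workers is what lets the decoder compensate for stragglers.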
Key insights distilled from source content by Chengxi Li, M... at arxiv.org, 03-25-2024
https://arxiv.org/pdf/2403.14716.pdf