The paper addresses distributed learning (DL) in the presence of stragglers and proposes a novel DL method based on 1-bit gradient coding (1-bit GC-DL) to reduce the communication burden. The method distributes training data redundantly across workers, computes gradients locally, quantizes them into 1-bit vectors, and transmits them to peers. Theoretical convergence guarantees are provided for both convex and non-convex loss functions, and empirical results show improved performance over baseline methods.
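The paper's exact encoding is not reproduced in this summary, but the core idea of 1-bit gradient quantization can be illustrated with a generic sign-based quantizer (as used in schemes such as signSGD): each coordinate is reduced to its sign bit, and a single shared scale factor preserves the gradient's overall magnitude. The function names and the choice of mean absolute value as the scale are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def one_bit_quantize(grad):
    # Encode each coordinate as its sign (1 bit each) plus one shared
    # float scale, so a d-dimensional float gradient shrinks to
    # d bits + one scalar. (Illustrative sketch, not the paper's scheme.)
    scale = np.mean(np.abs(grad))
    signs = grad >= 0  # boolean vector: 1 bit per entry
    return signs, scale

def one_bit_dequantize(signs, scale):
    # Reconstruct an approximate gradient: +scale where the sign bit
    # is set, -scale otherwise.
    return np.where(signs, scale, -scale)

grad = np.array([0.5, -1.2, 0.3, -0.1])
signs, scale = one_bit_quantize(grad)
approx = one_bit_dequantize(signs, scale)
```

In a peer-to-peer setting like 1-bit GC-DL, each worker would transmit `signs` and `scale` instead of the full float gradient, cutting per-round communication by roughly 32x for 32-bit floats.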
Key insights distilled from the paper by Chengxi Li, M... on arxiv.org, 03-25-2024.
https://arxiv.org/pdf/2403.14716.pdf