This paper studies online model selection (OMS) with decentralized data (OMS-DecD) over multiple clients. The key insights are:
Collaboration is unnecessary if the computational cost on each client is allowed to be O(K), where K is the number of candidate hypothesis spaces. In this case, a non-cooperative algorithm that runs OMS independently on each client achieves performance comparable to that of a federated algorithm.
Collaboration is necessary if the computational cost on each client is limited to o(K). In this case, a federated algorithm can outperform the non-cooperative approach by leveraging collaboration among clients.
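To make the O(K) baseline concrete, here is a minimal sketch of the non-cooperative case: each client independently runs an exponential-weights (Hedge) update over its K candidate hypothesis spaces, which costs O(K) per round since every weight is touched. The function name `hedge_oms` and the learning rate `eta` are illustrative choices, not notation from the paper.

```python
import math
import random

def hedge_oms(losses_per_round, K, eta):
    """Run Hedge over K candidate hypothesis spaces on a single client.

    losses_per_round: iterable of length-K loss vectors with entries in [0, 1].
    Returns the cumulative loss and the final (unnormalized) weights.
    """
    weights = [1.0] * K
    total_loss = 0.0
    for losses in losses_per_round:
        z = sum(weights)
        probs = [w / z for w in weights]
        k = random.choices(range(K), weights=probs)[0]  # sample one hypothesis space
        total_loss += losses[k]
        # exponential-weights update touches all K entries -> O(K) per round
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
    return total_loss, weights
```

Running this on each client separately, with no communication, is the non-cooperative baseline the paper compares against when per-client computation O(K) is acceptable.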
The authors propose two federated algorithms, FOMD-OMS (R = T) and FOMD-OMS (R < T), that trade off prediction performance against communication cost. FOMD-OMS (R = T) achieves the optimal regret bound without communication constraints, while FOMD-OMS (R < T) explicitly controls the number of communication rounds R over the T rounds of learning.
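The R < T regime can be illustrated with a simple sketch, not the authors' exact FOMD-OMS algorithm: each client takes local online-gradient-style steps every round, and the clients synchronize (here, by averaging their iterates) at only R evenly spaced communication rounds. All names (`fomd_sketch`, `eta`, the scalar iterates) are illustrative assumptions.

```python
def fomd_sketch(T, R, clients_grads, eta):
    """Illustrative sketch of communication-limited federated online learning.

    clients_grads: clients_grads[m][t] is client m's gradient at round t
    (a scalar here, to keep the sketch minimal).
    Returns the final iterates and the number of communication rounds used.
    """
    M = len(clients_grads)
    x = [0.0] * M                       # one local iterate per client
    sync_every = max(T // R, 1)         # communicate roughly R times over T rounds
    comms = 0
    for t in range(T):
        for m in range(M):
            x[m] -= eta * clients_grads[m][t]   # local step, no communication
        if (t + 1) % sync_every == 0:
            avg = sum(x) / M            # one communication round: average iterates
            x = [avg] * M
            comms += 1
    return x, comms
```

Setting R = T recovers synchronization every round (no communication constraint), while smaller R reduces communication at the price of drift between clients' local iterates.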
As a byproduct, the authors improve the regret bounds of existing algorithms for distributed online multi-kernel learning (OMKL) at a smaller computational and communication cost.
Source: by Junfan Li, Ze..., arxiv.org, 04-16-2024, https://arxiv.org/pdf/2404.09494.pdf