Decoupled Vertical Federated Learning: Fault Tolerant and Secure Approach
Core Concepts
Decoupled VFL is a fault-tolerant and secure approach to training on vertically partitioned data, preserving privacy and degrading gracefully under faults.
Summary
Decoupled Vertical Federated Learning (DVFL) introduces a blockwise learning approach that addresses the limitations of Vertical Federated Learning (VFL). DVFL enables decentralized aggregation and isolates feature learning from label supervision: each model trains on its own objective, which yields both fault tolerance and security. The system consists of guests, which process the features of entities in a shared order and pass their outputs to host models for aggregation. DVFL eliminates VFL's single point of failure by letting hosts continue training even when inputs from some guests are missing. It also mitigates privacy concerns, since no gradient feedback flows to guests that could enable inference attacks. Experimental results show that DVFL outperforms VFL in scenarios with limited sample intersections.
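The guest/host split described above can be sketched in code. The following is a minimal illustrative sketch, not the paper's actual implementation: the class names, the mean aggregation, and the squared-error host objective are our assumptions. What it demonstrates is the two properties the summary highlights: guests only run forward passes (no label gradients flow back to them), and the host skips missing guest inputs instead of failing.

```python
import numpy as np

rng = np.random.default_rng(0)

class Guest:
    """Holds one vertical feature slice of the shared entities.
    In DVFL-style training the guest learns on its own objective,
    so the host never sends gradients back (sketched here as a
    frozen forward pass for brevity)."""
    def __init__(self, dim_in, dim_emb):
        self.W = rng.normal(scale=0.1, size=(dim_in, dim_emb))

    def embed(self, x):
        return np.tanh(x @ self.W)  # forward pass only; no backprop from the host

class Host:
    """Aggregates whichever guest embeddings arrive this round.
    A missing guest (None) is skipped, so training degrades
    gracefully instead of hitting a single point of failure."""
    def __init__(self, dim_emb, lr=0.1):
        self.w = np.zeros(dim_emb)
        self.lr = lr

    def train_step(self, embeddings, y):
        present = [e for e in embeddings if e is not None]
        if not present:
            return None                          # all guests down: wait, don't crash
        agg = np.mean(present, axis=0)           # assumed mean aggregation over guests
        pred = agg @ self.w
        grad = agg.T @ (pred - y) / len(y)       # squared-error gradient, host-side only
        self.w -= self.lr * grad
        return float(np.mean((pred - y) ** 2))

# Toy data: two guests each see half the features of the same 64 entities,
# aligned in the same order, as the summary requires.
X = rng.normal(size=(64, 8))
y = X.sum(axis=1)
g1, g2 = Guest(4, 6), Guest(4, 6)
host = Host(6)

for step in range(200):
    e1 = g1.embed(X[:, :4])
    e2 = g2.embed(X[:, 4:]) if step < 100 else None  # guest 2 fails mid-training
    loss = host.train_step([e1, e2], y)
```

After step 100 the host keeps training on guest 1's embeddings alone, which is the graceful degradation DVFL claims over standard VFL, where a missing guest would stall the joint backward pass.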
Decoupled Vertical Federated Learning for Practical Training on Vertically Partitioned Data
"By eschewing end-to-end backpropagation (BP) altogether, we present Decoupled Vertical Federated Learning (DVFL), a novel strategy for ANNs to address these shortcomings."
"We propose Decoupled VFL (DVFL), a blockwise learning approach to VFL."
"In this work, we attack the source directly."