The paper introduces KirchhoffNet, a novel class of neural network models that leverages Kirchhoff's current law (KCL) from analog electronic circuitry. Key highlights:
KirchhoffNet is defined as a directed graph where nodes represent voltages and edges represent learnable non-linear current-voltage relations. The network dynamics are governed by KCL, which states that the sum of currents flowing into a node equals the sum of currents flowing out.
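The dynamics described above can be sketched in a few lines. This is an illustrative toy, not the paper's reference implementation: the function names, the tanh-based edge current, and the forward-Euler integrator are all assumptions standing in for the learnable current-voltage relations and the ODE solver.

```python
import numpy as np

def edge_current(v_s, v_t, theta):
    # Hypothetical learnable I-V relation on a directed edge (s -> t):
    # a small tanh nonlinearity of the voltage difference.
    w, b = theta
    return np.tanh(w * (v_s - v_t) + b)

def kcl_derivative(v, edges, params):
    """dv/dt at every node under Kirchhoff's current law:
    each edge current enters its target node and leaves its source node."""
    dv = np.zeros_like(v)
    for (s, t), theta in zip(edges, params):
        i = edge_current(v[s], v[t], theta)
        dv[t] += i   # current flowing into node t
        dv[s] -= i   # the same current flowing out of node s
    return dv

def euler_forward(v0, edges, params, dt=0.01, steps=100):
    """Integrate the node voltages with forward Euler
    (a simple stand-in for a proper ODE solver)."""
    v = v0.copy()
    for _ in range(steps):
        v = v + dt * kcl_derivative(v, edges, params)
    return v
```

Because every edge contributes equal and opposite terms at its two endpoints, the sum of all node derivatives is exactly zero, which is the discrete statement of KCL's conservation property.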
KirchhoffNet has close connections to continuous-depth models and message passing neural networks. The authors show that the adjoint method can be applied to efficiently train KirchhoffNet.
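To make the adjoint connection concrete, here is a minimal sketch of the adjoint method on a scalar linear ODE dv/dt = θ·v, a hypothetical one-parameter stand-in for KirchhoffNet's KCL dynamics (not the paper's implementation). With loss L = v(T), the adjoint λ solves dλ/dt = -λ·θ backwards from λ(T) = 1, and the parameter gradient is the integral of λ(t)·v(t) over [0, T].

```python
import numpy as np

def adjoint_gradient(theta, v0, T=1.0, n=20000):
    """Forward-integrate dv/dt = theta*v, then backward-integrate the
    adjoint to accumulate dL/dtheta for the loss L = v(T)."""
    dt = T / n
    # Forward pass: store the trajectory v(t).
    v = np.empty(n + 1)
    v[0] = v0
    for k in range(n):
        v[k + 1] = v[k] + dt * theta * v[k]
    # Backward pass: adjoint lambda(T) = dL/dv(T) = 1,
    # dlambda/dt = -lambda*theta, gradient = integral of lambda*v dt.
    lam = 1.0
    grad = 0.0
    for k in range(n, 0, -1):
        grad += dt * lam * v[k]
        lam += dt * lam * theta  # stepping backwards in time
    return v[n], grad
```

The analytic solution v(T) = v0·exp(θT) gives dL/dθ = T·v0·exp(θT), so the numerically accumulated gradient can be checked directly; only the states visited during integration are needed, which is what makes the adjoint method memory-efficient for continuous-depth models.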
A key advantage of KirchhoffNet is its potential for hardware implementation. The authors argue that a KirchhoffNet can be physically realized as an analog electronic circuit, so its forward pass always completes within 1/f seconds, where f is the hardware's clock frequency. This enables ultra-fast inference regardless of the number of parameters.
The authors design a KirchhoffNet architecture for the MNIST image classification task and achieve 98.86% test accuracy, comparable to state-of-the-art results, without using traditional neural network layers such as convolutional or linear layers.
Overall, the paper presents a novel neural network paradigm that bridges the gap between analog circuit theory and deep learning, opening up new possibilities for efficient and ultra-fast neural network hardware.
Key takeaways from source content by Zhengqi Gao,... on arxiv.org, 05-07-2024
https://arxiv.org/pdf/2310.15872.pdf