Hertrich, C., & Loho, G. (2024). Neural Networks and (Virtual) Extended Formulations. arXiv preprint arXiv:2411.03006.
This paper investigates the relationship between the size of neural networks and the complexity of representing polytopes, aiming to leverage the well-established theory of extended formulations to derive lower bounds on neural network size.
The authors introduce the concept of "virtual extension complexity," which measures how compactly a polytope can be represented as a Minkowski difference of two polytopes, counted by the sum of their extension complexities. They then establish a connection between virtual extension complexity and the size of maxout neural networks, with particular attention to monotone networks.
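For intuition, the definition can be stated compactly. The following LaTeX sketch is based on the paper's abstract, where xc denotes ordinary extension complexity and vxc the new virtual variant; the exact notation is an assumption:

% Virtual extension complexity of a polytope P (sketch; notation assumed).
% A virtual extended formulation represents P as a Minkowski difference:
% P = Q - R, in the sense that P + R = Q (Minkowski sum).
\[
  \operatorname{vxc}(P) \;=\; \min \bigl\{ \operatorname{xc}(Q) + \operatorname{xc}(R) \;:\; P + R = Q \bigr\}
\]
% Taking R = {0} and Q = P recovers an ordinary extended formulation,
% so vxc(P) <= xc(P) always holds; the open question raised below is
% whether this gap can be large.

Since every ordinary extended formulation is also a virtual one, a lower bound on vxc(P) immediately implies the same bound on xc(P), but not conversely.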
This link between neural network size and virtual extended formulations opens an avenue for deriving stronger lower bounds on network size from existing results on extension complexity.
More broadly, the work connects the expressive power of neural networks to the well-studied field of polyhedral combinatorics, providing a new lens for analyzing network size and complexity.
The main open question is whether virtual extension complexity can be significantly smaller than ordinary extension complexity. Further research should explore methods for proving lower bounds on virtual extension complexity, potentially leading to stronger lower bounds on neural network size. Additionally, investigating the practical implications of virtual extended formulations for optimizing over polytopes is a promising direction.
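One concrete reason such formulations could be useful for optimization is a standard fact about support functions (this illustration is ours, not taken from the paper): because the support function of a Minkowski sum is the sum of the support functions, linear optimization over P = Q - R reduces to two linear programs, one over Q and one over R.

% Linear optimization over a virtually represented polytope P with P + R = Q.
% h_S(c) = max { c^T x : x in S } denotes the support function of S.
% Additivity under Minkowski sums gives h_Q(c) = h_P(c) + h_R(c), hence
\[
  \max_{x \in P} c^{\top} x \;=\; h_P(c) \;=\; h_Q(c) - h_R(c)
  \;=\; \max_{y \in Q} c^{\top} y \;-\; \max_{z \in R} c^{\top} z ,
\]
% so small extended formulations of Q and R yield two small LPs whose
% difference is the optimal value over P (recovering an optimal point
% requires more care than this value identity alone).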