Relational Inductive Biases Enable Efficient Learning of Abstract Concepts in Neural Networks
The relational bottleneck is an architectural inductive bias that enables neural networks to learn abstract relational concepts in a data-efficient manner by constraining information processing to the relations between objects rather than the attributes of the individual objects themselves.
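A minimal sketch of the core idea, not the paper's implementation: the function name and the choice of inner products as the relation are illustrative assumptions. Downstream computation receives only a matrix of pairwise relations between object embeddings, never the embeddings themselves, so any information carried solely by individual attributes is discarded at the bottleneck.

```python
import numpy as np

def relational_bottleneck(z):
    """Map object embeddings (n, d) to a pairwise relation matrix (n, n).

    Layers after the bottleneck see only the relations (here, inner
    products) between objects, not the per-object attribute vectors.
    """
    return z @ z.T

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))        # 4 objects, 8-dim embeddings

# An orthogonal change of basis alters every individual attribute...
q, _ = np.linalg.qr(rng.normal(size=(8, 8)))
z_rotated = z @ q

# ...but leaves all pairwise relations unchanged, since
# (zQ)(zQ)^T = z Q Q^T z^T = z z^T for orthogonal Q.
r = relational_bottleneck(z)
r_rotated = relational_bottleneck(z_rotated)
print(np.allclose(r, r_rotated))   # True
```

The invariance shown here is one way to see why the bottleneck aids abstraction: representations that differ only in surface attributes induce identical relational inputs downstream.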