Core Concepts
Circuit Transformer explores using large language models to predict the next logic gate in circuit design.
Abstract
The abstract introduces the idea of applying large language models to electronic circuits, treating circuit design as a next-gate prediction problem.
Two barriers are identified: the complex structure of circuits and the strict equivalence constraints they must satisfy.
The proposed solution encodes circuits as memory-less trajectories and enforces equivalence preservation during decoding.
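For intuition, here is a minimal sketch of what such a trajectory encoding could look like, assuming a toy and-inverter-graph representation; the Node class, token names, and traversal order below are illustrative assumptions, not the paper's exact encoding.

```python
# Illustrative sketch (not the paper's exact tokenization): serialize a small
# and-inverter graph into a depth-first sequence of gate tokens, so a sequence
# model can consume it one token at a time without explicit graph memory.
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class Node:
    """One AIG node: a primary input, or an AND gate over two (possibly inverted) fan-ins."""
    kind: str                      # "AND" or "INPUT"
    name: str = ""                 # input name, e.g. "x0" (INPUT nodes only)
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    left_inv: bool = False         # inverter on the left fan-in edge
    right_inv: bool = False        # inverter on the right fan-in edge


def to_trajectory(node: Node, inverted: bool = False) -> List[str]:
    """Flatten the circuit into a depth-first list of gate tokens."""
    tokens = ["NOT"] if inverted else []
    if node.kind == "INPUT":
        return tokens + [node.name]
    # An AND token followed by the trajectories of its two fan-ins.
    return (tokens + ["AND"]
            + to_trajectory(node.left, node.left_inv)
            + to_trajectory(node.right, node.right_inv))


# Example: f = AND(x0, NOT(AND(x1, x2)))
x0, x1, x2 = (Node("INPUT", name=n) for n in ("x0", "x1", "x2"))
inner = Node("AND", left=x1, right=x2)
root = Node("AND", left=x0, right=inner, right_inv=True)
print(to_trajectory(root))   # ['AND', 'x0', 'NOT', 'AND', 'x1', 'x2']
```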
The trained Circuit Transformer model performs end-to-end logic synthesis, substantially reducing AND-gate counts (see Stats below).
CCS Concepts include Electronic design automation and Artificial intelligence methodologies.
The introduction draws an analogy between language models' success in mastering human languages and their potential to master circuit design.
However, circuits differ from natural language in structure and in their need for exact correctness, which poses challenges for existing models.
A generative approach is proposed, defining a circuit model that predicts the next gate based on context and structural information.
Equivalence-preserving decoding ensures that the generated circuit adheres to the specified functional constraints throughout generation.
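A minimal sketch of this masking idea, assuming a placeholder feasibility check (stays_equivalent) in place of the paper's actual mechanism: gate tokens that would break equivalence are excluded before sampling.

```python
# Sketch of constrained decoding: only gate tokens whose continuation can
# still realize the specified functionality are kept; the rest are masked
# out before sampling. The feasibility check is a placeholder assumption.
import math
import random
from typing import List, Sequence


def masked_sample(logits: Sequence[float],
                  vocab: Sequence[str],
                  prefix: List[str],
                  stays_equivalent) -> str:
    """Sample the next gate token from the feasible subset of the vocabulary."""
    feasible = [i for i, tok in enumerate(vocab)
                if stays_equivalent(prefix + [tok])]
    if not feasible:
        raise ValueError("no feasible continuation; the decoder must backtrack")
    # Softmax over the feasible logits only, then sample one token.
    m = max(logits[i] for i in feasible)
    weights = [math.exp(logits[i] - m) for i in feasible]
    idx = random.choices(feasible, weights=weights, k=1)[0]
    return vocab[idx]
```

In practice the feasibility check would be derived from the circuit semantics (for example, the target truth table), and the decoder would need a fallback when no token is feasible.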
Circuit generation is viewed as a sequential decision-making process, allowing optimization-oriented tasks like logic synthesis.
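In this view, a state is the partial trajectory, an action appends one gate token, and a terminal reward can encode the synthesis objective. The reward shape below is an illustrative assumption, not the paper's formulation; is_complete and is_equivalent are assumed callables.

```python
# Illustrative terminal reward for the decision-process view of synthesis:
# invalid circuits get no credit; among valid ones, fewer AND gates is better.
from typing import Callable, List


def terminal_reward(trajectory: List[str],
                    is_complete: Callable[[List[str]], bool],
                    is_equivalent: Callable[[List[str]], bool]) -> float:
    """Score a finished trajectory by validity and compactness."""
    if not (is_complete(trajectory) and is_equivalent(trajectory)):
        return float("-inf")                 # structurally or functionally invalid
    return -float(trajectory.count("AND"))   # fewer AND gates -> higher reward
```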
Stats
"Experimentally, we trained a Transformer-based model of 88M parameters."
"The average number of AND gates are 24.10 for the original AIGs and 9.57 for the synthesized AIGs."