Repairing Circuits with Transformers
Matthias Cosler
In this work, we present an approach for repairing faulty circuits using deep neural networks. Given a faulty circuit and a specification in Linear-time Temporal Logic (LTL), we demonstrate that our approach repairs the circuit so that it satisfies the specification.

Reactive synthesis has been a fundamental problem in computer science since it was introduced by Alonzo Church in the 1950s. Unfortunately, because synthesis from temporal specifications is typically 2EXPTIME-complete, classic synthesis algorithms are slow and can only handle comparatively short and simple specifications. Applying machine learning to such computationally hard problems opens new perspectives, and recent advances show promising results. We improve on these results by repairing mispredicted circuits with a Transformer-based model.

We introduce a new architecture, the separated hierarchical Transformer, which is designed to handle circuits and specifications as separate input sources. We introduce multiple datasets that consist of specifications, faulty circuits, and circuits that satisfy the specifications, and we train separated hierarchical Transformer models on these datasets to repair faulty circuits into circuits that satisfy a given specification. We show that the proposed architecture learns from the synthetic data, that the model utilizes the circuit under repair to solve more complex specifications, and that the model generalizes to out-of-distribution datasets. We improve the state of the art in Neural Circuit Synthesis by 6.8 percentage points on held-out instances and by 11.8 percentage points on an out-of-distribution dataset from the reactive synthesis competition SYNTCOMP.
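To make the architectural idea concrete, the following is a minimal PyTorch sketch of one plausible reading of a "separated hierarchical" encoder-decoder: the specification is split into sub-properties that are encoded locally and then attended over globally (the hierarchical part), the faulty circuit is encoded by a separate encoder (the separated part), and the decoder cross-attends to both memories. All class names, hyperparameters, and the mean-pooling step are illustrative assumptions, not the thesis' exact design; positional encodings are omitted for brevity.

    import torch
    import torch.nn as nn

    class HierarchicalSpecEncoder(nn.Module):
        """Local self-attention within each sub-property, then global attention
        across per-property summary vectors (hierarchical encoding, assumed)."""
        def __init__(self, vocab: int, d_model: int = 128, nhead: int = 4):
            super().__init__()
            self.embed = nn.Embedding(vocab, d_model)
            self.local = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)
            self.global_enc = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)

        def forward(self, props: torch.Tensor) -> torch.Tensor:
            # props: (batch, num_props, prop_len) token ids
            b, n, l = props.shape
            x = self.embed(props).view(b * n, l, -1)
            x = self.local(x)                          # attend within each sub-property
            summaries = x.mean(dim=1).view(b, n, -1)   # one summary vector per property
            return self.global_enc(summaries)          # attend across properties

    class CircuitRepairModel(nn.Module):
        """Separate encoders for specification and faulty circuit; the decoder
        cross-attends to the concatenation of both encoder memories."""
        def __init__(self, vocab: int, d_model: int = 128, nhead: int = 4):
            super().__init__()
            self.spec_enc = HierarchicalSpecEncoder(vocab, d_model, nhead)
            self.circ_embed = nn.Embedding(vocab, d_model)
            self.circ_enc = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)
            self.tgt_embed = nn.Embedding(vocab, d_model)
            self.decoder = nn.TransformerDecoder(
                nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers=2)
            self.out = nn.Linear(d_model, vocab)

        def forward(self, props, faulty_circuit, target):
            spec_mem = self.spec_enc(props)                            # (b, num_props, d)
            circ_mem = self.circ_enc(self.circ_embed(faulty_circuit))  # (b, circ_len, d)
            memory = torch.cat([spec_mem, circ_mem], dim=1)            # joint memory, separate sources
            mask = nn.Transformer.generate_square_subsequent_mask(target.size(1))
            h = self.decoder(self.tgt_embed(target), memory, tgt_mask=mask)
            return self.out(h)                         # logits over repaired-circuit tokens

    # Smoke test with arbitrary toy shapes.
    model = CircuitRepairModel(vocab=64)
    logits = model(
        props=torch.randint(0, 64, (2, 3, 10)),        # 3 sub-properties of 10 tokens each
        faulty_circuit=torch.randint(0, 64, (2, 20)),  # 20-token faulty circuit
        target=torch.randint(0, 64, (2, 15)),          # shifted repaired-circuit tokens
    )
    print(logits.shape)                                # torch.Size([2, 15, 64])

Keeping the two input sources in separate encoders lets each attend only within its own modality, while the shared decoder memory is what allows the model to exploit the faulty circuit as a starting point for the repair.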
Master's Thesis.