Neural Circuit Synthesis with Pre-trained Language Models

Frederik Schmitt, Matthias Cosler and Bernd Finkbeiner

This extended abstract reports preliminary results on fine-tuning pre-trained language models to solve reactive synthesis problems end-to-end. In recent work, hierarchical Transformer neural networks were successfully trained from scratch to synthesize sequential circuits directly from formal specifications. We improve over existing approaches by fine-tuning CodeT5 models that have been pre-trained on both natural language and programming languages. Our experiments show improved generalization and sample efficiency compared to the previous approach.
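To make the end-to-end setup concrete, treating synthesis as a sequence-to-sequence task requires serializing each specification into the model's input text, with the target circuit (e.g., in AIGER text format) as the output sequence. The sketch below shows one plausible way to build such a prompt; the field names, separator token, and overall format are illustrative assumptions, not the authors' actual data encoding.

```python
# Hypothetical sketch: serializing an LTL specification as the input side of a
# text-to-text pair for seq2seq fine-tuning (e.g., with a CodeT5-style model).
# The field labels and "<sep>" token are assumptions for illustration only;
# the target side of the pair would be the circuit as plain text.

def format_example(assumptions, guarantees, inputs, outputs):
    """Flatten a reactive synthesis specification into one prompt string."""
    parts = []
    if assumptions:
        parts.append("assume: " + " , ".join(assumptions))
    parts.append("guarantee: " + " , ".join(guarantees))
    parts.append("inputs: " + " , ".join(inputs))
    parts.append("outputs: " + " , ".join(outputs))
    return " <sep> ".join(parts)


prompt = format_example(
    assumptions=[],
    guarantees=["G (i0 -> F o0)"],  # every request i0 is eventually granted via o0
    inputs=["i0"],
    outputs=["o0"],
)
print(prompt)
```

A fine-tuning pipeline would tokenize such prompt/circuit pairs and train with the standard cross-entropy sequence loss; the benefit reported above is that starting from a code-pre-trained checkpoint rather than random initialization improves generalization and sample efficiency.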

First International Workshop on Deep Learning Aided Verification July 2023