Guiding Function Synthesis with Machine Learning

Julian Parsert | Start: 11:00 | IAIK small seminar room (IFEG052), Inffeldgasse 16a
Syntax-Guided Synthesis (SyGuS) describes a format for function synthesis problems within a first-order theory. In addition to conventional semantic constraints stated in a first-order theory, SyGuS allows syntactic constraints to be posed through a grammar. Most SyGuS solvers can be classified as enumerative solvers that systematically search the space of functions adhering to the syntactic constraints. However, these strategies suffer from the combinatorial explosion of the solution space.
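To make the enumerative setting concrete, here is a minimal sketch of such a solver, assuming a toy integer grammar and semantic constraints given as input/output examples; the grammar, operator names, and helper functions are illustrative placeholders, not taken from any particular SyGuS solver:

```python
from itertools import product

# Toy grammar: nonterminal "E" expands to leaves or operator applications.
GRAMMAR = {
    "E": [
        "x", "y", "0", "1",             # terminal productions
        ("+", "E", "E"),                # E -> E + E
        ("ite>=", "E", "E", "E", "E"),  # E -> if E1 >= E2 then E3 else E4
    ]
}

def enumerate_terms(nt, depth):
    """Yield all terms derivable from nonterminal `nt` within `depth` expansions."""
    if depth == 0:
        return
    for prod in GRAMMAR[nt]:
        if isinstance(prod, str):
            yield prod
        else:
            op, *args = prod
            for sub in product(*(list(enumerate_terms(a, depth - 1)) for a in args)):
                yield (op, *sub)

def evaluate(term, env):
    """Evaluate a term under a variable assignment `env`."""
    if isinstance(term, str):
        return env[term] if term in env else int(term)
    op, *args = term
    vals = [evaluate(a, env) for a in args]
    if op == "+":
        return vals[0] + vals[1]
    if op == "ite>=":
        return vals[2] if vals[0] >= vals[1] else vals[3]

def synthesize(examples, max_depth=3):
    """Return the first enumerated term consistent with all examples."""
    for term in enumerate_terms("E", max_depth):
        if all(evaluate(term, env) == out for env, out in examples):
            return term
    return None

# Semantic constraint as examples: behave like f(x, y) = max(x, y) on these inputs.
examples = [({"x": 1, "y": 2}, 2), ({"x": 5, "y": 3}, 5), ({"x": 0, "y": 0}, 0)]
```

Even for this tiny grammar, the number of candidate terms grows combinatorially with depth, which is exactly the explosion the talk's learned guidance aims to tame.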
In this talk I will present machine-learning-based methods for guiding the search through this large space. In the first part, I will show how we use Monte Carlo tree search with reinforcement learning, a method inspired by AlphaZero. In the second part, I will present our most recent work on using large language models (LLMs) to guide the search: we use an LLM to assign probabilities to the production rules of the grammar and then generate candidate functions according to those probabilities. We evaluate and compare two different prompting methods for obtaining the probabilities.
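The second approach can be sketched as sampling derivations from the grammar with each production rule weighted by a probability. In the described work those probabilities come from LLM prompting; in this illustrative sketch they are hard-coded placeholders, and the grammar and function names are assumptions, not the authors' implementation:

```python
import random

# Toy grammar: one nonterminal "E" with five productions.
PRODUCTIONS = ["x", "y", "1", ("+", "E", "E"), ("*", "E", "E")]
# Placeholder rule probabilities; in the talk's setting an LLM would supply these.
PROBS = [0.30, 0.30, 0.20, 0.10, 0.10]

def sample_term(nt="E", rng=random, max_depth=8):
    """Sample one term top-down, choosing each production with its probability."""
    if max_depth == 0:
        # Fall back to a terminal production to guarantee termination.
        prod = rng.choice([p for p in PRODUCTIONS if isinstance(p, str)])
    else:
        prod = rng.choices(PRODUCTIONS, weights=PROBS, k=1)[0]
    if isinstance(prod, str):
        return prod
    op, *args = prod
    return (op, *(sample_term(a, rng, max_depth - 1) for a in args))
```

Drawing many samples from such a weighted grammar biases the enumeration toward functions the model considers likely, instead of exploring the space uniformly.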

I recently finished my PhD at the University of Oxford under the supervision of Daniel Kroening and Tom Melham, with a thesis titled "Machine Learning for Function Synthesis". Generally, I am interested in automated reasoning and the computer-aided verification of mathematical proofs, software, and hardware (in no particular order). Apart from my PhD, I worked as a research associate at the University of Edinburgh and the University of Innsbruck, where I also worked on computational logic, theorem proving, and term rewriting. I have lived in Innsbruck, Austria for most of my life and obtained my undergraduate and Master's degrees at the University of Innsbruck.

Photo provided by speaker