Graph Sequence Learning for Premise Selection

03/27/2023
by Edvard K. Holden, et al.

Premise selection is crucial for large-theory reasoning, as the sheer size of the problems quickly leads to resource starvation. This paper proposes a premise selection approach inspired by the domain of image captioning, where language models automatically generate a suitable caption for a given image. Likewise, we attempt to generate the sequence of axioms required to construct the proof of a given problem. This is achieved by combining a pre-trained graph neural network with a language model. We evaluated different configurations of our method and observed a 17.7% improvement.
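The abstract describes an encoder-decoder pipeline analogous to image captioning: a graph neural network encodes the problem's formula graph into a fixed-size embedding, which conditions a sequence model that emits the required axioms one token at a time. The sketch below illustrates that general idea only; it is not the authors' implementation, and the mean-aggregation GNN, GRU decoder, layer sizes, and toy vocabulary are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of a graph-to-sequence premise selector:
# a graph encoder summarises the problem graph, and a recurrent decoder generates
# axiom tokens conditioned on that summary, as in image captioning.
import torch
import torch.nn as nn


class GraphEncoder(nn.Module):
    """Tiny message-passing encoder: each node mixes its features with the mean
    of its neighbours' features, then the graph is mean-pooled to one vector."""

    def __init__(self, node_dim: int, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(2 * node_dim, hidden_dim)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_nodes, node_dim); adj: (num_nodes, num_nodes) 0/1 matrix
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ node_feats / deg                      # mean over neighbours
        h = torch.relu(self.proj(torch.cat([node_feats, neigh], dim=-1)))
        return h.mean(dim=0)                                # graph embedding: (hidden_dim,)


class AxiomDecoder(nn.Module):
    """GRU decoder conditioned on the graph embedding; predicts the next axiom
    token at every step of the output sequence."""

    def __init__(self, vocab_size: int, hidden_dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, graph_emb: torch.Tensor, axiom_tokens: torch.Tensor) -> torch.Tensor:
        # graph_emb: (hidden_dim,); axiom_tokens: (batch, seq_len) of token ids
        h0 = graph_emb.expand(axiom_tokens.size(0), -1).unsqueeze(0).contiguous()
        x = self.embed(axiom_tokens)
        out, _ = self.gru(x, h0)
        return self.out(out)                                # logits: (batch, seq_len, vocab)


if __name__ == "__main__":
    # Toy problem graph with 5 nodes and a 3-token target axiom sequence (hypothetical data).
    node_feats = torch.randn(5, 16)
    adj = (torch.rand(5, 5) > 0.5).float()
    encoder, decoder = GraphEncoder(16, 32), AxiomDecoder(vocab_size=100, hidden_dim=32)
    logits = decoder(encoder(node_feats, adj), torch.tensor([[1, 7, 42]]))
    print(logits.shape)  # torch.Size([1, 3, 100])
```

In such a setup the encoder could be pre-trained separately (as the abstract suggests for the graph neural network) and then frozen or fine-tuned while the decoder learns to produce axiom sequences.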
