As a wee experiment, I've produced a simple Haskell program that sends a prompt containing a lemma to prove in Agda to an LLM. The program checks the LLM-generated proof by invoking Agda, and if the proof contains an error, the error is fed back to the LLM, which is asked to try again. This process repeats until a proof passes, the user aborts, or the LLM hits a rate limit. The LLMs I tested were the basic (non-reasoning) models from Claude (e.g. Sonnet 3.7), Llama 3 and Llama 4 variants, a version of DeepSeek, a version of Qwen 32B, and the o3 reasoning model from ChatGPT (plus a few others). There is no human in the loop: the LLM has to produce a fully syntactically correct Agda program that type-checks without any interactive editing or "fix up". The goal here is not to produce a nice proof like one a human might write, but simply to see whether the LLM can find any proof at all that Agda accepts.
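For concreteness, here is a minimal sketch of that generate, check, and retry loop. It is in F# rather than the Haskell of the actual program, and `askLLM` and `checkWithAgda` are hypothetical stand-ins for the LLM API call and the invocation of the Agda type checker.

```fsharp
// Schematic sketch of the loop; askLLM and checkWithAgda are hypothetical helpers.

type CheckResult =
    | Accepted
    | Rejected of string   // Agda's error output

/// Hypothetical: send the conversation so far to an LLM and get back Agda source.
let askLLM (conversation: string list) : string =
    failwith "call the LLM API here"

/// Hypothetical: write the candidate to a file, run `agda` on it, capture the verdict.
let checkWithAgda (agdaSource: string) : CheckResult =
    failwith "invoke the agda executable here"

/// Keep asking until a candidate type-checks or we run out of attempts.
let rec prove (conversation: string list) (attemptsLeft: int) : string option =
    if attemptsLeft <= 0 then None
    else
        let candidate = askLLM conversation
        match checkWithAgda candidate with
        | Accepted -> Some candidate
        | Rejected err ->
            // Feed Agda's error output back to the LLM and try again.
            prove (conversation @ [ candidate; "Agda reported:\n" + err ]) (attemptsLeft - 1)
```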
This is a debugging challenge.
Read the document below, and identify the bug.

---

# The Interaction Calculus
The Interaction Calculus (IC) is a term rewriting system inspired by the Lambda
Calculus (λC), but with some major differences:

1. Vars are affine: they can only occur up to one time.
I am investigating how to use Bend (a parallel language) to accelerate Symbolic AI; in particular, Discrete Program Search. Basically, think of it as an alternative to LLMs, GPTs, and NNs that is also capable of generating code, but by entirely different means. This kind of approach was never scaled with mass compute before - it wasn't possible! - but Bend changes this. So, my idea was to do it, and see where it goes.

Now, while I was implementing some candidate algorithms in Bend, I realized that, rather than mass parallelism, I could use an entirely different mechanism to speed things up: SUP Nodes. Basically, it is a feature that Bend inherited from its underlying model ("Interaction Combinators") that, in simple terms, allows us to combine multiple functions into a single superposed one, and apply them all to an argument "at the same time". In short, it allows us to call N functions at a fraction of the expected cost. Or, put differently: why parallelize when we can share?
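As a rough illustration of the cost being avoided (a minimal sketch in F#, not Bend, and not the actual SUP-node mechanism, which lives inside Bend's runtime): calling N candidate functions on the same argument the obvious way repeats whatever work the candidates have in common, while sharing evaluates that common work once.

```fsharp
// Naive baseline: apply every candidate function to the same argument, one by one.
// If the candidates share an expensive common step, that step is recomputed N times.
let expensiveCommonStep (x: int) =
    List.sum [ 1 .. x ]   // stand-in for work every candidate repeats

let candidates : (int -> int) list =
    [ (fun x -> expensiveCommonStep x + 1)
      (fun x -> expensiveCommonStep x * 2)
      (fun x -> expensiveCommonStep x - 3) ]

// N separate calls: the common step runs three times.
let naive x = candidates |> List.map (fun f -> f x)

// Hand-factored sharing: the common step runs once for all three results.
let shared x =
    let common = expensiveCommonStep x
    [ common + 1; common * 2; common - 3 ]
```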
This is the predecessor of Mazeppa.
Supercompilation is a deep program transformation technique due to V. F. Turchin, a prominent computer scientist, cybernetician, physicist, and Soviet dissident. He described the concept as follows [^supercompiler-concept]:
> A supercompiler is a program transformer of a certain type. The usual way of thinking about program transformation is in terms of some set of rules which preserve the functional meaning of the program, and a step-by-step application of these rules to the initial program. ... The concept of a supercompiler is a product of cybernetic thinking. A program is seen as a machine. To make sense of it, one must observe its operation. So a supercompiler does not transform the program by steps; it controls and observes (SUPERvises) the running of the machine that is represented by the program.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows;
using System.Windows.Media;
using System.Windows.Media.Animation;

namespace WPFAnimations
mkdir Kata
cd Kata
dotnet new sln
mkdir App
cd App
dotnet new classlib -lang F#
type Term =
    | Var of int
    | Pair of Term * Term
    | Int of int

// Partial active pattern: look up key k in the substitution map s.
let (|Find|_|) s k = Map.tryFind k s

// Follow variable bindings through the substitution until we reach
// an unbound variable or a non-variable term.
let rec walk (s: Map<_,_>) = function
    | Var(Find s t) -> walk s t
    | t -> t
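As a quick illustration of `walk` (a minimal sketch assuming the definitions above, with a made-up substitution):

```fsharp
// Made-up substitution: variable 0 is bound to Var 1, and variable 1 to Int 42.
let subst = Map.ofList [ 0, Var 1; 1, Int 42 ]

printfn "%A" (walk subst (Var 0))                // Int 42  (follows 0 -> 1 -> 42)
printfn "%A" (walk subst (Var 7))                // Var 7   (unbound, returned as-is)
printfn "%A" (walk subst (Pair (Var 0, Int 5)))  // Pair (Var 0, Int 5): walk does not descend into pairs
```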
open System
open System.IO

type State<'s, 'a> = State of ('s -> ('a * 's))

module State =
    let inline run state x = let (State(f)) = x in f state
    let get = State(fun s -> s, s)
    let put newState = State(fun _ -> (), newState)
    let map f s = State(fun (state: 's) ->
        let result, nextState = run state s
        f result, nextState)
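A minimal usage sketch, assuming the `State` module above, threading an `int` state through `get`, `map`, and `put`:

```fsharp
// Read the current state, double it as the result, and leave the state unchanged.
let doubled : State<int, int> = State.map (fun n -> n * 2) State.get

let result, finalState = State.run 10 doubled
// result = 20, finalState = 10

// put discards the old state and installs a new one, yielding unit.
let _, replaced = State.run 10 (State.put 99)
// replaced = 99
```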
/**
 * The <code>StreamTokenizer</code> class takes an input stream and
 * parses it into "tokens", allowing the tokens to be
 * read one at a time. The parsing process is controlled by a table
 * and a number of flags that can be set to various states. The
 * stream tokenizer can recognize identifiers, numbers, quoted
 * strings, and various comment styles.
 * <p>
 * Each byte read from the input stream is regarded as a character
 * in the range <code>'\u0000'</code> through <code>'\u00FF'</code>.
open System

type Cont<'T> =
    abstract Call<'R> : ('T -> 'R) * (exn -> 'R) -> 'R

// Run f on x; hand the result to cont on success, or the raised exception to econt.
let protect f x cont econt =
    let res = try Choice1Of2 (f x) with err -> Choice2Of2 err
    match res with
    | Choice1Of2 v -> cont v
    | Choice2Of2 v -> econt v
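A minimal usage sketch of `protect`, assuming the definitions above: a successful call hands its value to the success continuation, while a throwing call routes the exception to the error continuation.

```fsharp
let onSuccess (n: int) = printfn "parsed %d" n
let onError (ex: exn) = printfn "failed: %s" ex.Message

// The call succeeds, so the value flows to the success continuation.
protect (fun (s: string) -> Int32.Parse s) "42" onSuccess onError    // prints "parsed 42"

// The call throws, so the exception is caught and routed to the error continuation.
protect (fun (s: string) -> Int32.Parse s) "oops" onSuccess onError  // prints "failed: ..." with the FormatException message
```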