This is not a culinary but a logical exploration. There is continuing interest in the Curry-Howard correspondence, not only from logicians but also from practitioners of typeful programming; I link just to the last two discussions to appear on Reddit. As the Baader-Meinhof phenomenon would have it, I was reading the entry about the development of intuitionistic logic in the Stanford Encyclopedia of Philosophy when it occurred to me that the different axiomatizations presented there must have immediate corresponding λ-terms.
The Curry-Howard correspondence states that, if intuitionistic implication is interpreted as the arrow of a typed λ-calculus, then propositions are represented by types, logical variables by polymorphic type variables, and each step in a logical deduction corresponds to a reduction step on λ-terms. A consequence of this is Wadler's Theorems for Free!, which basically states that, given a polymorphically typed term, its shape is so constrained by its type that it is, essentially, unique; thus it is not necessary to actually carry out the proof, or the reduction: checking that the types match suffices.
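As a minimal illustration of that freeness (my own example, not from the post): parametricity leaves essentially a single total, pure inhabitant for a type like α → α, namely the identity:
let only_inhabitant : 'a. 'a -> 'a = fun x -> x
(* with the explicitly polymorphic annotation, any other body, say x + 1,
   is rejected: the type alone pins the term down *)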
Not quite. Negation in IL is a tough concept to systematize, because the intuition (!) behind a proof is that it is a process of deriving the consequent from the given antecedents, and negation would seem to require showing that there is no way to "build" the thing being negated. This is codified in the usual way as ¬A ≡ A → ⊥, where the latter denotes "false", or "undefined", and is interpreted under the isomorphism as the empty type empty. This type is not usually present in Hindley-Milner languages like OCaml because it introduces more problems than it solves, so there is not really a direct way to "program" negation in OCaml. The indirect route is to use first-class continuations; these are present in Standard ML of New Jersey, but again not in OCaml. I had to fake it somehow.
Obviously, writing in continuation-passing style would make using call-with-current-continuation (or callcc, as it is affectionately called), which I could use to implement negation, trivial; the problem with that is that I would have to encode all my axioms in CPS. Leroy shows how a monad can make this painless, by structuring the CPS semantics inside one:
module Cont : sig
  type empty
  type 'a cont = 'a -> empty
  type 'a t = 'a cont cont
  val return : 'a -> 'a t
  val bind : 'a t -> ('a -> 'b t) -> 'b t
  val callcc : ('a cont -> 'a t) -> 'a t
  val throw : 'a cont -> 'a -> 'b t
  val run : 'a t -> 'a
end = (* …
The empty type I can fake using an abstract type; this is actually an existential type that makes the type α cont of continuations mean "a continuation is a function from things of arbitrary type to some type that I know but won't tell you what it is". The type of the monad, α t, is a continuation of continuations; this is forced by the fact that, under the CH correspondence, the intuitionistic axiom ¬¬A → A is precisely the translation to CPS. With that, return and bind have the expected types. These allow me to write callcc, which invokes a function with the current continuation, and throw, which aborts the current continuation with the one supplied to it. In order to "escape" the monad, run unpacks the computation and gives its final value, if at all possible.
The implementation closely follows Leroy's, except for run, which uses a reference to achieve fully polymorphic continuations. This uses Oleg Kiselyov's insight about delimited continuations, namely that mutation is implicitly a delimited continuation, which has a recursive type:
… *) struct
  type empty = unit
  type 'a cont = 'a -> empty
  type 'a t = 'a cont cont
  let return x = fun k -> k x
  let bind m f = fun k -> m (fun x -> f x k)
  let callcc f = fun k -> f k k
  let throw k x = fun _ -> k x
  let run m =
    let r = ref None in
    let () = m (fun v -> r := Some v) in
    match !r with Some v -> v | None -> failwith "throw"
end
Note that my empty is as simple as it could be short of being really the empty type. Then return and bind exactly implement the CPS transform; I use anonymous functions in the bodies to emphasize the continuation argument. This makes callcc's working transparent: it feeds the (now explicit) continuation to its argument; in the same way, it is obvious that throw works by discarding the current continuation and immediately invoking the one it was given.
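Here is a small usage sketch of my own (not in the original post) showing callcc and throw at work; throwing to the captured continuation discards the rest of the computation:
let forty_two : int =
  Cont.run (Cont.callcc (fun k ->
    Cont.bind (Cont.throw k 42) (fun _ -> Cont.return 0)))
(* forty_two is 42: the function passed to bind is never reached *)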
The case of run is subtle but not especially complicated: it feeds its CPS-transformed argument a continuation that stores the result in a reference; this reference must be initialized to something signifying "no value yet", and so it has type α option. When the continuation returns, a pattern match is used to recover the resulting value. If there isn't a resulting value, because the CPS function aborted to a top-level continuation instead of running through to the end, the effect is equivalent to an unhandled exception.
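A contrived but self-contained sketch of my own of that failure mode: the inner run never receives a value, because the computation escapes through the continuation captured by the outer callcc, so it raises Failure "throw":
let _ =
  try
    Cont.run (Cont.callcc (fun outer ->
      Cont.return (Cont.run (Cont.throw outer 1))))
  with Failure msg -> (print_endline ("inner run aborted: " ^ msg); 1)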
The first axiomatization given in the SEP is Kolmogorov's; I won't show it here because it is the minimal set of axioms required to show an embedding of classical logic in intuitionistic logic. The second axiomatization is Glivenko's:
- p → p
- (p → q) → ((q → r) → (p → r))
- (p ∧ q) → p
- (p ∧ q) → q
- (r → p) → ((r → q) → (r → (p ∧ q)))
- p → (p ∨ q)
- q → (p ∨ q)
- (p → r) → ((q → r) → ((p ∨ q) → r))
- (p → q) → ((p → ¬q) → ¬p)
According to the CH correspondence, the product type (that is, the type of pairs) encodes conjunction, and disjunction maps to a sum type, the Either type familiar from Haskell:
type ('a, 'b) either = Left of 'a | Right of 'b
The first five axioms are straightforward; they almost write themselves, even with names that are all but forced by their shapes:
module G = struct
  let id x = x
  let seq f g x = g (f x)
  let fst (x, _) = x
  let snd (_, y) = y
  let pair f g x = (f x, g x)
The following three are no more difficult:
  let left x = Left x
  let right x = Right x
  let cond f g = function Left x -> f x | Right y -> g y
The last term completes the axiomatization with a sort of rule of contradiction: if both a consequent and its negation can be proved from the same antecedent, the antecedent is false. The problem in encoding this term is that there isn't an obvious way to magic in the particular proposition involved. The idea is to invoke the continuation of the term, hoping that it somehow provides the required value, and apply it to the first function. If it instead raises an exception, the value is fed to the second function, which acts as a kind of handler:
  let catch handle raise =
    Cont.run (Cont.callcc (fun k ->
      let v = Cont.run k in
      Cont.throw (raise v) (handle v)))
end
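Before looking at the signature, a quick toplevel check of my own (not in the post) that pair really is the "fan-out" of two proofs and cond is case analysis on the sum:
let _ = G.pair string_of_int succ 3                      (* ("3", 4) *)
let _ = G.cond string_of_int (fun s -> s) (Right "ok")   (* "ok" *)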
The module signature encodes the axioms:
module G : sig
  val id : 'a -> 'a
  val seq : ('a -> 'b) -> ('b -> 'c) -> 'a -> 'c
  val fst : 'a * 'b -> 'a
  val snd : 'a * 'b -> 'b
  val pair : ('a -> 'b) -> ('a -> 'c) -> 'a -> 'b * 'c
  val left : 'a -> ('a, 'b) either
  val right : 'a -> ('b, 'a) either
  val cond : ('a -> 'b) -> ('c -> 'b) -> ('a, 'c) either -> 'b
  val catch : ('a -> 'b) -> ('a -> 'b Cont.cont) -> 'a Cont.cont
end
where pairs must be read as conjunctions, and occurrences of the either type denote disjunctions. The third axiomatization given is Heyting's:
- p → (p ∧ p)
- (p ∧ q) → (q ∧ p)
- (p → q) → ((p ∧ r) → (q ∧ r))
- ((p → q) ∧ (q → r)) → (p → r)
- q → (p → q)
- (p ∧ (p → q)) → q
- p → (p ∨ q)
- (p ∨ q) → (q ∨ p)
- ((p → r) ∧ (q → r)) → ((p ∨ q) → r)
- ¬p → (p → q)
- ((p → q) ∧ (p → ¬q)) → ¬p
These axioms are both more basic and harder to work with; for instance, there is no axiom that directly allows proving a proposition from a conjunction (I give a small illustration of this after the signature below):
module H = struct
  let diag x = (x, x)
  let swap (x, y) = (y, x)
  let first f (x, y) = (f x, y)
  let seq (f, g) x = g (f x)
  let const x _ = x
  let apply (x, f) = f x
  let left x = Left x
  let exch = function Left x -> Right x | Right y -> Left y
  let cond (f, g) = function Left x -> f x | Right y -> g y
Now, the axioms dealing with negation can be resolved directly with the continuation monad, as I did before. Axiom number 10 has exactly the type of throw, except for the monadic type. It corresponds to the ex falso quodlibet principle (in Buenos Aires we have a saying that basically goes "if my granny had wheels, I would be a cart", which is as silly as it is true). Axiom number 11 is exactly as before:
  let throw k x = Cont.run (Cont.throw k x)
  let catch (raise, handle) =
    Cont.run (Cont.callcc (fun k ->
      let v = Cont.run k in
      Cont.throw (handle v) (raise v)))
end
Again, the module's signature matches the axioms:
module H : sig
  val diag : 'a -> 'a * 'a
  val swap : 'a * 'b -> 'b * 'a
  val first : ('a -> 'b) -> 'a * 'c -> 'b * 'c
  val seq : ('a -> 'b) * ('b -> 'c) -> 'a -> 'c
  val const : 'a -> 'b -> 'a
  val apply : 'a * ('a -> 'b) -> 'b
  val left : 'a -> ('a, 'b) either
  val exch : ('a, 'b) either -> ('b, 'a) either
  val cond : ('a -> 'b) * ('c -> 'b) -> ('a, 'c) either -> 'b
  val throw : 'a Cont.cont -> 'a -> 'b
  val catch : ('a -> 'b) * ('a -> 'b Cont.cont) -> 'a Cont.cont
end
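To illustrate the point made above about projections: fst is not among Heyting's axioms, so it has to be derived. A derivation of my own (not in the post), constant-ing out the second component and then applying:
let fst_thm p = H.apply (H.swap (H.first H.const p))
(* val fst_thm : 'a * 'b -> 'a
   fst_thm (x, y) = apply (y, const x) = const x y = x *)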
Now, strictly speaking, both axiomatizations ought to be equivalent; in other words, each one's axioms are theorems in the other; and so I could very well expect to find expressions for H's functions solely in terms of G's. However, things are not that easy. By the CH correspondence, application of λ-terms corresponds to modus ponens; however, a proof is free, in order to complete a given step, to recall any axiom or prior logical derivation whatsoever; in effect, it requires binding as a metalogical operation. Should I insist on equating proofs with combinators, I must introduce one or more structural axioms that allow me to simulate abstraction and application; in effect, I'd be using the SK calculus. An intermediate possibility would be to allow naming the arguments but restrict myself to composition and application of the base system's functions to construct the corresponding functions of the derived system:
module H' = struct
  let diag x = G.pair G.id G.id x
  let swap x = G.pair G.snd G.fst x
  let first f = G.pair (G.seq G.fst f) G.snd
  let seq p = G.seq (G.fst p) (G.snd p)
  let const x _ = x
  let apply p = (G.snd p) (G.fst p)
  let left = G.left
  let exch d = G.cond G.right G.left d
  let cond p = G.cond (G.fst p) (G.snd p)
  let throw k x = Cont.run (Cont.throw k x)
  let catch p = G.catch (G.fst p) (G.snd p)
end
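As a sanity check of my own (not in the original), the derived theorems can be ascribed the signature of Heyting's axioms, so the types do come out right:
module H'_check : module type of H = H'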
Note that diag, swap, left and exch are combinators; the explicit argument (where there is one) is forced by OCaml's value restriction. Note also that const in Heyting's formulation is purely structural and identical to the K combinator.
With respect to seq, apply, cond and catch, they name their argument only to extract both components of the pair, in effect uncurrying the corresponding axioms in Glivenko's formulation. With an explicit metalogic axiom (a → (b → c)) → ((a ∧ b) → c):
let uncurry f (x, y) = f x y
more theorems are actually combinators:
module H'' = struct
  let diag x = G.pair G.id G.id x
  let swap x = G.pair G.snd G.fst x
  let second f = G.seq (G.seq G.snd) (G.pair G.fst) f
  let first' f = G.seq (G.seq G.fst) (G.pair G.snd) f
  let first f = G.seq (G.seq G.fst) (fun h -> G.pair h G.snd) f
  let seq p = uncurry G.seq p
  let const x _ = x
  let apply p = G.seq swap (uncurry G.id) p
  let left = G.left
  let exch d = G.cond G.right G.left d
  let cond p = uncurry G.cond p
  let throw k x = Cont.run (Cont.throw k x)
  let catch p = uncurry G.catch p
end
The importance of theorems being implemented as combinators is that the proofs are structurally very simple, with the outermost redex the only choice point at which to apply modus ponens. As is evident, by avoiding naming the arguments the only option left for pair is to fix its first argument, making the point-free expression of second easy but that of first impossible under this restriction; the best I could manage was first', where the resulting tuple is in reverse order. Also, the theorem for apply shows that it is indeed uncurry under another guise; maybe this is a heuristic argument that points to the fact that uncurry could be expressed under the axioms G.
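A toplevel check of my own of the difference just described:
let _ = H''.first  string_of_int (3, "x")   (* ("3", "x") *)
let _ = H''.first' string_of_int (3, "x")   (* ("x", "3") *)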
Going the other way would need its counterpart, the axiom ((a ∧ b) → c) → (a → (b → c)), or currying:
let curry f x y = f (x, y)
In effect, this pair of functions witnesses the isomorphism α × β → γ ≅ α → β → γ, where I switch from the logical to the typed view to emphasize the fact that this is not just an equivalence (a fact easily proved in classical logic with the identity a ⇒ b ≡ ¬a ∨ b and de Morgan's Law).
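A pointwise illustration of my own of the isomorphism, checking that the two functions are mutually inverse:
let _ = assert (curry (uncurry ( + )) 2 3 = 2 + 3)
let _ = assert (uncurry (curry (fun (x, y) -> x * y)) (2, 3) = 2 * 3)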
module G' = struct
  let id x = H.seq (H.diag, uncurry H.const) x
  let seq f = curry H.seq f
  let fst p = uncurry H.const p
  let snd p = H.seq (H.swap, uncurry H.const) p
  (* duplicate the antecedent, then act on each component *)
  let pair f g = H.seq (H.diag, H.seq (H.first f, H''.second g))
  let left = H.left
  let right x = H.seq (H.left, H.exch) x
  let cond f = curry H.cond f
  let catch f = curry H.catch f
end
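Again a sanity check of my own that the derived terms inhabit the types of Glivenko's axioms:
module G'_check : module type of G = G'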
I crucially needed both named capture and the auxiliary lemma H''.second in the definition of pair; I won't even try to define it in combinator form.
So, as you can see, both axiomatizations are, indeed, interexpressible; a purely logical proof would simply need to simulate reduction of the corresponding terms, although this is not necessary, since as a consequence of freeness it suffices to check that the types come out right.
As a final theorem for free, let me show you the term that proves the law of noncontradiction:
let fail x = G.catch G.fst G.snd x
(and indeed it does fail, as entering G.catch G.fst G.snd in the console shows) whose type (α × α cont) cont corresponds to the theorem ¬(a ∧ ¬a).
1 comment:
It took me quite a while to get it, but I think there is something more in that implementation of the continuation "run".
You describe (¬¬A → A) as an "intuitionistic axiom". The point is (or maybe there is a misunderstanding on my part here) that it is *not* an intuitionistic axiom: if you admit ¬¬A → A on top of intuitionistic logic, you get classical logic (because this axiom is equivalent to the law of excluded middle).
This means that OCaml, being impure, has the expressive power to implement "run" inside the language: we can express full classical logic using only well-typed, "nice" (no cheaty recursion or whatnot) OCaml terms; it must actually be enough to use only "pure" terms, plus CPS, plus the "run" operation.
But if the "run" operation can be implemented in bare OCaml, why do we need such mystery around the "empty" type, why make it abstract? This is probably because if the "user" (the one using the CPS library, who isn't supposed to know about the "empty" implementation) could create values of type "empty", he could prove false, and the whole thing would probably break (side question: where is the "false implies anything" axiom here?).
So we have two things: OCaml is expressive enough to express both intuitionistic and classical logic with well-typed terms, but we must still hide the implementation of the ¬¬A → A embedding. The other solution, "first-class continuation support in the compiler", also hides the run operation.
I have the feeling that there are deeper questions in play here: is there an analogy between "hiding implementation details" and some metamathematical operations on the logic side? Could other hiding/abstraction situations (or even the general principles of software engineering) be interpreted in that setting?