$λ$-Calculus & Curry-Howard correspondence

Younesse Kaddar

I. $λ$-Calculus

  1. Definition
  2. $𝛼$, $πœ‚$-conversions and $𝛽$-reduction
  3. Alligators

II. Fixed-point combinator

  1. Turing-completeness of $λ$-calculus
  2. Curry and Turing combinators
  3. Curry-Howard correspondence

I.1 - $λ$-Calculus : definition

Let $𝒱 \; ≝ \; \; \lbrace x, y, z, \ldots \rbrace$ be a fixed set of variables

$𝛬$ expressions :

$$𝛬 \; ::= \; 𝒱 \;\mid\; 𝛬 \, 𝛬 \;\mid\; λ𝒱.\, 𝛬$$
the smallest set of expressions such that

  • $𝒱 \; ⊆ 𝛬$
  • Application : $∀u, v∈𝛬, \underbrace{uv}_{\text{interpreted as } u(v)} ∈ 𝛬$
  • $λ$-abstraction : $∀x∈𝒱, u∈𝛬, \underbrace{λx. u}_{\text{interpreted as } x \mapsto u} ∈ 𝛬$

NB : Such a set exists : apply Knaster-Tarski to the following monotonic function in the complete lattice $𝒫(𝛴^\ast)$, where $𝛴 \; ≝ \; \; \lbrace λ, . \rbrace ∪ 𝒱$

$$\begin{cases} 𝒫(𝛴^\ast) ⟶ 𝒫(𝛴^\ast) \\ 𝛬 \mapsto \lbrace uv \rbrace_{u,v∈𝛬} ∪ \lbrace λx. u \rbrace_{\substack{x∈𝒱 \\ u∈𝛬}} ∪ 𝒱\end{cases}$$
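To make the inductive definition concrete, the grammar of $𝛬$ can be mirrored by an algebraic datatype (a minimal sketch in Haskell ; the names Term, Var, App, Lam are mine) :

-- A term is a variable, an application u v, or an abstraction λx. u
data Term = Var String        -- x ∈ 𝒱
          | App Term Term     -- u v
          | Lam String Term   -- λx. u
  deriving Show

-- example : the identity λx. x
identity :: Term
identity = Lam "x" (Var "x")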

I.2 - $𝛼$ / $πœ‚$-conversions, $𝛽$-reduction

$𝛼$-conversion :
$$\lambda x.\, u =_\alpha \lambda y.\, (\underbrace{u[x := y]}_{\rlap{\text{all free occurrences of $x$ in $u$ are replaced by $y$}}})$$

/!\ $x=y$ OR $x, y ∉ bv(u)$ and $y ∉ fv(u)$, as illustrated by these examples :

$(λx·y)[x:=uv] ≠_𝛼 λuv·y$
$(λx·x)[x:=y] ≠_𝛼 λx·y$
$(λy·x)[x:=y] ≠_𝛼 λy·y$
$(λx·y)[y:=x] ≠_𝛼 λx·x$

$𝛽$-reduction :
$$( \lambda x . u ) v \to u [ x := v ]$$
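For instance, two successive $𝛽$-reduction steps :
$$(\lambda x.\, x\, x)\,(\lambda y.\, y) \;\to\; (\lambda y.\, y)\,(\lambda y.\, y) \;\to\; \lambda y.\, y$$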

$πœ‚$-conversion :
$$\underbrace{\lambda x . (u x) =_πœ‚ u}_{\rlap{\text{If $x$ does not appear free in $u$}}}$$

Or… in terms of alligators (an idea by Bret Victor)




[Figure : alligator family]

[Figure : eating process ≡ $𝛽$-reduction]

[Figure : $𝛽$-reduction, step by step]

[Figure : $𝛼$-conversion]

[Figure : old alligators ≡ parentheses]

II.1 - Turing-completeness of $λ$-calculus

Church Numerals :

$$∀n∈ℕ, \; \lceil n \rceil \; ≝ \; \; λ f x. f^n(x) \; ⟹ \; ℕ ≃ \left\lbrace λ f x. f^n(x) \right\rbrace_{n∈ℕ}$$
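For instance :
$$\lceil 0 \rceil \; = \; λ f\, x.\, x \qquad\qquad \lceil 2 \rceil \; = \; λ f\, x.\, f(f(x))$$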

  • Constant function :

    $$∀n, k∈ℕ, \; f(x_1,\ldots,x_k) = n \\≡ λ x_1 ,\ldots, x_k. \lceil n \rceil$$

  • Successor function S :

    $$S(x) \stackrel{\mathrm{def}}{=} x + 1 \\≡ λ n\, f\, x.\, f(n\, f\, x)$$

  • Projection function :

    $$P_i^k(x_1,\ldots,x_k) \stackrel{\mathrm{def}}{=} x_i \\≡ λ x_1 ,\ldots, x_k. \, x_i$$
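These numerals translate almost literally into Haskell (a sketch ; the names Church, zero, suc, toInt are mine) :

{-# LANGUAGE RankNTypes #-}

-- a Church numeral iterates its first argument n times over the second
type Church = forall a. (a -> a) -> a -> a

zero :: Church
zero = \_f x -> x

-- the successor S : λn f x. f (n f x)
suc :: Church -> Church
suc n = \f x -> f (n f x)

-- interpret a numeral back as an ordinary Int : toInt (suc (suc zero)) == 2
toInt :: Church -> Int
toInt n = n (+ 1) 0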

II.1 - Turing-completeness of $λ$-calculus

Operators:

  • Composition operator : immediate, via application and $λ$-abstraction

Now, what about :

  • the primitive recursion operator $\rho$ :

$$\begin{align} \rho(g, h) &\stackrel{\mathrm{def}}{=} f \quad\text{where}\\ f(0,x_1,\ldots,x_k) &= g(x_1,\ldots,x_k) \\ f(y+1,x_1,\ldots,x_k) &= h(y,f(y,x_1,\ldots,x_k),x_1,\ldots,x_k)\,.\end{align}$$

  • the minimisation operator $\mu$ :

$$\begin{align} \mu(f)(x_1, \ldots, x_k) = z \stackrel{\mathrm{def}}{\iff}\ f(z, x_1, \ldots, x_k)&=0\quad \text{and}\\ f(i, x_1, \ldots, x_k)&>0 \quad \text{for}\quad i=0, \ldots, z-1.\end{align}$$

?
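To recall what these two operators compute, here is an ordinary recursive rendering, with a single extra argument $x$ instead of $x_1,\ldots,x_k$ (a sketch in Haskell ; the names rho and mu are mine) :

-- primitive recursion : rho g h 0 x = g x, rho g h (y+1) x = h y (rho g h y x) x
rho :: (a -> b) -> (Integer -> b -> a -> b) -> Integer -> a -> b
rho g _ 0 x = g x
rho g h y x = h (y - 1) (rho g h (y - 1) x) x

-- minimisation : the least z such that f z x == 0
mu :: (Integer -> a -> Integer) -> a -> Integer
mu f x = head [z | z <- [0 ..], f z x == 0]

Both can also be expressed inside the $λ$-calculus itself, which is what the fixed-point combinators below make possible.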

II.1 - Turing-completeness of $λ$-calculus

If

$$\mathbf{TRUE} \; ≝ \; \; λx.λy.x \\\mathbf{FALSE} \; ≝ \; \; λx.λy.y$$

then :

$$\mathbf{AND} \; ≝ \; \; λp.λq.p \; q \; p \\\mathbf{OR} \; ≝ \; \; λp.λq.p \; p \; q \\\mathbf{NOT} \; ≝ \; \; λp.p \; \mathbf{FALSE} \; \mathbf{TRUE} \\\mathbf{IFTHENELSE} \; ≝ \; \; λp.λa.λb.p \; a \; b \\\mathbf{ISZERO} \; ≝ \; \; λn. \; n \; (λx.\, \mathbf{FALSE}) \; \mathbf{TRUE}$$
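As a sanity check, $\mathbf{AND} \; \mathbf{TRUE} \; \mathbf{FALSE}$ $𝛽$-reduces to $\mathbf{FALSE}$ :
$$\mathbf{AND} \; \mathbf{TRUE} \; \mathbf{FALSE} \;\to\; \mathbf{TRUE} \; \mathbf{FALSE} \; \mathbf{TRUE} \;\to\; (λy.\,\mathbf{FALSE}) \; \mathbf{TRUE} \;\to\; \mathbf{FALSE}$$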



But… what about recursion ?

II.2 - Fixed-point combinators

Astounding fact : each $λ$-term actually has a fixed point !

and there even exist fixed-point combinators $y$ :

$$\displaystyle y\ f = f\ (y\ f) \ \ \text{ for all } f$$

The Curry combinator :
$$\displaystyle Y \; ≝ \; \; \lambda f.(\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x))$$
The Turing combinator :
$$\displaystyle 𝛳 \; ≝ \; \; \big(λf.\, λg. \, g \, (f \, f \, g) \big) \; \big(λf.\, λg. \, g \, (f \, f \, g)\big)$$




$$\begin{align*} Y\ f & = (\lambda f'.(\lambda x.f'\ (x\ x))\ (\lambda x.f'\ (x\ x)))\ f \\ & = (\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x)) \\ & = f (\underbrace{(\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x))}_{= Y\ f})\end{align*}$$
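The Turing combinator satisfies the same equation using $𝛽$-reduction alone : writing $A \; ≝ \; λf. λg.\, g \, (f \, f \, g)$, so that $𝛳 = A\,A$,
$$𝛳\ f \;=\; A\ A\ f \;\to\; \big(λg.\, g\,(A\,A\,g)\big)\ f \;\to\; f\ (A\,A\,f) \;=\; f\ (𝛳\ f)$$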

$$fact \; n = \begin{cases} 1 \text{ if } n=0 \\ n × (fact \, (n-1)) \text{ else}\end{cases}$$


$$F \; ≝ \; \; λf\, n. \; n \, (λx. \, n × f (Pred \, n)) \, 1 \\Succ \; ≝ \; \; λn\, f\, x.\, f(n\, f\, x) \\Pred \; ≝ \; \; λn.\, π_1 \Big( n \, \big(λc.\, ⟨π_2\, c,\, Succ \, (π_2\, c)⟩\big) \, ⟨0,0⟩ \Big) \\n × m \; ≝ \; \; λf.\, n\, (m\, f)$$

$$\begin{align*} fact & = Y\,F \\ & = \Big(λf.\, (λx.\, f\,(x\,x))\, (λx.\, f\,(x\,x))\Big) \; \Bigg( λf\, n. \; n \, \Big(λx. \, n × f \big(π_1 ( n \, (λc.\, ⟨π_2\, c,\, Succ\,(π_2\, c)⟩) \, ⟨0,0⟩ )\big) \Big) \, 1 \Bigg) \end{align*}$$
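In Haskell, the same recipe reads as follows, with the self-application of $Y$ replaced by an explicitly recursive fixed-point operator (a sketch ; fix is also available as Data.Function.fix) :

-- fix f is a fixed point of f : fix f = f (fix f)
fix :: (a -> a) -> a
fix f = f (fix f)

-- F receives "the recursive call" as its first argument, as above
fact :: Integer -> Integer
fact = fix (\rec n -> if n == 0 then 1 else n * rec (n - 1))
-- fact 5 == 120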

II.3 - Curry-Howard isomorphism

It’s great, but… you can’t do that in Haskell (for example) :

y :: (a -> a) -> a
y = \f -> (\x -> f (x x)) (\x -> f (x x))

⇓

Occurs check: cannot construct the infinite type: r0 ~ r0 -> t
Expected type: r0 -> t
  Actual type: (r0 -> t) -> t

WHY?

Because Haskell (like other statically typed functional programming languages, such as OCaml) is strongly and statically typed : the self-application x x cannot be given a finite type.
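(As an aside, the usual workaround is to hide the self-application behind a recursive type, so that the occurs check no longer fires ; a sketch, where the names Rec, Roll, unroll are mine :)

-- a value of type Rec a wraps a function from Rec a to a
newtype Rec a = Roll { unroll :: Rec a -> a }

-- the Curry combinator again, with x x written as unroll x x
y :: (a -> a) -> a
y f = (\x -> f (unroll x x)) (Roll (\x -> f (unroll x x)))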


What is a type ? :
A type is a property of a program.

Ex :

  • the type of the $λ$-term $\lceil 1 \rceil$ can be thought of as "Integer" : int.
  • the program $fact$ takes a natural number and produces another natural number : its type is $int ⟶ int$

The CURRY-HOWARD CORRESPONDENCE :

Proofs can be represented as programs ($λ$-terms), and the formulas they prove are the types of these programs.

Intuitionistic implicational natural deduction rules, each followed by the corresponding lambda calculus type assignment rule :
$$\displaystyle \frac{}{\Gamma_1, \alpha, \Gamma_2 \vdash \alpha} \text{Ax}$$
$$\displaystyle \frac{}{\Gamma_1, x:\alpha, \Gamma_2 \vdash x:\alpha}$$
$$\displaystyle\frac{\Gamma, \alpha \vdash \beta}{\Gamma \vdash \alpha \rightarrow \beta} \rightarrow I$$
$$\displaystyle\frac{\Gamma, x:\alpha \vdash t:\beta}{\Gamma \vdash \lambda x.t : \alpha \rightarrow \beta}$$
$$\displaystyle\frac{\Gamma \vdash \alpha \rightarrow \beta \qquad \Gamma \vdash \alpha}{\Gamma \vdash \beta} \rightarrow E$$
$$\displaystyle\frac{\Gamma \vdash t:\alpha \rightarrow \beta \qquad \Gamma \vdash u:\alpha}{\Gamma \vdash t\;u:\beta}$$
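For instance, the term $\lambda x.\lambda y.\,x$ (that is, $\mathbf{TRUE}$) has type $\alpha \rightarrow \beta \rightarrow \alpha$ ; read through the correspondence, its typing derivation is exactly a natural deduction proof of the tautology $\alpha \rightarrow (\beta \rightarrow \alpha)$ :
$$\dfrac{\dfrac{\dfrac{}{x:\alpha,\; y:\beta \vdash x:\alpha}}{x:\alpha \vdash \lambda y.\,x : \beta \rightarrow \alpha}}{\vdash \lambda x.\lambda y.\,x : \alpha \rightarrow \beta \rightarrow \alpha}$$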

From a logical standpoint, the $Y$ combinator corresponds to the Curry paradox.




“If this sentence is true then I am an alligator” is true : why ?



Let’s suppose that the sentence is true : it must be shown that I am an alligator.

But since the sentence is true, modus ponens can be applied, which yields the conclusion.