- Definition
- $\alpha$-, $\eta$-conversions and $\beta$-reduction
- Alligators
- Turing-completeness of the $\lambda$-calculus
- Curry and Turing combinators
- Curry-Howard correspondence
Let $\mathcal{V} \stackrel{\mathrm{def}}{=} \lbrace x, y, z, \ldots \rbrace$ be a fixed set of variables
- $\Lambda$, the set of $\lambda$-expressions:
$$\Lambda \;::=\; V \;\mid\; \Lambda\;\Lambda \;\mid\; \lambda V \cdot \Lambda$$
i.e. the smallest set of expressions such that
- $\mathcal{V} \subseteq \Lambda$
- Application: $\forall u, v \in \Lambda, \; \underbrace{u\,v}_{\text{interpreted as } u(v)} \in \Lambda$
- $\lambda$-abstraction: $\forall x \in \mathcal{V},\, u \in \Lambda, \; \underbrace{\lambda x.\, u}_{\text{interpreted as } x \mapsto u} \in \Lambda$
NB: Such a set exists: apply the Knaster-Tarski theorem to the following monotone function on the complete lattice $\mathcal{P}(A^\ast)$, where $A \stackrel{\mathrm{def}}{=} \lbrace \lambda, \cdot \rbrace \cup \mathcal{V}$
$$\begin{cases} \mathcal{P}(A^\ast) \longrightarrow \mathcal{P}(A^\ast) \\ E \longmapsto \lbrace u\,v \rbrace_{u,v\in E} \cup \lbrace \lambda x.\, u \rbrace_{\substack{x\in\mathcal{V} \\ u\in E}} \cup \mathcal{V}\end{cases}$$
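For concreteness, this inductive definition can be transcribed as a Haskell datatype; the following is only a minimal sketch (the names `Term`, `Var`, `App` and `Lam` are illustrative, not taken from the above):

```haskell
-- Sketch of the grammar Λ ::= V | Λ Λ | λV·Λ as an abstract syntax tree.
type Name = String

data Term
  = Var Name        -- a variable x ∈ 𝒱
  | App Term Term   -- application  u v
  | Lam Name Term   -- abstraction  λx. u
  deriving (Show, Eq)

-- Examples: the identity λx. x and the self-application λx. x x
identityTerm, selfApp :: Term
identityTerm = Lam "x" (Var "x")
selfApp      = Lam "x" (App (Var "x") (Var "x"))
```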
- $\alpha$-conversion:
$$\lambda x. u= _\alpha \lambda y. (\underbrace{u[x :=y]}_{\rlap{\text{all instances of $x$ in $u$ are replaced by $y$}}})$$
provided that $x=y$, or $x, y \notin \mathrm{bv}(u)$ and $y \notin \mathrm{fv}(u)$, as illustrated by these counter-examples:
- $(\lambda x\cdot y)[x:=u\,v] \;\neq_\alpha\; \lambda uv\cdot y$
- $(\lambda x\cdot x)[x:=y] \;\neq_\alpha\; \lambda x\cdot y$
- $(\lambda y\cdot x)[x:=y] \;\neq_\alpha\; \lambda y\cdot y$
- $(\lambda x\cdot y)[y:=x] \;\neq_\alpha\; \lambda x\cdot x$
- $\beta$-reduction:
$$( \lambda x . u ) v \to u [ x := v ]$$
- $\eta$-conversion:
$$\underbrace{\lambda x . (u\; x) =_\eta u}_{\rlap{\text{if $x$ does not appear free in $u$}}}$$
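For instance, a worked reduction combining these rules (with $a$, $b$ chosen as fresh variables for illustration):

$$(\lambda x.\, \lambda y.\, x)\; a\; b \;\to_\beta\; (\lambda y.\, a)\; b \;\to_\beta\; a \qquad\qquad \lambda x.\, (f\; x) \;=_\eta\; f \quad (x \notin \mathrm{fv}(f))$$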
- Church Numerals:
$$\forall n\in\mathbb{N}, \; \lceil n \rceil \; \stackrel{\mathrm{def}}{=} \; \lambda f\, x.\; f^n(x) \;\Longrightarrow\; \mathbb{N} \hookrightarrow \left\lbrace \lambda f\, x.\; f^n(x) \right\rbrace_{n\in\mathbb{N}}$$
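Concretely, the first Church numerals unfold to:

$$\lceil 0 \rceil = \lambda f\, x.\; x \qquad \lceil 1 \rceil = \lambda f\, x.\; f\, x \qquad \lceil 3 \rceil = \lambda f\, x.\; f\,(f\,(f\, x))$$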
- Constant function:
$$\forall n, k\in\mathbb{N}, \; f(x_1,\ldots,x_k) = n \\ \equiv\; \lambda x_1 ,\ldots, x_k.\; \lceil n \rceil$$
- Successor function $S$:
$$S(n) \stackrel{\mathrm{def}}{=} n + 1 \\ \equiv\; \lambda n\, f\, x.\; f(n\, f\, x)$$
- Projection function:
$$P_i^k(x_1,\ldots,x_k) \stackrel{\mathrm{def}}{=} x_i \\ \equiv\; \lambda x_1 ,\ldots, x_k. \, x_i$$
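As a sanity check, the numerals and the successor can be mimicked with plain Haskell functions; this is only a sketch (the names `church`, `suc`, `unchurch` are mine), using ordinary polymorphic functions rather than genuine λ-terms:

```haskell
-- Church numeral ⌈n⌉ as an iterator: church n f x applies f to x exactly n times.
church :: Int -> (a -> a) -> a -> a
church 0 _ x = x
church n f x = f (church (n - 1) f x)

-- Successor: λn f x. f (n f x)
suc :: ((a -> a) -> a -> a) -> (a -> a) -> a -> a
suc n f x = f (n f x)

-- Decode back to an ordinary Int, to check the encoding.
unchurch :: ((Int -> Int) -> Int -> Int) -> Int
unchurch n = n (+ 1) 0

-- unchurch (suc (church 3)) == 4
```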
Operators:
- Composition operator: $\lambda$-abstraction
Now, what about:
- the primitive recursion operator $\rho$:
$$\begin{align} \rho(g, h) &\stackrel{\mathrm{def}}{=} f \quad\text{where}\\ f(0,x_1,\ldots,x_k) &= g(x_1,\ldots,x_k) \\ f(y+1,x_1,\ldots,x_k) &= h(y,f(y,x_1,\ldots,x_k),x_1,\ldots,x_k)\,.\end{align}$$
- the minimisation operator $\mu$:
$$\begin{align} \mu(f)(x_1, \ldots, x_k) = z \stackrel{\mathrm{def}}{\iff}\ f(z, x_1, \ldots, x_k)&=0\quad \text{and}\\ f(i, x_1, \ldots, x_k)&>0 \quad \text{for}\quad i=0, \ldots, z-1.\end{align}$$
?
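Before answering with λ-terms, here is a hedged Haskell reading of these two operators acting on ordinary integer functions (the names `rho`, `mu` and the list-of-arguments representation are illustrative choices, not part of the λ-calculus encoding):

```haskell
-- Primitive recursion ρ(g, h), assuming y ≥ 0:
--   f 0     xs = g xs
--   f (y+1) xs = h y (f y xs) xs
rho :: ([Integer] -> Integer)
    -> (Integer -> Integer -> [Integer] -> Integer)
    -> Integer -> [Integer] -> Integer
rho g _ 0 xs = g xs
rho g h y xs = h (y - 1) (rho g h (y - 1) xs) xs

-- Minimisation μ(f): the least z with f z xs = 0 (loops forever if none exists).
mu :: (Integer -> [Integer] -> Integer) -> [Integer] -> Integer
mu f xs = head [z | z <- [0 ..], f z xs == 0]

-- Example: addition, defined by primitive recursion on its first argument.
add :: Integer -> [Integer] -> Integer
add = rho (\(x : _) -> x) (\_ r _ -> r + 1)
-- add 2 [5] == 7
```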
If
$$\mathbf{TRUE} \; \stackrel{\mathrm{def}}{=} \; \lambda x.\lambda y.\, x \\ \mathbf{FALSE} \; \stackrel{\mathrm{def}}{=} \; \lambda x.\lambda y.\, y$$
then:
$$\mathbf{AND} \; \stackrel{\mathrm{def}}{=} \; \lambda p.\lambda q.\, p \; q \; p \\ \mathbf{OR} \; \stackrel{\mathrm{def}}{=} \; \lambda p.\lambda q.\, p \; p \; q \\ \mathbf{NOT} \; \stackrel{\mathrm{def}}{=} \; \lambda p.\, p \; \mathbf{FALSE} \; \mathbf{TRUE} \\ \mathbf{IFTHENELSE} \; \stackrel{\mathrm{def}}{=} \; \lambda p.\lambda a.\lambda b.\, p \; a \; b \\ \mathbf{ISZERO} \; \stackrel{\mathrm{def}}{=} \; \lambda n. \; n \; (\lambda x.\, \mathbf{FALSE}) \; \mathbf{TRUE}$$
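One can check these encodings by β-reduction, for example:

$$\mathbf{AND}\;\mathbf{TRUE}\;\mathbf{FALSE} \;\to_\beta\; \mathbf{TRUE}\;\mathbf{FALSE}\;\mathbf{TRUE} \;=\; (\lambda x.\lambda y.\, x)\;\mathbf{FALSE}\;\mathbf{TRUE} \;\to_\beta\; \mathbf{FALSE}$$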
But… what about recursion?
Astounding fact: each $\lambda$-term actually has a fixed point!
and there even exist fixed-point combinators $y$:
$$\displaystyle y\ f = f\ (y\ f) \ \ \text{ for all } f$$
- The Curry combinator:
$$\displaystyle Y \; \stackrel{\mathrm{def}}{=} \; \lambda f.(\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x))$$
- The Turing combinator:
$$\displaystyle \Theta \; \stackrel{\mathrm{def}}{=} \; \big(\lambda f.\, \lambda g. \, g \, (f \, f \, g) \big)\, \big(\lambda f.\, \lambda g. \, g \, (f \, f \, g)\big)$$
$$\begin{align*} Y\ f & = (\lambda f'.(\lambda x.f'\ (x\ x))\ (\lambda x.f'\ (x\ x)))\ f \\ & = (\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x)) \\ & = f (\underbrace{(\lambda x.f\ (x\ x))\ (\lambda x.f\ (x\ x))}_{= Y\ f})\end{align*}$$
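The Turing combinator satisfies the same equation, and it even reduces to it directly by β-reduction. Writing $A \stackrel{\mathrm{def}}{=} \lambda x.\lambda g.\, g\,(x\,x\,g)$ (an $\alpha$-renaming of the factor above), so that $\Theta = A\,A$:

$$\Theta\; f \;=\; A\; A\; f \;\to_\beta\; \big(\lambda g.\, g\,(\underbrace{A\,A}_{=\,\Theta}\,g)\big)\; f \;\to_\beta\; f\,(\Theta\; f)$$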
$$\mathrm{fact} \; n = \begin{cases} 1 & \text{if } n=0 \\ n \times \mathrm{fact} \, (n-1) & \text{otherwise}\end{cases}$$
$$F \; \stackrel{\mathrm{def}}{=} \; \lambda f\, n.\; n \, \big(\lambda x. \; n \times f \,(\mathrm{Pred}\; n)\big) \, \lceil 1 \rceil \\ \mathrm{Succ} \; \stackrel{\mathrm{def}}{=} \; \lambda n\, f\, x.\; f(n\, f\, x) \\ \mathrm{Pred} \; \stackrel{\mathrm{def}}{=} \; \lambda n.\; \pi_1 \Big( n \, \big(\lambda c.\, \langle \pi_2\, c,\ \mathrm{Succ} \,(\pi_2\, c)\rangle\big) \, \langle 0,0\rangle \Big) \\ n \times m \; \stackrel{\mathrm{def}}{=} \; \lambda f.\; n\,(m\, f)$$
$$\begin{align*} \mathrm{fact} & = Y\,F \\ & = \big(\lambda f. (\lambda x. f\,(x\,x))\, (\lambda x. f\,(x\,x))\big) \; \bigg( \lambda f\, n.\; n \, \Big(\lambda x. \; n \times f \Big(\pi_1 \big( n \, (\lambda c.\, \langle \pi_2\, c,\ \mathrm{Succ}\,(\pi_2\, c)\rangle) \, \langle 0,0\rangle \big) \Big)\Big) \, \lceil 1 \rceil \bigg) \end{align*}$$
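For instance, unfolding on $\lceil 1 \rceil$ (using $Y\,F =_\beta F\,(Y\,F)$ and the definitions above):

$$\mathrm{fact}\;\lceil 1\rceil \;=_\beta\; F\,(Y\,F)\;\lceil 1\rceil \;=_\beta\; \lceil 1\rceil\,\Big(\lambda x.\; \lceil 1\rceil \times (Y\,F)\,(\mathrm{Pred}\;\lceil 1\rceil)\Big)\;\lceil 1\rceil \;=_\beta\; \lceil 1\rceil \times \mathrm{fact}\;\lceil 0\rceil \;=_\beta\; \lceil 1\rceil \times \lceil 1\rceil \;=_\beta\; \lceil 1\rceil$$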
It's great, but… you can't do that in Haskell (for example):
```haskell
y :: (a -> a) -> a
y = \f -> (\x -> f (x x)) (\x -> f (x x))
```

which the type checker rejects:

```
Occurs check: cannot construct the infinite type: r0 ~ r0 -> t
Expected type: r0 -> t
Actual type: (r0 -> t) -> t
```
Because Haskell (like other statically typed functional programming languages, such as OCaml) is strongly typed: the occurs check rules out the self-application $x\;x$.
- What is a type?
- A type is a property of a program.
Ex:
- the type of the $\lambda$-term $\lceil 1 \rceil$ can be thought of as "Integer": `int`.
- the program $\mathrm{fact}$ takes a natural number and produces another natural number: its type is $int \longrightarrow int$ (see the sketch below).
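In Haskell, for instance, that property is exactly what the type signature records (a sketch, with `factorial` as an illustrative name):

```haskell
-- The type Integer -> Integer is the "int ⟶ int" property of the program.
factorial :: Integer -> Integer
factorial 0 = 1
factorial n = n * factorial (n - 1)
```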
The CURRY-HOWARD CORRESPONDENCE:
Proofs can be represented as programs ($\lambda$-terms), and the formulas they prove are the types of those programs.
| Intuitionistic implicational natural deduction | Lambda calculus type assignment rules |
|---|---|
| $\displaystyle \frac{}{\Gamma_1, \alpha, \Gamma_2 \vdash \alpha} \;\text{Ax}$ | $\displaystyle \frac{}{\Gamma_1, x:\alpha, \Gamma_2 \vdash x:\alpha}$ |
| $\displaystyle \frac{\Gamma, \alpha \vdash \beta}{\Gamma \vdash \alpha \rightarrow \beta} \;\rightarrow I$ | $\displaystyle \frac{\Gamma, x:\alpha \vdash t:\beta}{\Gamma \vdash \lambda x.t : \alpha \rightarrow \beta}$ |
| $\displaystyle \frac{\Gamma \vdash \alpha \rightarrow \beta \qquad \Gamma \vdash \alpha}{\Gamma \vdash \beta} \;\rightarrow E$ | $\displaystyle \frac{\Gamma \vdash t:\alpha \rightarrow \beta \qquad \Gamma \vdash u:\alpha}{\Gamma \vdash t\;u:\beta}$ |
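Concretely, under this correspondence a Haskell program of a given type is a proof of the corresponding implicational formula; a small sketch (the names `axK`, `axS`, `modusPonens` are mine):

```haskell
-- λx y. x proves α → (β → α): two uses of →I, then Ax.
axK :: a -> b -> a
axK x _ = x

-- λf g x. f x (g x) proves (α → β → γ) → (α → β) → (α → γ).
axS :: (a -> b -> c) -> (a -> b) -> a -> c
axS f g x = f x (g x)

-- Application is the →E rule: from α → β and α, conclude β.
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x
```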
From a logical standpoint, the $Y$ combinator corresponds to the Curry paradox.
Let's suppose that the sentence "If this sentence is true, then I am an alligator" is true: it must be shown that I am an alligator.
But since the sentence is true, modus ponens can be applied, which yields the conclusion; this proves the sentence, and applying modus ponens once more shows that I am, indeed, an alligator.