Lecture 6: Dependent type theory
Teacher: Gilles Dowek
Twenty years ago, dependent type theory was expected to replace simple type theory. This has not quite happened; today:
Dependent Type Theory:
- Coq
- Agda
- Lean
- Matita
Simple Type Theory:
- HOL Light (Hales’ theorem, a.k.a. the Kepler conjecture: stated in 1611, Hales’ 1998 proof later formally verified in HOL Light)
- Isabelle/HOL
- PVS (used by NASA)
The term $λx.x: ι → ι$ and the proof $λα. \, α: A ⇒ A$ are treated as completely unrelated objects, even though they look alike ⟶ Why not merge the two notions? (we would then have a single notion of substitution, type checking, etc.)
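In a system where the two are merged, the identification is literal. A minimal Lean 4 sketch (the names `idTerm` and `idProof` are mine): the term and the proof are the same λ-term.

```lean
-- the identity term on a type α (the analogue of λx.x : ι → ι)
def idTerm (α : Type) (x : α) : α := x

-- the identity proof of A ⇒ A: literally the same λ-term
theorem idProof (A : Prop) : A → A := fun a => a
```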
If everything is merged, some formulations of the axiom of choice even become theorems!
Dependently typed $λ$-calculus
Example: the types $array \; n$, where $n$ is the size of the array.

NB: In Pascal, arrays had their size in their type, but also their range of indices ($array \; 0 \; 99$: indices ranging from $0$ to $99$).
$\prod$ binds the variable $n$ in the target type (as in $\prod\limits_{n:nat} (array \; n)$).
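A sketch of how this looks in Lean 4 (one of the systems above): a hypothetical length-indexed `Vec` plays the role of $array$, and the type of `zeros` is a product binding $n$ in the target type.

```lean
-- "array n": a type depending on the term n : Nat (Vec is a name I chose)
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : {n : Nat} → α → Vec α n → Vec α (n + 1)

-- ∏ n:nat. (array n): the product binds n in the target type
def zeros : (n : Nat) → Vec Nat n
  | 0     => Vec.nil
  | n + 1 => Vec.cons 0 (zeros n)
```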
When the set of types is given by a context-free grammar, as in
\[A \; ≝ \; ι \; \mid \; o \; \mid \; A → A\]you don’t need typing rules (the grammar is sufficient), but here the grammar is not context-free ⟹ we need typing rules (as for term formation)
But types are now considered as terms, and all types have type $Type$
Judgments:
\[Γ ⊢ t:A \qquad\qquad Γ \text{ well-formed}\]
Typing rules of the $λ\Pi$-calculus (also called LF):
\[\cfrac{}{[] \text{ well-formed}}\] \[\cfrac{Γ \text{ well-formed}}{Γ, A: Type \text{ well-formed}}\] \[\cfrac{Γ \text{ well-formed}}{Γ ⊢ x: A} \quad x:A ∈ Γ\] \[\cfrac{Γ ⊢ A: Type \qquad Γ ⊢ B: Type}{Γ ⊢ A → B: Type}\]These definitions replace simple types. Ground types ($ι, o$) are variables of type $Type$.
ST $λ$-calculus rules
\[\cfrac{Γ ⊢ A: Type}{Γ, x:A \text{ well-formed}}\] \[\cfrac{Γ ⊢ A: Type \qquad Γ, x:A ⊢ B: Type \qquad Γ, x:A ⊢ t: B}{Γ ⊢ λx:A. t: A → B}\] \[\cfrac{Γ ⊢ t: A → B \qquad Γ ⊢ t': A}{Γ ⊢ (t \, t'): B}\]Example:
\[\infer[λ]{⊢ λx:nat. x: nat → nat}{ \infer[var]{x:nat ⊢ x:nat}{\phantom{x:nat ⊢ x:nat}} }\]now turned into
\[\infer{nat:Type ⊢ λx:nat.x: nat → nat}{ & nat:Type ⊢ nat:Type & \infer{nat: Type, x:nat ⊢ nat: Type}{ \infer{nat: Type, x:nat \text{ well-formed}}{\phantom{nat: Type, x:nat \text{ well-formed}}} } & \infer{nat: Type, x:nat ⊢ x:nat}{ \infer{nat: Type, x: nat \text{ well-formed}}{ \infer{nat: Type ⊢ nat: Type}{ \infer{nat: Type \text{ well-formed}}{ \infer{[] \text{ well-formed}}{\phantom{[] \text{ well-formed}}} } } } } }\]To come back to the previous example: array
has type $nat → Type$. So $nat → Type$ is a type, hence of type $Type$
Similarly, $array \; 0$ has type $Type$, so $Type$ itself must be a type, hence $Type$ would have to have type $Type$ ⟹ Girard’s paradox (based on Burali-Forti’s paradox)
⟶ New constant $Kind$, the type of $Type$, $nat → Type$, etc. Then, two solutions:
- either we stop there and $Kind$ has no type
- or $Kind$ has a type ⟶ universe hierarchy
Beware of naming conventions (other names for what are called $Type$ and $Kind$ here):

| $Type$ | $Kind$ |
|---|---|
| Prop | Type |
| Set | Type |
| $\ast$ | $\square$ |
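For instance, in Lean the hierarchy continues upward ($Type : Type\,1 : Type\,2 : …$), which is how $Type : Type$, and with it Girard’s paradox, is avoided:

```lean
#check Prop           -- Prop : Type
#check Type           -- Type : Type 1
#check (Nat → Type)   -- Nat → Type : Type 1
```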
Warning!
- $λx:nat. array \; x$ is of type $nat → Type$
- $\prod\limits_{ x:nat } array \; x$ is of type $Type$
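The distinction in the two bullets above can be checked directly in Lean, with a hypothetical `array : Nat → Type` declared as a variable:

```lean
variable (array : Nat → Type)

#check fun n : Nat => array n   -- the family λx:nat. array x : Nat → Type
#check (n : Nat) → array n      -- the product ∏ x:nat. array x : Type
```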
So we can form products:
- from $Type$ to $Kind$
- from $Type$ to $Type$
but what about products from $Kind$ to $Kind$ and from $Kind$ to $Type$? ⟹ Calculus of Constructions
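For instance, the polymorphic identity quantifies over $Type$ itself, i.e. it uses a product whose domain lives in $Kind$; a quick Lean sketch (the name `polyId` is mine):

```lean
-- the polymorphic identity: a product ∏ α:Type. α → α,
-- whose domain Type lives one level up (in Kind)
def polyId (α : Type) (x : α) : α := x

#check polyId Nat 3   -- Nat
```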
$=$ is translated into a constant of type $nat → nat → Type$, so that $n = n$ is a $Type$, whose inhabitants are the proofs of $n=n$.
Similarly: $∀x \; ((P \, x) ⇒ (P \, x))$ is translated into $\prod\limits_{ x: nat } ((P \, x) → (P \, x))$, an inhabitant of which is $λx: nat. \, λy: P \, x. \, y$
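A quick Lean rendering of both points (the predicate `P` is a hypothetical parameter):

```lean
-- n = n is a Type whose inhabitants are proofs of n = n
example (n : Nat) : n = n := rfl

-- λx:nat. λy:P x. y inhabits ∏ x:nat ((P x) → (P x))
example (P : Nat → Prop) : ∀ x : Nat, P x → P x := fun _x y => y
```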
Logical Framework
- 1879: Frege’s Begriffsschrift, 1910: Whitehead and Russell’s Principia Mathematica, ZF set theory, Coquand’s CoC, etc… ⟶ theories defined from scratch (e.g. you can define natural numbers but not Peano arithmetic)
- 1928: Hilbert and Ackermann’s predicate logic / 1987: Harper, Honsell, and Plotkin’s $λ\Pi$-calculus (the Edinburgh Logical Framework): formalisms not defined from scratch ⟶ there is one framework, and theories/logics are defined inside this framework (e.g. you can define Peano arithmetic)
- $λ\Pi$ + inductive types ⟶ MLTT
- $λ\Pi$ + polymorphism ⟶ CoC
Example: A proof built by induction, applied to a particular number, amounts to a direct proof of the property for that number.
Example: slide 10 of p11.pdf
If you don’t have $N(x)$, you can axiomatize the induction scheme instead, with a recursor $Rec$ and two extra reduction rules: $Rec(c, π, π', 0) ⟶ π$ and $Rec(c, π, π', S(n)) ⟶ π' \, n \, Rec(c, π, π', n)$. The same statement then has several proofs, related by reduction:
\[Rec(c,π, π', S(S(0))): S(S(0)) ∈ c\]which reduces to:
\[Rec(c,π, π', S(0)): S(0) ∈ c\\ π' \, S(0) \, Rec(c,π, π', S(0)): S(S(0)) ∈ c\]which reduces to:
\[π: 0 ∈ c\\ π' \, 0 \, π: S(0) ∈ c\\ π' \, S(0) \, \underbrace{(π' \, 0 \, π)}_{= j (⟨1, σ''⟩)}: S(S(0)) ∈ c\]⟶ Gödel System T (Gödel (1958), Tait (1967))
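Under the assumption that $Rec$ corresponds to Lean’s built-in recursor `Nat.rec`, the proof above of $S(S(0)) ∈ c$ can be replayed as:

```lean
-- from π : 0 ∈ c and π' : ∀x (x ∈ c ⇒ S(x) ∈ c), a proof of S(S(0)) ∈ c
example (c : Nat → Prop) (π : c 0) (π' : ∀ x, c x → c (Nat.succ x)) :
    c (Nat.succ (Nat.succ 0)) :=
  Nat.rec (motive := c) π π' (Nat.succ (Nat.succ 0))
```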
In the ST $λ$-calculus with $0$ and successor, you can only define constant functions and functions that add a constant to one of their arguments.
Primitive recursive functions:
The initial functions:
- $λ x_1 ⋯ λ x_n. \, 0$: the constant $0$
- $λ x_1 ⋯ λ x_n. \, x_i$: adding the constant $0$ (a projection)
- $λ x. \, S(x)$: adding the constant $1$
- closed under composition
And if you add $Rec$ ⟶ you get all the primitive recursive functions (not Turing complete, but they all terminate), and even more.
\[Rec^{nat} (0, λx,y. SSy, SS0) \\ ⟶ (λx,y. SSy) \, (S0) \, Rec^{nat} (0, λx,y. SSy, S0) \\ ⟶ SS \, Rec^{nat} (0, λx,y. SSy, S0)\\ ⟶ SS \, ((λx,y. SSy) \, 0 \, Rec^{nat} (0, λx,y. SSy, 0))\\ ⟶ SS \, SS \, Rec^{nat} (0, λx,y. SSy, 0)\\ ⟶ SS \, SS \, 0\]But you don’t have all recursive functions: you can’t program an interpreter of Gödel System T.
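The same computation can be replayed with Lean’s recursor `Nat.rec` (the compiler does not execute recursors directly, so we use kernel reduction with `#reduce`):

```lean
-- Rec^nat(0, λx y. S (S y), S (S 0)); the kernel performs the steps shown above
#reduce Nat.rec (motive := fun _ => Nat) 0 (fun _ y => Nat.succ (Nat.succ y)) 2
-- reduces to 4, i.e. S (S (S (S 0)))
```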
Termination of Gödel System T ⟹ every proof in arithmetic has an irreducible form, which ends with an introduction rule ⟹ consistency of arithmetic
Tait introduced the sets $R_T$ to prove termination of Gödel System T (1967).
MLTT = $λ\Pi$ + explicit reduction rule + System T
- $refl$: introduction of equality
- $Leibniz$: elimination of equality
Add a new rewrite rule to eliminate the cut formed by $Leibniz$ applied to $refl$.
But with
\[x=y \;⟶\; ∀c \; (x ∈ c ⇒ y ∈ c)\]⟹ $β$-reduction does the job for us
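Both styles can be sketched in Lean: the primitive equality with `rfl` and `▸`, and a hypothetical impredicative `leib` (a name I chose) where the elimination really is just application, hence $β$-reduction:

```lean
-- MLTT-style: rfl introduces =, and ▸ (the Leibniz eliminator) eliminates it
example (c : Nat → Prop) (x y : Nat) (h : x = y) (p : c x) : c y := h ▸ p

-- impredicative encoding: x = y as ∀c (x ∈ c ⇒ y ∈ c)
def leib (x y : Nat) : Prop := ∀ c : Nat → Prop, c x → c y

example (x : Nat) : leib x x := fun _c p => p                   -- the "refl" proof
example (x y : Nat) (h : leib x y) (c : Nat → Prop) (p : c x) : c y := h c p
```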
Why does Martin-Löf want to avoid these rewrite rules and use an axiomatization instead?
\[N(x) \;⟶\; ∀c \; (…) \qquad\qquad x=y \;⟶\; ∀c \; (x ∈ c ⇒ y ∈ c)\]⟹ Because these definitions quantify over all classes $c$, and Martin-Löf (and the Swedish school) wants to avoid impredicativity at all costs
One example where you don’t have the rule
\[x ∈ c ⟶ x=0 ∨ ∃ y \; (x= S y)\]Then, from
\[π: 0 ∈ c\\ π': ∀ x \, (x ∈ c ⇒ S(x) ∈ c)\]we can build the proof $Rec(c, π, π', SS0)$.
If instead of $SS0$ we plug a variable $w$, the term $Rec(c, π, π', w)$ cannot be reduced: it is irreducible but doesn’t end with an introduction rule.
⟹ In MLTT, you don’t have the full witness property, but only for closed terms.
Lists:
\[Rec (P, b, g, cons(a,x)) ⟶ g \, a \, x \, Rec(P,b,g,x)\]
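Assuming the analogous Lean recursor `List.rec`, the cons case indeed receives the head, the tail, and the recursive result; for instance, computing a length:

```lean
-- Rec(P, b, g, l) for lists: g gets the head, the tail, and the recursive call
#reduce List.rec (motive := fun _ => Nat) 0 (fun _a _l ih => ih + 1) ([10, 20, 30] : List Nat)
-- reduces to 3
```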