Lecture 1: Hilbert Calculus, Natural Deduction, Sequent Calculus

Teacher: Michele Pagani

Linear Logic (LL)

Classical way to introduce it: the fact that the implication arrow $A ⇒ B$ can be decomposed as $!A ⊸ B$. Meaning of formulas: from truth to resources.

Semantic models: not covered in this course. This course: about the syntax.

References:

  • Roberto Di Cosmo’s lecture notes (with Michele Pagani: the first 4 chapters, approximately)
  • Girard, “Linear Logic”, Theoretical Computer Science, 1987
  • Wiki-LL (a Wikipedia-style wiki for linear logic, maintained by researchers at ENS Lyon)
  • Olivier Laurent’s lecture notes on proof nets (a formal way of representing LL proofs as graphs)

In this class: introduction, starting with cut elimination in classical logic.


At the beginning, there was the Hilbert calculus (a way to formalize proofs)

Syntax of (classical) formulas:
\[A, B \, ≝ \, \underbrace{X}_{\text{variable}} \,|\, ⊤ \,|\, ⊥ \,|\, ¬A \,|\, A ∧ B \,|\, A ∨ B \,|\, A ⇒ B\]

Hilbert Calculus (HC)

Axioms of Hilbert calculus:
  • (K) \(A ⇒ B ⇒ A\)
  • (S) \((A ⇒ B ⇒ C) ⇒ (A ⇒ B) ⇒ (A ⇒ C)\)
  • (P) (Peirce’s law) \(((A ⇒ B) ⇒ A) ⇒ A\)

Then: a bunch of axioms for disjunction, conjunction, etc.

Then, one inference rule: modus ponens (MP): from $A ⇒ B$ and $A$, we deduce $B$

Proof in Hilbert calculus:

sequence of formulas $A_1, …, A_n$ such that each formula $A_i$ is

  • either an axiom
  • or the conclusion of a MP with premises $A_j, A_{j’}$ for $j, j’ < i$

Coherence of a theory: $\not\vdash ⊥$. Hilbert wanted to show the coherence of mathematics.

NB: with (K) and (S) alone ⟶ the partial recursive functions (Curry’s combinatory logic).

With

  • K, S: completeness with respect to intuitionistic logic (LJ)
  • K, S, P: completeness with respect to classical logic (LK)

NB: Peirce’s law is used, for example, to type control operators (such as call/cc).
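As an aside (this Haskell sketch is ours, not part of the lecture), the Curry-Howard reading of (K) and (S) is immediate: they are the types of the K and S combinators, and the proof of $⊢ A ⇒ A$ below corresponds to the identity combinator $S\,K\,K$.

```haskell
-- The axioms (K) and (S) read as the types of the K and S combinators
-- (a sketch of the Curry-Howard view; the names are ours).
k :: a -> b -> a
k x _ = x

s :: (a -> b -> c) -> (a -> b) -> a -> c
s f g x = f x (g x)

-- The identity combinator I = S K K mirrors the Hilbert-calculus proof
-- of ⊢ A ⇒ A given just below (same instantiations, MP = application).
i :: a -> a
i = s k k
```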

Example in HC: Prove $⊢ A ⇒ A$

  • take S with $B ← (B ⇒ A)$ and $C ← A$
  • then K with $B ← (B ⇒ A)$
  • then MP ⟶ we obtain $(A ⇒ (B ⇒ A)) ⇒ (A ⇒ A)$
  • then K
  • and finally MP, which yields $A ⇒ A$
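Spelled out as a sequence of formulas (where $B$ can be any formula):

\[\begin{align*}
A_1 &≝ (A ⇒ (B ⇒ A) ⇒ A) ⇒ (A ⇒ (B ⇒ A)) ⇒ (A ⇒ A) &&\text{(S, } B ← (B ⇒ A),\, C ← A\text{)}\\
A_2 &≝ A ⇒ (B ⇒ A) ⇒ A &&\text{(K, } B ← (B ⇒ A)\text{)}\\
A_3 &≝ (A ⇒ (B ⇒ A)) ⇒ (A ⇒ A) &&\text{(MP on } A_1, A_2\text{)}\\
A_4 &≝ A ⇒ (B ⇒ A) &&\text{(K)}\\
A_5 &≝ A ⇒ A &&\text{(MP on } A_3, A_4\text{)}
\end{align*}\]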

Then, a few years later, in Gentzen’s PhD thesis:

Gentzen’s Natural Deduction

Notation: NJ is the intuitionistic version, NK the classical one (NK adds reductio ad absurdum on top of the other rules)

A proof:

is a tree whose leaves are the hypotheses $A_1, …, A_n$ and whose root (conclusion) is the proved formula $B$. As usual, deduction rules let you go from the children of a node (the premises) to the node itself.

Inference rules:

  • introduction of $∧$ and $⇒$
  • elimination (left and right) of $∧$ and elimination of $⇒$ (modus ponens)

NB: the rules come in symmetric introduction/elimination pairs (except for reductio ad absurdum), which is very convenient

Subformula property (SubFP):

If a formula $B$ is provable, there exists a proof such that all the formulas in this proof are subformulas of $B$

The elimination rules break the subformula property: e.g. in $∧E_1$, the premise $A ∧ B$ is not a subformula of the conclusion $A$.

SubFP: comes in handy to show coherence, as $⊥$ has no non-trivial subformula

Sequent Calculus (SC) for classical logic (LK)

Sequent:
\[\underbrace{A_1, …, A_n}_{\text{hypotheses}} ⊢ \underbrace{B_1, …, B_m}_{\text{theses}}\]

Notation: the classical version of SC is denoted by LK, and LJ is the intuitionistic one: in LJ, $m ≤ 1$

Interpretation of such a sequent:

\[A_1 ∧ ⋯ ∧ A_n ⟹ B_1 ∨ ⋯ ∨ B_m\]

Sequent calculus: we have sequents $Γ ⊢ Δ$

Rules

Identities

\[\cfrac{}{A ⊢ A}\text{ax}\]

And introduction rules for connectives on the left and on the right:

Conjunction

\[\cfrac{Γ, A, B ⊢ Δ}{Γ, A ∧ B ⊢ Δ}∧\text{L}\]

NB: this rule can be equivalently replaced by the following two rules (the equivalence uses the structural rules introduced below; see the derivations just after):

\[\cfrac{Γ, A ⊢ Δ}{Γ, A ∧ B ⊢ Δ}∧\text{L}_1 \qquad \cfrac{Γ, B ⊢ Δ}{Γ, A ∧ B ⊢ Δ}∧\text{L}_2\]
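To make the equivalence explicit (exchanges are left implicit): $∧\text{L}_1$ is derivable from $∧\text{L}$ via weakening, and conversely $∧\text{L}$ is derivable from $∧\text{L}_1, ∧\text{L}_2$ via contraction:

\[\cfrac{\cfrac{Γ, A ⊢ Δ}{Γ, A, B ⊢ Δ}\text{wL}}{Γ, A ∧ B ⊢ Δ}∧\text{L}
\qquad\qquad
\cfrac{\cfrac{\cfrac{Γ, A, B ⊢ Δ}{Γ, A ∧ B, B ⊢ Δ}∧\text{L}_1}{Γ, A ∧ B, A ∧ B ⊢ Δ}∧\text{L}_2}{Γ, A ∧ B ⊢ Δ}\text{cL}\]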

Additive version of $∧\text{R}$:

\[\cfrac{Γ ⊢ A, Δ \qquad Γ ⊢ B, Δ}{Γ ⊢ A ∧ B, Δ}∧\text{R (additive)}\]

Other version (multiplicative):

\[\cfrac{Γ' ⊢ A, Δ' \qquad Γ'' ⊢ B, Δ''}{Γ', Γ'' ⊢ A ∧ B, Δ',Δ''}∧\text{R (multiplicative)}\]

Disjunction

\[\cfrac{Γ ⊢ A, B, Δ}{Γ ⊢ A ∨ B, Δ}∨\text{R}\]

And similarly:

\[\cfrac{Γ, A ⊢ Δ \qquad Γ, B ⊢ Δ}{Γ, A ∨ B ⊢ Δ}∨\text{L (additive)}\] \[\cfrac{Γ', A ⊢ Δ' \qquad Γ'', B ⊢ Δ''}{Γ', Γ'', A ∨ B ⊢ Δ', Δ''}∨\text{L (multiplicative)}\]

Implication

Based on $A ⇒ B = ¬A ∨ B$

\[\cfrac{Γ ⊢ A, Δ \qquad Γ, B ⊢ Δ}{Γ, A ⇒ B ⊢ Δ}⇒\text{L (additive)}\] \[\cfrac{Γ' ⊢ A, Δ' \qquad Γ'', B ⊢ Δ''}{Γ', Γ'', A ⇒ B ⊢ Δ', Δ''}⇒\text{L (multiplicative)}\]

NB:

  • Via the Curry-Howard correspondence, ND is isomorphic to the simply-typed $λ$-calculus (see the sketch below).
  • In Sequent Calculus, the primitives are more “low-level” than in ND (let-like constructions, etc.)
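A quick illustration of this correspondence (a sketch in Haskell, standing in for the simply-typed $λ$-calculus; the names are ours): formulas are types, $∧$ is the product type, $⇒$ is the arrow type, and proofs are programs.

```haskell
-- Curry-Howard sketch: ND proofs as simply-typed λ-terms (here, Haskell terms).

-- A ∧ B ⇒ B ∧ A: use ∧E₂ and ∧E₁, then ∧I, packaged by ⇒I (λ-abstraction).
andComm :: (a, b) -> (b, a)
andComm p = (snd p, fst p)

-- (A ⇒ B ⇒ C) ⇒ (A ∧ B ⇒ C): ⇒E (modus ponens) is function application.
uncurryProof :: (a -> b -> c) -> ((a, b) -> c)
uncurryProof f (x, y) = f x y

-- ... and its converse (A ∧ B ⇒ C) ⇒ (A ⇒ B ⇒ C).
curryProof :: ((a, b) -> c) -> (a -> b -> c)
curryProof g x y = g (x, y)
```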

Th:

  • [Easy] If you have a proof of $Γ ⊢ A$ in SC, then there exists an ND proof of $A$ using the hypotheses $Γ$

NB: the SC rules above satisfy the SubF property (every formula in a premise is a subformula of a formula in the conclusion), contrary to the ND elimination rules. To prove completeness (the converse of the previous theorem), we need one SC rule that does not satisfy it:

Cut (use of lemmas in mathematics):

\(\cfrac{Γ ⊢ A, Δ \qquad Γ, A ⊢ Δ}{Γ ⊢ Δ}\text{ cut (additive)}\) \(\cfrac{Γ' ⊢ A, Δ' \qquad Γ'', A ⊢ Δ''}{Γ', Γ'' ⊢ Δ', Δ''}\text{ cut (multiplicative)}\)

NB: the cut is considered an identity rule (like the axiom). It can be obtained by applying $⇒$L to the tautology $A ⇒ A$ (together with weakening).

Structural Rules

\[\cfrac{Γ, A, B, Γ' ⊢ Δ}{Γ, B, A, Γ' ⊢ Δ}\text{ exchange L}\] \[\cfrac{Γ ⊢ Δ, A, B, Δ'}{Γ ⊢ Δ, B, A, Δ'}\text{ exchange R}\]

NB:

  • there are weaker logics where the exchange rules no longer hold (non-commutative LL, etc.)
  • sequents can be defined over multisets (not sets: with sets, contraction would be built in, and contraction does not hold in LL) to avoid mentioning the exchange rules explicitly

And then the structural rules that no longer hold in LL:

\[\cfrac{Γ ⊢ Δ}{Γ, A ⊢ Δ}\text{ weakening L}\] \[\cfrac{Γ ⊢ Δ}{Γ ⊢ Δ, A}\text{ weakening R}\] \[\cfrac{Γ, A, A ⊢ Δ}{Γ, A ⊢ Δ}\text{ contraction L}\] \[\cfrac{Γ ⊢ A, A, Δ}{Γ ⊢ A, Δ}\text{ contraction R}\]
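NB: with weakening and contraction available, the additive and multiplicative versions of the two-premise rules are interderivable; for instance for $∧\text{R}$ (writing w and c for repeated weakenings/contractions, exchanges implicit):

\[\cfrac{\cfrac{Γ' ⊢ A, Δ'}{Γ', Γ'' ⊢ A, Δ', Δ''}\text{w} \qquad \cfrac{Γ'' ⊢ B, Δ''}{Γ', Γ'' ⊢ B, Δ', Δ''}\text{w}}{Γ', Γ'' ⊢ A ∧ B, Δ', Δ''}∧\text{R (additive)} \qquad \cfrac{\cfrac{Γ ⊢ A, Δ \qquad Γ ⊢ B, Δ}{Γ, Γ ⊢ A ∧ B, Δ, Δ}∧\text{R (multiplicative)}}{Γ ⊢ A ∧ B, Δ}\text{c}\]

In LL, where weakening and contraction are restricted, the two versions become genuinely different connectives ($\&$ vs $⊗$, see below).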

To prove ND’s $∧E_1$ in SC, we can do as follows:

\[\cfrac{Γ ⊢ A ∧ B \qquad \cfrac{\cfrac{\cfrac{}{A ⊢ A}\text{ax}}{A, B ⊢ A}\text{wL}}{A ∧ B ⊢ A}∧\text{L}}{Γ ⊢ A}\text{ cut}\]

To prove ND’s $⇒E$:

\[\cfrac{Γ ⊢ A ⇒ B \qquad \cfrac{Δ ⊢ A \qquad \cfrac{}{B ⊢ B}\text{ax}}{Δ, A ⇒ B ⊢ B}⇒\text{L}}{Γ, Δ ⊢ B} \text{ cut}\]

NB:

  • Always: cut, then left, then axiom!
  • moving from ND to SC is a way to move from functional programming to a first-order language

Gentzen’s theorem

Th (Gentzen): If $Γ ⊢ Δ$ is provable in LK (resp. LJ), then there exists a cut-free proof in LK (resp. LJ) of $Γ ⊢ Δ$

Corollary: this implies consistency, as a cut-free proof satisfies the subformula property, and $⊢ ⊥$ cannot have a cut-free proof since $⊥$ has no non-trivial subformula

Sketch of the proof: Gentzen eliminates the cuts one after the other (the cut-reduction process): eliminating a cut may create new cuts, but these are in some sense “simpler” than the one just eliminated. In other words, the cut-reduction process terminates (provided the cuts are reduced in a suitable order: the rewriting system is weakly normalizing). Cut-reduction on a multiplicative (resp. additive) $A ∧ B$ corresponds to routing (resp. branching/if-then-else) in programming.
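For instance, the key step for a cut on an additively-introduced $A ∧ B$ against $∧\text{L}_1$ discards one branch and keeps the other (the if-then-else behaviour), replacing the cut by a smaller cut on $A$:

\[\cfrac{\cfrac{Γ ⊢ A, Δ \qquad Γ ⊢ B, Δ}{Γ ⊢ A ∧ B, Δ}∧\text{R} \qquad \cfrac{Γ', A ⊢ Δ'}{Γ', A ∧ B ⊢ Δ'}∧\text{L}_1}{Γ, Γ' ⊢ Δ, Δ'}\text{ cut} \quad\rightsquigarrow\quad \cfrac{Γ ⊢ A, Δ \qquad Γ', A ⊢ Δ'}{Γ, Γ' ⊢ Δ, Δ'}\text{ cut}\]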

LL Syntax

LL formulas:
\[\begin{align*} A, B ≝ & \, X \,|\, \overbrace{A^⊥}^{\text{negation or dual}} \\ & |\, \underbrace{1}_{\text{one}} \,|\, \underbrace{⊥}_{\text{bottom}} \,|\, \underbrace{A ⊗ B}_{\text{tensor}} \,|\, \underbrace{A ⅋ B}_{\text{par}} \,|\, \underbrace{A ⊸ B}_{≝ \; A^⊥ ⅋ B} &&\text{(multiplicatives)} \\ & |\, \underbrace{⊤}_{\text{top}} \,|\, \underbrace{0}_{\text{zero}} \,|\, \underbrace{A \& B}_{\text{with}} \,|\, \underbrace{A ⊕ B}_{\text{plus}} &&\text{(additives)}\\ & |\, \underbrace{!A}_{\text{of course}} \,|\, \underbrace{?A}_{\text{why not}} &&\text{(exponential modalities)}\\ \end{align*}\]
Units and their negations:

  • Mult.: $1$ and $⊥ = 1^⊥$
  • Addit.: $⊤$ and $0 = ⊤^⊥$
  • $1, 0$: positives
  • $⊤, ⊥$: negatives

Connectives and their negations (De Morgan: $(A ⊗ B)^⊥ = A^⊥ ⅋ B^⊥$, $(A \& B)^⊥ = A^⊥ ⊕ B^⊥$):

  • Mult.: $⊗$ and $⅋$
  • Addit.: $\&$ and $⊕$
  • $⊗, ⊕$: positives (non-reversible)
  • $⅋, \&$: negatives (reversible conjunction/disjunction: in proof search, their premises can be guessed automatically)
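As a sketch (our own representation, not from the lecture), the grammar and the duality tables above translate directly into a datatype with a De Morgan dual function:

```haskell
-- LL formulas; one constructor per connective (representation is ours).
data Formula
  = Var String | DualVar String                                 -- X and X^⊥
  | One | Bot | Tensor Formula Formula | Par Formula Formula    -- multiplicatives
  | Top | Zero | With Formula Formula | Plus Formula Formula    -- additives
  | OfCourse Formula | WhyNot Formula                           -- exponentials

-- (·)^⊥ pushes negation to the variables, swapping each connective with its dual.
dual :: Formula -> Formula
dual (Var x)      = DualVar x
dual (DualVar x)  = Var x
dual One          = Bot
dual Bot          = One
dual (Tensor a b) = Par    (dual a) (dual b)
dual (Par a b)    = Tensor (dual a) (dual b)
dual Top          = Zero
dual Zero         = Top
dual (With a b)   = Plus (dual a) (dual b)
dual (Plus a b)   = With (dual a) (dual b)
dual (OfCourse a) = WhyNot   (dual a)   -- (!A)^⊥ = ?(A^⊥)
dual (WhyNot a)   = OfCourse (dual a)   -- (?A)^⊥ = !(A^⊥)

-- Linear implication is defined, not primitive: A ⊸ B ≝ A^⊥ ⅋ B.
limp :: Formula -> Formula -> Formula
limp a b = Par (dual a) b
```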

Sequents: as before, but the interpretation changes: $A_1, …, A_n ⊢ B_1, …, B_m$ is understood at the multiplicative level, i.e. as

\[A_1 ⊗ … ⊗ A_n ⊸ B_1 ⅋ … ⅋ B_m\]

NB: the cut elimination won’t work if we regard sequents at the additive level (with $\&$ and $⊕$).

Rules

Identities

\[\cfrac{}{A ⊢ A}\text{ax}\] \[\cfrac{Γ' ⊢ A, Δ' \qquad Γ'', A ⊢ Δ''}{Γ', Γ'' ⊢ Δ', Δ''}\text{ cut}\]

Negation:

\[\cfrac{Γ ⊢ A, Δ}{Γ, A^⊥ ⊢ Δ}(·)^⊥\text{L} \quad \cfrac{Γ, A ⊢ Δ}{Γ ⊢ A^⊥, Δ}(·)^⊥\text{R}\]

Multiplicatives (MLL: Multiplicative Linear Logic):

\[\cfrac{Γ, A, B ⊢ Δ}{Γ, A ⊗ B ⊢ Δ} ⊗\text{L} \qquad \cfrac{Γ' ⊢ A, Δ' \qquad Γ'' ⊢ B, Δ''}{Γ', Γ'' ⊢ A ⊗ B, Δ',Δ''}⊗\text{R}\] \[\cfrac{Γ ⊢ Δ}{Γ, 1 ⊢ Δ}1\text{L} \qquad \cfrac{}{⊢ 1}1\text{R}\] \[\cfrac{}{⊥ ⊢ }⊥\text{L} \qquad \cfrac{Γ ⊢ Δ}{Γ ⊢ ⊥, Δ}⊥\text{R}\] \[\cfrac{Γ',A ⊢ Δ' \qquad Γ'',B ⊢ Δ''}{Γ', Γ'', A ⅋ B ⊢ Δ',Δ''}⅋\text{L}\qquad \cfrac{Γ ⊢ A, B, Δ}{Γ ⊢ A ⅋ B, Δ} ⅋\text{R}\]
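A tiny example combining the negation and $⅋$ rules: the linear identity $⊢ A ⊸ A$, i.e. (by definition) $⊢ A^⊥ ⅋ A$:

\[\cfrac{\cfrac{\cfrac{}{A ⊢ A}\text{ax}}{⊢ A^⊥, A}(·)^⊥\text{R}}{⊢ A^⊥ ⅋ A}⅋\text{R}\]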

Additives

\[\cfrac{Γ, A ⊢ Δ}{Γ, A \& B ⊢ Δ}\&\text{L}_1 \quad \cfrac{Γ, B ⊢ Δ}{Γ, A \& B ⊢ Δ}\&\text{L}_2 \qquad \cfrac{Γ ⊢ A, Δ \qquad Γ ⊢ B, Δ}{Γ ⊢ A \& B, Δ}\&\text{R}\] \[\cfrac{}{Γ ⊢ ⊤, Δ}⊤\text{R} \quad \cfrac{}{Γ, 0 ⊢ Δ}0\text{L}\] \[\cfrac{Γ, A ⊢ Δ \qquad Γ, B ⊢ Δ}{Γ, A ⊕ B ⊢ Δ}⊕\text{L} \qquad \cfrac{Γ ⊢ A, Δ}{Γ ⊢ A ⊕ B, Δ}⊕\text{R}_1 \quad \cfrac{Γ ⊢ B, Δ}{Γ ⊢ A ⊕ B, Δ}⊕\text{R}_2\]

The linear implication $A ⊸ B$ needs no rules of its own: it is handled through its definition $A^⊥ ⅋ B$.

Structural Rules

\[\cfrac{Γ, A, B, Γ' ⊢ Δ}{Γ, B, A, Γ' ⊢ Δ}\text{ exchange L} \qquad \cfrac{Γ ⊢ Δ, A, B, Δ'}{Γ ⊢ Δ, B, A, Δ'}\text{ exchange R}\] \[\cfrac{Γ ⊢ Δ}{Γ, !A ⊢ Δ}\text{ w L} \qquad \cfrac{Γ ⊢ Δ}{Γ ⊢ ?A, Δ}\text{ w R}\] \[\cfrac{Γ, !A, !A ⊢ Δ}{Γ, !A ⊢ Δ}\text{ c L} \qquad \cfrac{Γ ⊢ ?A, ?A, Δ}{Γ ⊢ ?A, Δ}\text{ c R}\]

Exponential modalities:

Notation: if $Γ ≝ A_1, …, A_n$, then $!Γ ≝ !A_1, …, !A_n$ (and similarly $?Δ ≝ ?B_1, …, ?B_m$)

\[\cfrac{Γ, A ⊢ Δ}{Γ, !A ⊢ Δ}!\text{ L (dereliction)} \qquad \cfrac{!Γ ⊢ A, ?Δ}{!Γ ⊢ !A, ?Δ}!\text{ R (promotion)}\] \[\cfrac{!Γ, A ⊢ ?Δ}{!Γ, ?A ⊢ ?Δ}?\text{ L (promotion)} \qquad \cfrac{Γ ⊢ A, Δ}{Γ ⊢ ?A, Δ}?\text{ R (dereliction)}\]

Motivation for these rules: cut elimination. The exponential modalities isolate the structural rules (weakening and contraction apply only to $!A$ on the left and $?A$ on the right), and the $!$/$?$ rules are designed so that cut elimination still goes through.
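For instance, a cut between a promotion ($!$R) and a dereliction ($!$L) reduces to a cut on the underlying formula $A$:

\[\cfrac{\cfrac{!Γ ⊢ A, ?Δ}{!Γ ⊢ !A, ?Δ}!\text{R} \qquad \cfrac{Γ', A ⊢ Δ'}{Γ', !A ⊢ Δ'}!\text{L}}{!Γ, Γ' ⊢ ?Δ, Δ'}\text{ cut} \quad\rightsquigarrow\quad \cfrac{!Γ ⊢ A, ?Δ \qquad Γ', A ⊢ Δ'}{!Γ, Γ' ⊢ ?Δ, Δ'}\text{ cut}\]

A cut against a contraction instead duplicates the whole promoted subproof, which is why promotion requires the restricted context $!Γ ⊢ A, ?Δ$: the duplicated context can itself be contracted afterwards.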
