Lecture 7 (Master Class): Negative translation

Teacher: Valentin Blot

Negative translation

$NK = NJ + \cfrac{}{Γ ⊢ A ∨ ¬A}$

\not ⊢_{NJ} A ∨ ¬A

But with a negative translation (inserting $¬$'s into the formula, in particular before each atomic formula), classical proofs can be recast intuitionistically:

⊢_{NK} ¬¬A ⇒ A\\ ⊢_{NJ} ¬¬¬A ⇒ ¬A
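The second fact has a direct proof term; a sketch in λ-notation for NJ derivations (with $¬A ≝ A ⇒ ⊥$):

```latex
λh : ¬¬¬A.\; λa : A.\; h\,(λk : ¬A.\; k\,a) \;:\; ¬¬¬A ⇒ ¬A
```

Triple negation thus collapses to single negation already in NJ.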

Notation:

To each formula $A$ we associate its translation $A^¬$, and we set $A^{¬¬} ≝ ¬A^¬$.
A, B \, ≝ \, P(\vec{x}) \; \mid \; ⊤ \; \mid \; ⊥ \; \mid \; A ⇒ B \; \mid \; A ∧ B \; \mid \; A ∨ B \; \mid \; ∀x \, A \; \mid \; ∃x \, A

Negative translation:

P^¬ \; ≝ \; ¬ P\\ ⊤^¬ \; ≝ \; ⊥\\ ⊥^¬ \; ≝ \; ⊤\\ (A ⇒ B)^¬ \; ≝ \; ¬A^¬ ∧ B^¬\\ (A ∧ B)^¬ \; ≝ \; A^¬ ∨ B^¬\\ (A ∨ B)^¬ \; ≝ \; A^¬ ∧ B^¬\\ (∀x \, A)^¬ \; ≝ \; ∃x \, A^¬\\ (∃x \, A)^¬ \; ≝ \; ∀x \, ¬¬A^¬\\
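As a sanity check, the translation can be transcribed clause by clause; a minimal sketch (the tuple encoding of formulas and the function names are my own, not from the lecture):

```python
# Formulas as nested tuples; not_(F) mirrors ¬F ≝ F ⇒ ⊥.
BOT, TOP = ('bot',), ('top',)

def not_(f):               # ¬F ≝ F ⇒ ⊥
    return ('imp', f, BOT)

def neg(f):                # F ↦ F^¬, following the table above
    tag = f[0]
    if tag == 'atom':                       # P^¬ ≝ ¬P
        return not_(f)
    if tag == 'top':                        # ⊤^¬ ≝ ⊥
        return BOT
    if tag == 'bot':                        # ⊥^¬ ≝ ⊤
        return TOP
    if tag == 'imp':                        # (A ⇒ B)^¬ ≝ ¬A^¬ ∧ B^¬
        return ('and', not_(neg(f[1])), neg(f[2]))
    if tag == 'and':                        # (A ∧ B)^¬ ≝ A^¬ ∨ B^¬
        return ('or', neg(f[1]), neg(f[2]))
    if tag == 'or':                         # (A ∨ B)^¬ ≝ A^¬ ∧ B^¬
        return ('and', neg(f[1]), neg(f[2]))
    if tag == 'all':                        # (∀x A)^¬ ≝ ∃x A^¬
        return ('ex', f[1], neg(f[2]))
    if tag == 'ex':                         # (∃x A)^¬ ≝ ∀x ¬¬A^¬
        return ('all', f[1], not_(not_(neg(f[2]))))
    raise ValueError(tag)

def negneg(f):             # A^{¬¬} ≝ ¬A^¬
    return not_(neg(f))

# Excluded middle: (A ∨ ¬A)^{¬¬} = ¬(A^¬ ∧ (¬A^¬ ∧ ⊤)),
# matching the computation done below for the em case.
A = ('atom', 'A')
em = ('or', A, ('imp', A, BOT))
expected = not_(('and', neg(A), ('and', not_(neg(A)), TOP)))
assert negneg(em) == expected
```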

Th: \text{If } Γ ⊢_{NK} A \text{ then } Γ^{¬¬} ⊢_{NJ} A^{¬¬}

Sketch of proof: By induction on the proof derivation.

(A ∨ ¬ A)^{¬¬} = ¬(A^¬ ∧ (¬A^¬ ∧ ⊤))
\infer{Γ^{¬¬} ⊢_{NJ} ¬(A^¬ ∧ (¬A^¬ ∧ ⊤))} { \infer{Γ^{¬¬}, A^¬ ∧ (¬A^¬ ∧ ⊤) ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, A^¬ ∧ (¬A^¬ ∧ ⊤) ⊢_{NJ} ¬A^¬}{ \vdots } & \infer{Γ^{¬¬}, A^¬ ∧ (¬A^¬ ∧ ⊤) ⊢_{NJ} A^¬}{\vdots}} }

Other case, existential elimination:

\infer{Γ^{¬¬} ⊢_{NJ} ¬B^¬}{ \infer{Γ^{¬¬}, B^¬ ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, B^¬ ⊢_{NJ} ¬∀x \, ¬¬A^¬}{\vdots} & \infer[x ∉ fv(Γ^{¬¬}, B^¬) = fv(Γ, B)]{Γ^{¬¬}, B^¬ ⊢_{NJ} ∀x \, ¬¬A^¬}{ \infer{Γ^{¬¬}, B^¬ ⊢_{NJ} ¬¬A^¬}{ \infer{Γ^{¬¬}, B^¬, ¬A^¬ ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, B^¬, ¬A^¬ ⊢_{NJ} ¬ B^¬}{\vdots} & \infer[ax]{Γ^{¬¬}, B^¬, ¬A^¬ ⊢_{NJ} B^¬}{\phantom{Γ^{¬¬}, B^¬, ¬A^¬ ⊢_{NJ} B^¬}} } } } } }

Elimination of $∨$:

\infer{Γ^{¬¬} ⊢_{NJ} ¬C^¬}{ \infer{Γ^{¬¬}, C^¬ ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, C^¬ ⊢_{NJ} ¬ C^¬}{ \infer{Γ^{¬¬}, C^¬ ⊢_{NJ} ¬A^¬ ⇒ ¬C^¬}{ \infer{Γ^{¬¬}, C^¬, ¬A^¬ ⊢_{NJ} ¬C^¬}{ \vdots } } & \infer{Γ^{¬¬}, C^¬ ⊢_{NJ} ¬A^¬}{ \infer{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} ¬ C^¬}{ \infer{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} ¬ B^¬ ⇒ ¬ C^¬}{ \infer{Γ^{¬¬}, C^¬, A^¬, ¬ B^¬ ⊢_{NJ} ¬ C^¬}{ \vdots } } & \infer{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} ¬ B^¬}{ \infer{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} ⊥}{ \infer{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} ¬(A^¬ ∧ B^¬)}{ \vdots } & \infer{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} A^¬ ∧ B^¬}{ \infer{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} A^¬}{\phantom{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} A^¬}} & \infer{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} B^¬}{\phantom{Γ^{¬¬}, C^¬, A^¬, B^¬ ⊢_{NJ} B^¬}} } } } } } & \infer[ax]{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} C^¬}{\phantom{Γ^{¬¬}, C^¬, A^¬ ⊢_{NJ} C^¬}} } } } & \infer[ax]{Γ^{¬¬}, C^¬ ⊢_{NJ} C^¬}{\phantom{Γ^{¬¬}, C^¬ ⊢_{NJ} C^¬}} }

Let $𝔸$ be the set of axioms of $PA$

Def:

  • $⊢_{PA} A$ iff there exists $Γ ⊆ 𝔸$ s.t. $Γ ⊢_{NK} A$
  • $⊢_{HA} A$ iff there exists $Γ ⊆ 𝔸$ s.t. $Γ ⊢_{NJ} A$

Lemma: if $A ∈ 𝔸$, then there exists $Γ' ⊆ 𝔸$ s.t. $Γ' ⊢_{NJ} A^{¬¬}$

Sketch of proof:

Note that

⊢_{PA} A ⟹ ∃ \, Γ ⊆ 𝔸 \; \text{s.t.} \; Γ ⊢_{NK} A

and then Γ^{¬¬} ⊢_{NJ} A^{¬¬}

We want $Γ' ⊆ 𝔸$ s.t. $Γ' ⊢_{NJ} A^{¬¬}$ (and therefore $⊢_{HA} A^{¬¬}$)

The proof is made by case analysis over $A ∈ 𝔸$.

Note that all the arithmetic axioms are either

of the form $A \, ≝ \, ∀ \, \vec{x} \, B(\vec x)$ with $B$ quantifier free

Then $A^¬ = ∃ \vec x \, B^¬(\vec x)$.

Goal: $⊢_{HA} ¬ \, ∃ \vec x \; B^¬(\vec x)$

Fact 1: $⊢_{NJ} ¬ ∃ x \; C \Leftrightarrow ∀ x \; ¬ C$

So it is sufficient to show $⊢_{HA} ∀ \vec x \; ¬ B^¬(\vec x)$

Fact 2: $⊢_{NJ} C ⇒ ¬C^¬$ whenever $C$ is quantifier free (uses $⊢_{NJ} ¬(¬A ∧ B) \Leftrightarrow (¬A ⇒ ¬B)$)
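For instance, in the atomic case $C = P$ we have $C^¬ = ¬P$, so Fact 2 reads $⊢_{NJ} P ⇒ ¬¬P$, with proof term (a sketch):

```latex
λp : P.\; λk : ¬P.\; k\,p \;:\; P ⇒ ¬¬P
```

The other cases go by induction on $C$, the implication case using the stated equivalence $¬(¬A ∧ B) \Leftrightarrow (¬A ⇒ ¬B)$.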

  • $⊢_{NJ} B(\vec x) ⇒ ¬B(\vec x)^¬$
  • $∀ \vec x \, B(\vec x) ⊢_{NJ} ∀ \vec x \, ¬B(\vec x)^¬$
  • \underbrace{∀ \vec x \, B(\vec x)}_{A} ⊢_{NJ} ¬ ∃ \vec x \, B(\vec x)^¬ \; ≝ \; (∀ \vec x \, B(\vec x))^{¬¬}
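As a concrete instance (my example, not from the lecture), take the axiom $A ≝ ∀x\,(x + 0 = x)$:

```latex
B(x) \; ≝ \; (x + 0 = x) \qquad B^¬(x) \; = \; ¬(x + 0 = x) \qquad A^{¬¬} \; = \; ¬\,∃x\,¬(x + 0 = x)
```

Fact 1 reduces the goal to $⊢_{HA} ∀x\,¬¬(x + 0 = x)$, which Fact 2 gives from the axiom itself.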

or of the form $A \, ≝ \, B(0) ⇒ ∀ x \, (B(x) ⇒ B(S \, x)) ⇒ ∀x \, B(x)$ (induction scheme)

in which case we have

\underbrace{¬B^¬(0) ⇒ ∀x \, (¬B^¬(x) ⇒ ¬B^¬(S \, x)) ⇒ ∀x \, ¬B^¬(x)}_{\text{induction on } ¬B^¬ ∈ 𝔸} ⊢_{NJ} (B(0) ⇒ ∀ x \, (B(x) ⇒ B(S \, x)) ⇒ ∀x \, B(x))^{¬¬}

NB: Do we have $⊢_{NJ} A ⇒ ¬A^¬$? NO!

but we do have

\lbrace ∀ x \, ¬¬A ⇒ ¬¬ ∀ x \, A \; \mid \; A \text{ is a formula}\rbrace ⊢_{NJ} B ⇒ ¬B^¬

Yet

\not ⊢_{NJ} ∀ x \, ¬¬A ⇒ ¬¬ ∀ x \, A \qquad \text{(double negation shift (DNS))}

Excluded-middle

We add a new proof term for excluded middle:

\cfrac{}{Γ ⊢ em: A ∨ ¬A}

Th: If $Γ ⊢_{NK} A$, then $Γ^{¬¬} ⊢_{NJ} ¬A^¬$

Th: If $Γ ⊢_{NK} π: A$, then $Γ^{¬¬} ⊢_{NJ} \underline π: ¬A^¬$

where

\underline α \; ≝ \; α\\ \underline I \; ≝ \; λk:⊥.k\\ \underline{δ_⊥(π)} \; ≝ \; λk:A^¬. \underline π \, I\\ \underline {δ(π, α: A. π', β: B. π'')} \; ≝ \; λk: C^¬. (λα: ¬A^¬. \underline {π'}) \, (λk': A^¬. (λβ: ¬B^¬. \underline {π''}) \, (λk'': B^¬. \underline π \, ⟨k', k''⟩) \, k) \, k\\ \underline{i(π)} \; ≝ \; λk: A^¬ ∧ B^¬. \underline π \, fst(k)\\ \underline {ππ'} \; ≝ \; λk:B^¬. \underline π \, ⟨\underline{π'}, k⟩

We would like to show that if $π ⟶^\ast π'$, then $\underline π ⟶^\ast \underline{π'}$. But this fails as stated: for instance (when $α ∉ fv(π')$),

\underline{δ(i(π), α: A. π', β: B. π'')} \; = \; λk: C^¬. (λ α:¬A^¬. \underline {π'})(⋯) \, k ⟶ λk: C^¬. \underline{π'} \, k

so the translation of the reduct is reached only up to the extra $λk. \, ⋯ \, k$.

And by setting $(A ∨ B)^¬ \; ≝ \; ¬¬A^¬ ∧ ¬¬B^¬$, we avoid making any arbitrary choice of evaluation order between the two branches.

So:

\underline{δ(π, α: A. π', β:B.π'')} \; ≝ \; λk: C^¬. \underline{π} \, ⟨λα: ¬ A^¬. \underline{π'} k, λβ: ¬ B^¬. \underline{π''} k⟩\\ \underline{i(π)} \; ≝ \; λk: ¬¬A^¬ ∧ ¬¬B^¬. fst(k) \underline{π}
\underline{δ(i(π), α: A. π', β:B.π'')} = λk. (λk': ¬¬A^¬ ∧ ¬¬B^¬. fst(k') \, \underline{π}) \, ⟨λα: ¬ A^¬. \underline{π'} k, λβ: ¬ B^¬. \underline{π''} k⟩\\ ⟶ λk. fst \, ⟨λα: ¬ A^¬. \underline{π'} k, λβ: ¬ B^¬. \underline{π''} k⟩ \; \underline{π} ⟶ λk. (λα: ¬ A^¬. \underline{π'} k) \, \underline{π}\\ ⟶ λk. (\underline{π}/α) \underline{π'} \, k

Th: if $π ⟶^\ast π'$, then $\underline π ⟶^\ast λk. \underline{π'} k$ (for $k ∉ fv(\underline{π'})$)
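The $δ$/$i$ case of this theorem can be checked mechanically on a toy term representation; a sketch with assumed names, where `p`, `p1`, `p2` stand for $\underline{π}$, $\underline{π'}$, $\underline{π''}$ taken as free variables (so in particular $α ∉ fv(\underline{π'})$):

```python
# Terms as tuples: ('var',x), ('lam',x,body), ('app',f,a), ('pair',a,b), ('fst',t), ('snd',t).

def subst(t, x, s):
    # naive substitution: safe here because all bound names below are distinct
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    if tag in ('app', 'pair'):
        return (tag, subst(t[1], x, s), subst(t[2], x, s))
    return (tag, subst(t[1], x, s))          # fst / snd

def step(t):
    tag = t[0]
    if tag == 'app' and t[1][0] == 'lam':    # β: (λx.b) a ⟶ b[a/x]
        return subst(t[1][2], t[1][1], t[2])
    if tag == 'fst' and t[1][0] == 'pair':   # fst⟨a,b⟩ ⟶ a
        return t[1][1]
    if tag == 'snd' and t[1][0] == 'pair':   # snd⟨a,b⟩ ⟶ b
        return t[1][2]
    if tag in ('app', 'pair'):               # congruence rules
        r = step(t[1])
        if r is not None: return (tag, r, t[2])
        r = step(t[2])
        if r is not None: return (tag, t[1], r)
        return None
    if tag == 'lam':
        r = step(t[2])
        return None if r is None else ('lam', t[1], r)
    if tag in ('fst', 'snd'):
        r = step(t[1])
        return None if r is None else (tag, r)
    return None

def normalize(t):
    while True:
        r = step(t)
        if r is None: return t
        t = r

p, p1, p2 = ('var', 'p'), ('var', 'p1'), ('var', 'p2')
# \underline{i(π)} ≝ λk'. fst(k') \underline{π}
under_i = ('lam', 'k2', ('app', ('fst', ('var', 'k2')), p))
# \underline{δ(π, α.π', β.π'')} ≝ λk. \underline{π} ⟨λα. \underline{π'} k, λβ. \underline{π''} k⟩
def under_delta(t):
    return ('lam', 'k', ('app', t,
            ('pair', ('lam', 'a', ('app', p1, ('var', 'k'))),
                     ('lam', 'b', ('app', p2, ('var', 'k'))))))

# δ(i(π), α.π', β.π'') reduces to (π/α)π'; since α ∉ fv(π') here, the
# translation should normalize to λk. \underline{π'} k, as the theorem predicts.
assert normalize(under_delta(under_i)) == ('lam', 'k', ('app', p1, ('var', 'k')))
```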

In the theorem

If $Γ ⊢_{NK} A$, then $Γ^{¬¬} ⊢_{NJ} ¬A^¬$

we can prove that the proof of $Γ^{¬¬} ⊢_{NJ} ¬A^¬$ does not contain

\cfrac{Γ' ⊢ ⊥}{Γ' ⊢ B}

So we can have a variant, by replacing $⊥$ by an arbitrary $R$:

A^{RR} \; ≝ \; A^R ⇒ R\\ (A ∧ B)^R \; ≝ \; A^R ∨ B^R\\ (P(\vec x))^R \; ≝ \; P(\vec x) ⇒ R\\ ⋯ \text{ (the remaining clauses mirror the ¬-translation, with } ⊥ \text{ replaced by } R\text{)}

Syntactic sugar: $¬^R A \; ≝ \; A ⇒ R$

$Γ ⊢_{NK} A$ implies $Γ^{RR} ⊢_{NJ} A^{RR}$


Suppose

⊢_{PA} ∃ x \, t(x) = 0

Then by the theorem (and the lemma on the axioms):

⊢_{HA} ¬^R ∀ x \, ¬^R ¬^R ¬^R (t(x) = 0)

Moreover

⊢_{NJ} ¬^R ¬^R ¬^R A \Leftrightarrow ¬^R A

So it is sufficient to show:

⊢_{HA} ¬^R ∀ x \, ¬^R (t(x) = 0)

But as

⊢_{NJ} ∀ x \, ¬^R A \Leftrightarrow ¬^R ∃ x \, A

we get

⊢_{HA} ¬^R ¬^R ∃ x \, (t(x) = 0) \; ≝ \; (∃ x \, (t(x) = 0) ⇒ R) ⇒ R

And by setting $R \; ≝ \; ∃x \, (t(x)=0)$:

⊢_{HA} (∃ x \, (t(x) = 0) ⇒ ∃ x \, (t(x) = 0)) ⇒ ∃ x \, (t(x) = 0)

which provides a constructive proof (apply the implication to the identity):

⊢_{HA} ∃ x \, (t(x) = 0)

So we started with a classical proof, and built a witness!
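The chain of steps above can be replayed on a small encoding; again a sketch with assumed names (`R` is a formula marker substituted at the end, as in Friedman's trick):

```python
# Formulas as tuples; ¬^R F ≝ F ⇒ R, with R an opaque marker.
R = ('R',)

def rnot(f):                    # ¬^R F ≝ F ⇒ R
    return ('imp', f, R)

def rneg(f):                    # F ↦ F^R (clauses mirror F^¬ with ⊥ replaced by R)
    tag = f[0]
    if tag == 'atom':                       # P^R ≝ ¬^R P
        return rnot(f)
    if tag == 'ex':                         # (∃x A)^R ≝ ∀x ¬^R ¬^R A^R
        return ('all', f[1], rnot(rnot(rneg(f[2]))))
    if tag == 'all':                        # (∀x A)^R ≝ ∃x A^R
        return ('ex', f[1], rneg(f[2]))
    raise ValueError(tag)       # other clauses omitted: not needed here

def rr(f):                      # A^{RR} ≝ ¬^R A^R
    return rnot(rneg(f))

t0 = ('atom', 't(x)=0')
ex = ('ex', 'x', t0)            # ∃x t(x) = 0

# (∃x t(x)=0)^{RR} = ¬^R ∀x ¬^R ¬^R ¬^R (t(x)=0), as in the text
assert rr(ex) == rnot(('all', 'x', rnot(rnot(rnot(t0)))))

def set_R(f, r):                # substitute a formula for the R marker
    if f == R:
        return r
    if f[0] == 'imp':
        return ('imp', set_R(f[1], r), set_R(f[2], r))
    if f[0] in ('all', 'ex'):
        return (f[0], f[1], set_R(f[2], r))
    return f

# Setting R ≝ ∃x t(x)=0 turns ¬^R ¬^R ∃x t(x)=0 into
# (∃x t(x)=0 ⇒ ∃x t(x)=0) ⇒ ∃x t(x)=0, provable by modus ponens with the identity.
friedman = set_R(rnot(rnot(ex)), ex)
assert friedman == ('imp', ('imp', ex, ex), ex)
```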


If $⊢_{PA} ∀ x \, ∃ y \, t(x,y) = 0$ then $⊢_{HA} ∀ x \, ∃ y \, t(x,y) = 0$

So you can extract an algorithm $a: ℕ ⟶ ℕ$ s.t. for every $n ∈ ℕ$, $t(n, a(n)) = 0$

But you have to restrict yourself to arithmetic: you can't use the axiom of (dependent) choice, for example!
