# Lecture 7: Static translation

Teacher: Delia Kesner

cf picture

Theorem:

1. If $t =_c t'$, then $T(t) ≃_A T(t')$, where $T(t)$ denotes the translation of a type derivation of $t$
2. If $t ⟶_{App3, Lamb} t'$, then $T(t) ≃_{A,B} T(t')$
3. If $t ⟶_{λs\backslash \lbrace App3, Lamb\rbrace} t'$, $T(t) ⟶^+_{R/E} W[T(t')]$

Notations:

• $R$: all the reduction rules
• $E$: all the equations for Proof Nets
• $R/E$: “$R$ modulo $E$”

Observation: the relation $⟶_{App3} ∪ ⟶_{Lamb}$ is (strongly) terminating

NB: $⟶_B$ and $⟶_S$ are terminating, but what about the union?

Corollary: by a theorem from abstract rewriting theory, the relation $⟶_{λs}$ is strongly normalizing; this follows from the fact that $⟶_{R/E}$ is strongly normalizing, together with points 1–3 of the theorem above.

NB: we are showing normalization of $λs$ by using normalization of proof nets.
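The abstract-rewriting step can be spelled out as follows (my reconstruction of the standard simulation argument, modulo the usual commutation details for $≃$ and $W$; it is not made explicit in the notes):

```latex
Suppose, for contradiction, that there is an infinite reduction
$t_0 \to_{\lambda s} t_1 \to_{\lambda s} \cdots$.
Since $\to_{App3} \cup \to_{Lamb}$ terminates, infinitely many of these steps
use rules in $\lambda s \setminus \{App3, Lamb\}$. Translating each step:
\[
  T(t_i) \simeq_{A,B} T(t_{i+1}) \quad \text{($App3$/$Lamb$ steps, no $R/E$ work)},
  \qquad
  T(t_i) \to^+_{R/E} W[T(t_{i+1})] \quad \text{(all other steps, $\geq 1$ $R/E$ step)}.
\]
Hence $T(t_0)$ starts an infinite $\to_{R/E}$ reduction, contradicting the
strong normalization of $\to_{R/E}$.
```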

$t[x/u][y/v] =_c t[y/v][x/u] \qquad x ∉ fv(v) ∧ y ∉ fv(u)$
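As a sanity check, the commutation equation can be tested on concrete terms. A minimal Python sketch (my own illustration, not part of the lecture), with terms as nested tuples and capture issues assumed away by the Barendregt convention:

```python
def fv(t):
    """Free variables of a term: 'x' | ('app', t, u) | ('lam', x, t)."""
    if isinstance(t, str):
        return {t}
    if t[0] == 'app':
        return fv(t[1]) | fv(t[2])
    if t[0] == 'lam':
        return fv(t[2]) - {t[1]}

def subst(t, x, u):
    """Naive substitution t[x/u]; safe here because bound names are fresh."""
    if isinstance(t, str):
        return u if t == x else t
    if t[0] == 'app':
        return ('app', subst(t[1], x, u), subst(t[2], x, u))
    if t[0] == 'lam':
        return ('lam', t[1], subst(t[2], x, u))

t = ('app', 'x', 'y')
u, v = 'a', 'b'
assert 'x' not in fv(v) and 'y' not in fv(u)  # side condition of =_c
left  = subst(subst(t, 'x', u), 'y', v)       # t[x/u][y/v]
right = subst(subst(t, 'y', v), 'x', u)       # t[y/v][x/u]
assert left == right == ('app', 'a', 'b')
```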

Reminder: recall that there are two kinds of “boxes” in proof nets: the substitution boxes and the application ones.

• First (easy) case: $x, y ∈ fv(t)$

$\infer{⊢ t[x/u][y/v]:A}{ ⊢ v:C & \infer{⊢ t[x/u]:A}{ ⊢ u:B & y:C,\, x:B ⊢ t:A} }$

Notation: $Γ_{t', t''}$: variables of $t'$ and $t''$

cf. picture: write the complete derivation tree

• $(λx. u)[y/v] ⟶_{Lamb} λx.u[y/v]$

First case: $x ∈ fv(u), y ∈ fv(u)$

cf picture

• Other cases: cf. pictures

Inductive Cases:

For example:

• $t = uv ⟶ u'v = t' \text{ where } u ⟶ u'$

Trivial case, since the weakenings arising from $u ⟶ u'$ are outside: they are also weakenings for the whole proof net.

• $t = u[x/v] ⟶ u[x/v'] = t'$ where $v ⟶ v'$

IH: $T(v) ⟶^+ W[T(v')]$

To prove:

$T(u[x/v]) ⟶^+ W[T(u[x/v'])]$

• $U$ rule: used for $(zxy)[x'/y] ⟶ zxy$

Reminder:

$λ$-calculus ⟶ $λs$-terms ⟶ Pnets

But what about going the other way round? ⟹ $λlxr$-calculus

## $λlxr$-calculus

Specificity:

• Weakening: $W_x(y)$ allowed, but not $W_x(x)$

• Contraction: $C_x^{y,z}(t)$, where $x$ is free, $y,z$ are bound:

• $C_x^{y,z}(x_1 y z)$ allowed
• $C_x^{y,z}(x_1 x_1 x_2)$ not allowed

$fv(W_x(t)) = fv(t) ∪ \lbrace x\rbrace$
$fv(C_x^{y,z}(t)) = (fv(t) \backslash \lbrace y, z\rbrace) ∪ \lbrace x \rbrace$

### Compulsory presence

Every binder must actually bind something: a bound variable has to occur in the body, and a weakening is inserted when it does not.

$λx.y \leadsto λx. W_x(y)$
$x[x'/y] \leadsto W_{x'}(x)[x'/y]$

### Linearity

A term is linear iff each variable occurs at most once.

$λx.xx$ not allowed:

$λx.xx \leadsto λx. C_x^{x_1, x_2}(x_1 x_2)$

NB: In proof nets, it’s the same: to bind/contract a variable, you need the wires to be there in the first place!

Recap examples:

$λx.y \quad \leadsto \quad λx. W_x(y)$
$x[z/w] \quad \leadsto \quad W_z(x)[z/w]$
$C_x^{y,z}(wy) \quad \leadsto \quad C_x^{y,z}(W_z(wy))$
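The recap translation can be mechanized at the binder level. A toy Python sketch (my own illustration; the tuple encoding and the names `lam`, `occs`, `rename_one` are assumptions, and it only handles binders with at most two occurrences):

```python
import itertools

_fresh = (f"x{n}" for n in itertools.count(1))  # fresh-name supply

def occs(t, x):
    """Count free occurrences of x in 'x' | ('app', t, u) | ('lam', x, t)."""
    if isinstance(t, str):
        return 1 if t == x else 0
    if t[0] == 'app':
        return occs(t[1], x) + occs(t[2], x)
    if t[0] == 'lam':
        return 0 if t[1] == x else occs(t[2], x)

def rename_one(t, x, y):
    """Rename the leftmost free occurrence of x in t to y."""
    if isinstance(t, str):
        return (y, True) if t == x else (t, False)
    if t[0] == 'app':
        l, done = rename_one(t[1], x, y)
        if done:
            return ('app', l, t[2]), True
        r, done = rename_one(t[2], x, y)
        return ('app', t[1], r), done
    if t[0] == 'lam':
        if t[1] == x:
            return t, False
        b, done = rename_one(t[2], x, y)
        return ('lam', t[1], b), done

def lam(x, t):
    """Build λx.t in λlxr style, inserting W or C as needed (≤ 2 occs)."""
    n = occs(t, x)
    if n == 0:                       # compulsory presence: add a weakening
        return ('lam', x, ('W', x, t))
    if n == 1:                       # already linear
        return ('lam', x, t)
    if n == 2:                       # linearity: split via a contraction
        x1, x2 = next(_fresh), next(_fresh)
        t, _ = rename_one(t, x, x1)
        t, _ = rename_one(t, x, x2)
        return ('lam', x, ('C', x, x1, x2, t))

assert lam('x', 'y') == ('lam', 'x', ('W', 'x', 'y'))            # λx.y
assert lam('x', ('app', 'x', 'x')) == \
       ('lam', 'x', ('C', 'x', 'x1', 'x2', ('app', 'x1', 'x2')))  # λx.xx
```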

Barendregt convention: $λx. x[x/z]$ is ambiguous ⟶ rename

### Typing rules

Weakening and contraction are now built in: there is an explicit notation to denote them

$\cfrac{Γ ⊢ t:A}{Γ, x:B ⊢ W_x(t):A}$

and

$\cfrac{Γ, x:A, y:A ⊢ t:B}{Γ, z:A ⊢ C_z^{x,y}(t):B}$

Linearity: $Γ, Δ$ is well-defined only if $Γ$ and $Δ$ do not share variables.

No contractions anymore in the proof nets, as there are no shared variables in the contexts.
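A small worked example combining the two rules (my own example, not from the notes): a derivation of $z:A ⊢ C_z^{x,y}(W_x(y)) : A$, weakening in a dummy $x$ and then contracting it with $y$.

```latex
\cfrac{\cfrac{y:A \vdash y:A}{y:A,\, x:A \vdash W_x(y):A}}
      {z:A \vdash C_z^{x,y}(W_x(y)):A}
```

Note that linearity of contexts is respected throughout: $x$ enters the context only via the weakening rule, and the contraction merges $x:A, y:A$ into the single assumption $z:A$.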

### Equations

Unlike proof nets, where many things happen in parallel, here we need to sequentialize.

Rule $A$ for proof nets becomes:

$C_w^{x,v}(C^{z,y}_x(t)) ≡ C_w^{x,y}(C_x^{z,v}(t)) \qquad \text{ if } x≠y, v$

And equality of proof nets is made algebraic:

The fact that wires commute:

$C_x^{y,z}(t) ≡ C_x^{z,y}(t)$

Concurrency of parallel contractions:

$C_{x'}^{y',z'}(C^{y,z}_x(t)) ≡ C^{y,z}_x(C_{x'}^{y',z'}(t)) \qquad \text{ if } x≠y', z' \quad ∧ \quad x' ≠ y, z$

etc…

Commutativity of substitution:

$t[x/u][y/v] ≡ t[y/v][x/u] \qquad x ∉ fv(v) ∧ y ∉ fv(u)$

etc…

### Subsystem $x$

Rewriting rules to deal with explicit substitutions.

As terms are linear, we only have two $App$ rules now.

Example:

$W_x(y_1 y_2)[x/z_1 z_2] ⟶ W_{z_2}(W_{z_1}(y_1 y_2)) \qquad \text{(the order of the } z_i\text{'s doesn't matter)}$
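This rule, substituting into a weakened variable, can be sketched in Python (my own illustration; the encoding and the names `fv`, `weak1` are assumptions): the argument $u$ is discarded, but its free variables survive as weakenings.

```python
def fv(t):
    """Free variables of 'x' | ('app', t, u) | ('W', x, t)."""
    if isinstance(t, str):
        return {t}
    if t[0] == 'app':
        return fv(t[1]) | fv(t[2])
    if t[0] == 'W':
        return fv(t[2]) | {t[1]}

def weak1(t, x, u):
    """W_x(t)[x/u] -> W_{fv(u)}(t): drop u, weaken by its free variables."""
    for z in sorted(fv(u)):  # the order of the z_i's doesn't matter; sort for determinism
        t = ('W', z, t)
    return t

# W_x(y1 y2)[x / z1 z2] -> W_{z2}(W_{z1}(y1 y2)), up to the order of the z_i's
body = ('app', 'y1', 'y2')
assert weak1(body, 'x', ('app', 'z1', 'z2')) == ('W', 'z2', ('W', 'z1', body))
```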

NB: in these rules,

• all the substitutions go down
• all the weakenings go up
• all the contractions go down

In $Cont1$: $Δ$ and $Π$ are fresh variables. $u_1 = R^Φ_Δ(u)$, where $Φ = fv(u)$, is a meta-operation meaning: “make a copy of $u$ in which the free variables $Φ$ have been renamed to $Δ$”.

Notations: $x$: system pushing down substitutions; $t$: system handling weakenings and contractions

The only rule “erasing” variables is $Weak1$, but the free variables of $u$ are not actually discarded ⟹ free variables are preserved (weakenings keep track of the “erased” variables).

$xt = x ∪ t$ is convergent (terminating and confluent) ⟶ it defines a function: a term in normal form is a term where all the weakenings are up and all the contractions are down.
