Lecture 7: Static translation
Teacher: Delia Kesner
cf picture
Theorem:
 If $t =_c t'$, then $T(t) ≃_A T(t')$, where $T(t)$ denotes the translation of a type derivation of $t$.
 If $t ⟶_{App3, Lamb} t'$, then $T(t) ≃_{A,B} T(t')$.
 If $t ⟶_{λs \setminus \{App3, Lamb\}} t'$, then $T(t) ⟶^+_{R/E} W[T(t')]$.
Notations:
 $R$: all the reduction rules
 $E$: all the equations for Proof Nets
 $R/E$: “$R$ modulo $E$”
Observation: the relation $⟶_{App3} ∪ ⟶_{Lamb}$ is (strongly) terminating.
NB: $⟶_B$ and $⟶_S$ are terminating, but what about the union?
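One way to see this termination (a sketch of a measure argument, assuming App3 propagates the substitution to the subterms of an application and Lamb pushes it under the abstraction; this is a reconstruction, not necessarily the lecture's proof):

```latex
% Measure: sum, over the substitution occurrences in the term, of the size of
% the subterm the substitution is applied to.
\mu(t) \;=\; \sum_{\text{occurrences } s[x/u] \text{ in } t} |s|
% Each step applies the substitution to strictly smaller subterms:
(t\,v)[x/u] \longrightarrow_{App3} t[x/u]\;v[x/u], \qquad |t| + |v| < |t\,v|
(\lambda y.t)[x/u] \longrightarrow_{Lamb} \lambda y.\,t[x/u], \qquad |t| < |\lambda y.t|
% (When u itself contains substitutions, the duplicated copies must be
% accounted for, e.g. by a multiset or polynomial refinement of this measure.)
```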
Corollary: by a theorem from abstract rewriting theory, the relation $⟶_{λs}$ is strongly normalizing, using the fact that $⟶_{R/E}$ is strongly normalizing.
NB: we are showing normalization of $λs$ by using normalization of proof nets.
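The abstract-rewriting fact being invoked can be stated roughly as follows (formulation mine):

```latex
% Write \longrightarrow_{\lambda s} = \longrightarrow_1 \cup \longrightarrow_2,
% with \longrightarrow_1 = \longrightarrow_{App3} \cup \longrightarrow_{Lamb}. Then:
%   (i)   \longrightarrow_1 is strongly normalizing      (the Observation above);
%   (ii)  t \longrightarrow_1 t' implies T(t) \simeq T(t')      (no target step);
%   (iii) t \longrightarrow_2 t' implies
%         T(t) \longrightarrow^+_{R/E} W[T(t')]            (at least one step);
%   (iv)  \longrightarrow_{R/E} is strongly normalizing.
% An infinite \lambda s-reduction would, by (i), contain infinitely many
% \longrightarrow_2-steps; by (ii)-(iii) it projects to an infinite
% \longrightarrow_{R/E}-reduction, contradicting (iv).
```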
Reminder: recall that there are two kinds of “boxes” in proof nets: the substitution boxes and the application ones.

First (easy) case: $x, y ∈ fv(t)$
\infer{⊢ t[x/u][y/v]:A}{ ⊢ v:C & \infer{⊢ t[x/u]:A}{ ⊢ u:B & y: C, x:B ⊢ t:A} }
Notation: $Γ_{t', t''}$: the variables of $t'$ and $t''$
cf picture : write the complete derivation tree

$(λx. u)[y/v] ⟶_{Lamb} λx.u[y/v]$
First case: $x ∈ fv(u), y ∈ fv(u)$
cf picture

Other cases: cf. pictures
Inductive Cases:
For example:

$t = uv ⟶ u'v = t'$ where $u ⟶ u'$
Trivial case: the weakenings arising from $u ⟶ u'$ are on the outside, so they are also weakenings for the whole proof net.

$t = u[x/v] ⟶ u[x/v'] = t'$ where $v ⟶ v'$
IH: $T(v) ⟶^+ W[T(v')]$
To prove: $T(u[x/v]) ⟶^+ W[T(u[x/v'])]$
$U$ rule: used for $(zxy)[x'/y] ⟶ zxy$
Reminder:
$λ$-calculus ⟶ $λs$-terms ⟶ proof nets
But what about going the other way round? ⟹ the $λlxr$-calculus
$λlxr$-calculus
Specificity:

Weakening: $W_x(y)$ allowed, but not $W_x(x)$

Contraction: $C_x^{y,z}(t)$, where $x$ is free, $y,z$ are bound:
 $C_x^{y,z}(x_1 y z)$ allowed
 $C_x^{y,z}(x_1 x_1 x_2)$ not allowed
Compulsory presence
For every binder, the variable must actually occur in the term in order to be bound.
Linearity
 A term is linear iff each variable occurs at most once.
$λx.xx$ is not allowed: it must be written with an explicit contraction, e.g. $λx.C_x^{y,z}(yz)$.
NB: In proof nets, it's the same: to bind/contract a variable, the wires need to be there in the first place!
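The three constraints above (linearity, compulsory presence, $W_x(x)$ forbidden) can be checked mechanically. A minimal sketch, assuming an AST of my own devising (the constructor names and the omission of explicit substitutions are mine, not the lecture's):

```python
# Well-formedness checker for λlxr-style terms (sketch; substitution omitted).
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class App:
    fun: object
    arg: object

@dataclass
class Lam:
    var: str
    body: object

@dataclass
class Weak:
    var: str          # W_x(t): adds x as free, so x must not occur in t
    body: object

@dataclass
class Cont:
    var: str          # C_x^{y,z}(t): y and z (free in t) are merged into x
    y: str
    z: str
    body: object

def free_linear(t):
    """Free variables of t; raises ValueError if t is not well-formed."""
    if isinstance(t, Var):
        return {t.name}
    if isinstance(t, App):
        f, a = free_linear(t.fun), free_linear(t.arg)
        if f & a:                          # linearity: no shared variables
            raise ValueError("variable occurs in both subterms")
        return f | a
    if isinstance(t, Lam):
        fv = free_linear(t.body)
        if t.var not in fv:                # compulsory presence
            raise ValueError("binder must bind an occurring variable")
        return fv - {t.var}
    if isinstance(t, Weak):
        fv = free_linear(t.body)
        if t.var in fv:                    # W_x(y) allowed, W_x(x) not
            raise ValueError("W_x(t) requires x not free in t")
        return fv | {t.var}
    if isinstance(t, Cont):
        fv = free_linear(t.body)
        if t.y not in fv or t.z not in fv: # compulsory presence again
            raise ValueError("contraction must bind occurring variables")
        rest = fv - {t.y, t.z}
        if t.var in rest:
            raise ValueError("contracted name already occurs")
        return rest | {t.var}
    raise TypeError(f"unknown term: {t!r}")

def is_linear(t):
    try:
        free_linear(t)
        return True
    except ValueError:
        return False
```

For instance, $λx.C_x^{y,z}(yz)$ passes, while the raw $λx.xx$ is rejected at the application node.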
Recap examples:
Barendregt convention: $λx. x[x/z]$ is ambiguous ⟶ rename
Typing rules
Weakening and contraction are now built in: there is an explicit notation to denote them
and
Linearity: $Γ, Δ$ is well-defined only if $Γ$ and $Δ$ do not share variables.
No contractions anymore in the proof nets, as there are no shared variables in the contexts.
Equations
Contrary to proof nets, where many things happen in parallel, we now need to sequentialize.
Rule $A$ for proof nets becomes:
And equality of proof nets is made algebraic:
The fact that wires commute:
Concurrency of parallel contractions:
etc…
Commutativity of substitution:
etc…
Subsystem $x$
Rewriting rules to deal with explicit substitutions.
As terms are linear, we only have two $App$ rules now.
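Why only two: in a linear term the substituted variable occurs exactly once, so it sits in exactly one of the two subterms of an application. A sketch, with rule shapes assumed from the standard presentation (not copied from the lecture):

```latex
% If x \in fv(u): \quad (u\,v)[x/w] \longrightarrow u[x/w]\;v
% If x \in fv(v): \quad (u\,v)[x/w] \longrightarrow u\;v[x/w]
% No third rule is needed: linearity rules out x occurring in both subterms.
```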
Example:
NB: in these rules,
 all the substitutions go down
 all the weakenings go up
 all the contractions go down
In $Cont1$: $Δ$ and $\Pi$ are fresh variables. $u_1 = R^Φ_Δ(u)$, where $Φ = fv(u)$; $R^Φ_Δ$ is a meta-operation meaning: “make a copy of $u$ in which the free variables of $u$ have been renamed to $Δ$”.
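A hedged sketch of such a renaming meta-operation (the tuple representation and the function name are mine; only the plain λ-constructors are covered):

```python
# Terms are tuples ("var", x) | ("app", t, u) | ("lam", x, t).
# rename_free copies the term, renaming its *free* variables according to ren.
def rename_free(t, ren):
    tag = t[0]
    if tag == "var":
        return ("var", ren.get(t[1], t[1]))
    if tag == "app":
        return ("app", rename_free(t[1], ren), rename_free(t[2], ren))
    if tag == "lam":
        x, body = t[1], t[2]
        # the bound variable is untouched: only free occurrences are renamed
        inner = {y: z for y, z in ren.items() if y != x}
        return ("lam", x, rename_free(body, inner))
    raise ValueError(t)
```

$Cont1$ would then use two such copies: $u_1$ renamed to $Δ$ and $u_2$ renamed to $\Pi$.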
Notations: $x$: the system pushing substitutions down; $t$: the system handling weakenings and contractions
Only rule “erasing” variables: $Weak1$; but the free variables of $u$ are actually not discarded ⟹ free variables are preserved (weakenings keep track of “erased” variables)
$xt = x ∪ t$ is convergent (terminating and confluent) ⟶ it defines a function: a term in normal form is a term where all the weakenings are up and all the contractions are down.