# Introduction

Paul-André Melliès (mellies at irif.fr)

Mathematics of programming languages ⟶ can we develop a mathematical theory of programs? (like theoretical physics, for instance)

It started during the 40’s:

• at the beginning: huge machines

• difference hardware/software: by degrees, software became separated from hardware, as an abstraction
• computer vs calculator (at the beginning): a computer can compute and remember a little, so as to start the next calculation with this remainder, whereas a calculator just calculates without remembering anything
• moreover: you can control a computer, not a calculator
• in the 40’s: women were the programmers, gluing pieces of programming together
• in the 60’s: idea of programming languages with procedures ⟶ programs can call one another, compose, etc. ⟹ higher-order functionals: functions that can take functions as input and output another function

Ex of a functional:

$eval: \begin{cases} (ℝ ⟶ ℝ) × ℝ ⟶ ℝ \\ f , a \mapsto f(a) \end{cases}$

a program can call a procedure
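As a minimal sketch (mine, not from the lecture, using Python as the metalanguage), the $eval$ functional above is just application of its first argument to its second:

```python
def eval_(f, a):
    """The higher-order functional eval : (R -> R) x R -> R."""
    return f(a)

def square(x):
    return x * x

print(eval_(square, 3.0))  # applies square to 3.0, giving 9.0
```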

⟶ Christopher Strachey

Ex: a sequential program:

$sum: \begin{cases} ℕ × ℕ ⟶ ℕ \\ a , b \mapsto a + b \end{cases}$

how to implement it?

1. left-to-right algorithm
2. right-to-left algorithm

Game semantics of the program:

$lrsum: \begin{cases} ℕ × ℕ ⟶ ℕ \\ \underbrace{q_2}_{\substack{\text{question on the first argument} \\ ⟶ \text{the environment answers } 5_3}}, \underbrace{q_4}_{\substack{\text{question on the second argument} \\ ⟶ \text{the environment answers } 7_5}} \mapsto \underbrace{q_1}_{\substack{\text{initial question from the environment} \\ ⟶ \text{the program finally answers } 12_6}} \end{cases}$

NB: subscripts are meant to indicate the order of execution

in practice: the values $5$ and $7$ are put on the stack
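A hypothetical sketch (not from the lecture) of the two algorithms: each argument is supplied as a thunk whose forcing is the "question" move, and a log records the order of interrogation, mirroring the game-semantics play:

```python
log = []

def ask(name, value):
    """Return a thunk; calling it is the move 'question name?' answered by value."""
    def thunk():
        log.append(name)
        return value
    return thunk

def lrsum(a, b):
    """Left-to-right: question the first argument, then the second."""
    return a() + b()

def rlsum(a, b):
    """Right-to-left: question the second argument, then the first."""
    n = b()
    m = a()
    return m + n

print(lrsum(ask("a", 5), ask("b", 7)), log)  # 12, questions asked in order a, b
log.clear()
print(rlsum(ask("a", 5), ask("b", 7)), log)  # 12, questions asked in order b, a
```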

Problem: the order of execution looks like operational computation, but it can be turned into a mathematical function (that is the whole point of programming languages): with continuations.

Continuations: in order to make a difference between left-to-right and right-to-left on the cartesian product $ℕ × ℕ$, a mathematical completion of $ℕ$ can do the trick:

$ℕ$ is turned into $¬¬ℕ$, where $¬ A ≝ A ⟹ α$ for a type variable $α$.

Let us fix a type $\bot$ (thought of as $false$)

$¬ A ≝ A ⟹ \bot$

Maybe now:

$lrsum: ¬¬ℕ × ¬¬ℕ ⟶ ¬¬ℕ$

the order of execution can be thought of as a function

In constructive logic:

$φ ⟹ ¬¬φ$

but

$¬¬φ \not⟹ φ$

It is the case in classical logic: there is a long history of retrieving the computational content of classical logic.
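The constructive direction $φ ⟹ ¬¬φ$ is realized by a very simple program. A sketch (reading $¬A$ as $A ⟹ X$, in Python):

```python
def unit(x):
    """phi -> ((phi -> X) -> X): double-negation introduction.
    Just feed the value to the continuation."""
    return lambda k: k(x)

# The converse direction ((phi -> X) -> X) -> phi has no such program
# in general: a value of that type need not ever call its continuation
# on an actual phi.

print(unit(5)(lambda n: n + 1))  # the continuation receives 5, returns 6
```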

$callcc$: invented in Scheme (a dialect of Lisp) as a hack to capture the current continuation ⟶ so as to come back to where one was.

⟹ it implements the reasoning by contradiction
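Python has no native $callcc$, but a rough sketch (my own, covering only the *escaping* use of continuations, not the full generality of call/cc) can emulate it with exceptions:

```python
def callcc(f):
    """Call f with (an escaping approximation of) the current continuation."""
    class Escape(Exception):
        pass
    def k(value):
        e = Escape()
        e.value = value
        raise e  # invoking the continuation jumps back to the callcc site
    try:
        return f(k)
    except Escape as e:
        return e.value

# Using the continuation to abort a computation early:
result = callcc(lambda k: 1 + k(41))  # k(41) discards the pending "1 +"
print(result)  # prints 41
```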

    graph {
      l [label="logics / proofs"];
      lam [label="lambda-calculus"];
      programming -- l;
      programming -- lam;
      l -- lam;
    }


Back to $¬¬ℕ × ¬¬ℕ ⟶ ¬¬ℕ$:

$¬¬ ℕ = (ℕ ⟶ X) ⟶ X$

Ex:

$δ_5: \begin{cases} (ℕ ⟶ X) ⟶ X \\ f \mapsto f(5) \end{cases} ≝ λf. f(5) ≝ \underbrace{δ_5}_{\text{Dirac function}}$
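A one-line sketch (mine) of this "Dirac" element of $¬¬ℕ = (ℕ ⟶ X) ⟶ X$:

```python
def delta(n):
    """Turn the number n into the functional f |-> f(n)."""
    return lambda f: f(n)

delta5 = delta(5)
print(delta5(lambda m: m * 2))  # the continuation receives 5, returns 10
```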

Dirac function: there is a relationship between $¬¬ℕ$ and distributions on $ℕ$.

$lrsum: \big((ℕ ⟹ X) ⟹ X\big) × \big((ℕ ⟹ X) ⟹ X\big) ⟶ (ℕ ⟹ X) ⟹ X$

that is:

$\begin{cases} \big((ℕ ⟹ X) ⟹ X\big) × \big((ℕ ⟹ X) ⟹ X\big) × (ℕ ⟹ X) ⟶ X \\ φ, ψ, k \mapsto φ\big( λm. ψ(λn. k(n+m))\big) \end{cases} ≝ λφ, ψ, k. \underbrace{φ}_{\text{first call}}\big( λm. \underbrace{ψ}_{\text{second call}}(λn. \underbrace{k(n+m)}_{X})\big)$

• $k$: seen as a function waiting for a natural number, which is then mapped to what will come next (ex: $X$ contains all the possible configurations of the program afterwards)

• we always do that in our daily routine: we try to guess what will happen after a given action (“what will he say if I do that?”)

Similarly:

$rlsum: \big((ℕ ⟹ X) ⟹ X\big) × \big((ℕ ⟹ X) ⟹ X\big) × (ℕ ⟹ X) ⟶ X ≝ λφ, ψ, k. ψ \big( λm. φ (λn. k(n+m))\big)$
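The two CPS terms above can be run directly. A sketch (mine, in Python): $φ$ and $ψ$ inhabit $(ℕ ⟹ X) ⟹ X$, $k$ is the continuation $ℕ ⟹ X$, and a side effect in $φ$ and $ψ$ makes the order of execution observable:

```python
def lrsum(phi, psi, k):
    """First call phi, then psi, then resume with k(n + m)."""
    return phi(lambda m: psi(lambda n: k(n + m)))

def rlsum(phi, psi, k):
    """First call psi, then phi, then resume with k(n + m)."""
    return psi(lambda m: phi(lambda n: k(n + m)))

trace = []

def arg(name, value):
    """A doubly-negated natural number that logs when it is interrogated."""
    def phi(f):
        trace.append(name)
        return f(value)
    return phi

print(lrsum(arg("a", 5), arg("b", 7), lambda r: r), trace)  # 12, order ['a', 'b']
trace.clear()
print(rlsum(arg("a", 5), arg("b", 7), lambda r: r), trace)  # 12, order ['b', 'a']
```

The order of evaluation is now encoded in the *shape of the term*, not in an operational convention of the machine.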

Partial evaluation: if I build an optimized interpreter in which I plug a program, I get a compiled program as output.

We turned an automata-theoretic algorithm into a $λ$-term. This approach can be very successfully generalized.

Another example: how to “blur” databases so that one cannot access all of the personal information ⟶ monadic approach / $λ$-calculus-related

Mnemoid: a set with a 0-1 (hidden) register.

# Category theory

A graph:
• a set of vertices $V$
• a set of edges $E$
• two functions $source, target: E ⟶ V$ (often written $\partial_0, \partial_1$)

A (small) category:

a graph of objects (vertices) and arrows/morphisms (edges) equipped with a composition $\circ$, and an identity arrow $1_A$ for each object $A$ such that:

• Associativity of the composition: $(w \circ v) \circ u = w \circ (v \circ u)$
• Identity law: $u \circ 1_A = u = 1_B \circ u$ for every $u: A ⟶ B$

If the sets of objects and arrows are replaced by proper classes, one obtains the general definition (of a possibly large category).

NB: the graph $A \overset{u}{⟶} B \overset{v}{⟶} C \overset{w}{⟶} D$ can be seen as a 3-simplex, once the composite arrows are added.

Associativity canonically fills the 3-dimensional hole between the two sides $(w \circ v) \circ u$ and $w \circ (v \circ u)$.

Beautiful view: algebra is the art of filling holes.

## Examples of categories

1. Preorder category: categories where there is at most one arrow between any two objects: every preorder defines such a category, and conversely (provided that the category is small); such small categories are exactly the preorders (posets, if isomorphic objects are moreover equal)
• preorder: a set equipped with a reflexive and transitive binary relation $≤$
• all the cycles of arrows consist of isomorphisms

Categories generalize the notion of order: very important, since many concepts of order theory can be lifted to category theory.

The notion of greatest lower bound is such a concept ⟶ it’s the product of objects.

The cartesian product of two sets is also a product in the category $Set$.

(cf. the “category theory” section of my L3 internship report)

Cartesian product of $A, B : \vert 𝒞 \vert$:

it is an object $A×B$ and two (projection) maps $π_1: A × B ⟶ A, π_2: A × B ⟶ B$ such that for every object $C$ and maps $f: C ⟶ A, g: C ⟶ B$, there exists a unique map $h: C ⟶ A × B$ such that $\begin{cases} π_1 \circ h = f \\ π_2 \circ h = g \end{cases}$
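This universal property can be checked concretely in $Set$ on a finite example. A sketch (mine): the mediating map $h$ is the pairing $\langle f, g \rangle$, and the two equations hold pointwise:

```python
def pi1(p):
    """First projection A x B -> A."""
    return p[0]

def pi2(p):
    """Second projection A x B -> B."""
    return p[1]

def pairing(f, g):
    """The unique h : C -> A x B with pi1 . h = f and pi2 . h = g."""
    return lambda c: (f(c), g(c))

C = range(4)
f = lambda c: c + 1      # f : C -> A
g = lambda c: c * c      # g : C -> B
h = pairing(f, g)

# Check the two commutation equations on every element of C:
assert all(pi1(h(c)) == f(c) and pi2(h(c)) == g(c) for c in C)
print([h(c) for c in C])
```

Uniqueness holds because any $h$ satisfying the two equations must send $c$ to the pair $(f(c), g(c))$.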

2. Monoid category: categories with exactly one object: every monoid defines such a category (arrows = elements of the monoid, composition = multiplication), and conversely.
