Lecture 1: Speedup Theorem
Outline
We’ll see complexity classes, such as:
- SPACE classes
- Polynomial time hierarchy
- Probabilistic classes
  - TMs that are probabilistic instead of being non-deterministic: acceptance happens with some probability.
  - Non-deterministic: you want one accepting run, vs. probabilistic: you accept with a certain probability ⟶ you either do many runs to get the right answer with high probability, or you run on many machines.
Reminder: Turing Machines (TM)
Useful for low-level complexity: they match the idea of a theoretical computer.
A TM $M$ has:
- finitely many states $Q$, among which:
  - a starting state $q_0$
  - final (accepting) states
  - rejecting states
- a finite alphabet, denoted by $\Sigma$
  - the symbols $B$ (resp. $\square$) are used to express the fact that we're at the beginning (resp. at the end) of the written tape
- a reading/writing head
- rules of behavior: a transition function $\delta : Q \times \Sigma \to Q \times \Sigma \times \{\leftarrow, \rightarrow\}$
  - based on the current state and what the head sees, the TM can write a symbol, change the head's direction and change the state

A configuration of $M$ is something like $u\,q\,v$: for instance, $u\,q\,av$ means that we're on the letter $a$, in the state $q$, with $uav$ written on the tape.
OR: we could alternatively write a configuration as a triple $(q, w, i)$, where $w$ is the tape content and $i$ the position of the head.
From a configuration $c$, the rules of behavior give the next configuration $c'$, written $c \to c'$.
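To make this concrete, here is a minimal Python sketch of a deterministic one-tape TM, assuming the transition function is a dictionary keyed by (state, symbol); the names (`step`, `run`, `BLANK`) and the handling of the tape ends are my own simplifications, not the lecture's formalism (the beginning marker $B$ is omitted).

```python
# Minimal sketch of a deterministic one-tape TM (illustrative names, not the lecture's).
# delta maps (state, symbol) -> (new_state, written_symbol, move), with move in {-1, +1}.
# We assume delta is total and never moves the head left of position 0.

BLANK = "□"

def step(delta, state, tape, head):
    """Apply one rule of behavior to the configuration (state, tape, head position)."""
    new_state, written, move = delta[(state, tape[head])]
    tape[head] = written
    head += move
    if head == len(tape):        # the head walks onto fresh (blank) tape
        tape.append(BLANK)
    return new_state, tape, head

def run(delta, q0, accepting, rejecting, word):
    """Run the TM on `word` until it accepts or rejects (it may also loop forever)."""
    state, tape, head = q0, list(word) + [BLANK], 0
    while state not in accepting and state not in rejecting:
        state, tape, head = step(delta, state, tape, head)
    return state in accepting
```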
More general: Non-deterministic TM
Instead of having a function $\delta : Q \times \Sigma \to Q \times \Sigma \times \{\leftarrow, \rightarrow\}$, we have a relation $\Delta \subseteq (Q \times \Sigma) \times (Q \times \Sigma \times \{\leftarrow, \rightarrow\})$.
Going from a configuration, several next configurations are now possible.
- A run of the TM is a sequence of configurations $c_0 \to c_1 \to c_2 \to \cdots$ (it can be infinite), where $c_0$ is the initial configuration on the input $w$ of the TM.
- The run accepts/recognizes $w$ if it reaches a final (accepting) state ($w$ is then said to be accepted/recognized).
For a non-deterministic TM, $w$ is accepted iff there exists a run that accepts $w$.
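One way to read "accepted iff there exists an accepting run" operationally is to explore all runs of the machine. A rough sketch, assuming the relation `Delta` maps (state, symbol) to a set of triples, with an arbitrary step budget since runs can be infinite (names are mine):

```python
# Sketch: acceptance for a non-deterministic TM by exploring all runs (breadth-first).
# Delta maps (state, symbol) -> set of (new_state, written_symbol, move) triples.
from collections import deque

BLANK = "□"

def nd_accepts(Delta, q0, accepting, word, budget=10_000):
    """True iff some run reaches an accepting state within `budget` explored configurations."""
    start = (q0, tuple(word) + (BLANK,), 0)
    seen, queue = {start}, deque([start])
    while queue and budget > 0:
        budget -= 1
        state, tape, head = queue.popleft()
        if state in accepting:
            return True                       # one accepting run is enough
        for new_state, written, move in Delta.get((state, tape[head]), ()):
            new_tape = list(tape)
            new_tape[head] = written
            new_head = head + move
            if new_head < 0:
                continue                      # fell off the beginning of the tape
            if new_head == len(new_tape):
                new_tape.append(BLANK)
            cfg = (new_state, tuple(new_tape), new_head)
            if cfg not in seen:
                seen.add(cfg)
                queue.append(cfg)
    return False
```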
One can also have several tapes; as it happens:
- there is one input tape, $k$ working tapes, and one output tape
- the length of the input doesn't count as used space; it is supposed given
- one never writes on the input tape
- one never reads on the output tape, one only writes (the head can only go to the right, like on a printer)
Not moving the head
- for one tape, allowing it doesn't change anything
- for several tapes, it's not trivially the case: if you want to stay put on the first tape only and you have two tapes, you can't, for parity reasons.
- but all these models are "more or less equivalent": when it comes to asymptotic complexity, it doesn't change anything.
Space complexity
Let $f : \mathbb{N} \to \mathbb{N}$.
- A language $L$ is in $\mathrm{SPACE}(f)$ if there exists a TM $M$ s.t.:
  - $M$ recognizes $L$ (so $L$ is on the fixed alphabet of $M$)
  - $M$ is deterministic
  - for every word $w$, $M$ halts using at most $f(|w|)$ space (that is, we add up the space used on the working tapes)
- A language $L$ is in $\mathrm{NSPACE}(f)$ if there exists a TM $M$ s.t.:
  - $M$ recognizes $L$ (so $L$ is on the fixed alphabet of $M$)
  - $M$ is NON-deterministic
  - for every word $w$, $M$ halts using at most $f(|w|)$ space (that is, we add up the space used on the working tapes)
NB: a finite automaton can be seen as a TM using NO working space whatsoever (since everything is stored in the states).
So regular languages are in $\mathrm{SPACE}(0)$.
And any TM using no working space is a 2-way automaton (it can read back and forth on the input tape), and it is as expressive as a finite automaton.
So $\mathrm{SPACE}(0) = \mathrm{REG}$.
And even $\mathrm{NSPACE}(0) = \mathrm{REG}$.
NB: pebble automata are more expressive than finite automata.
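As a tiny illustration of "everything is stored in the states", here is a DFA in Python (an example of my own, accepting words over {a, b} with an even number of a's): it makes one left-to-right pass over the input and keeps no working memory besides the current state.

```python
# A finite automaton as a zero-working-space machine: only the current state is kept.
delta = {("even", "a"): "odd",  ("even", "b"): "even",
         ("odd",  "a"): "even", ("odd",  "b"): "odd"}

def dfa_accepts(word, start="even", accepting=("even",)):
    state = start
    for letter in word:          # a single left-to-right pass over the input tape
        state = delta[(state, letter)]
    return state in accepting

assert dfa_accepts("abba") and not dfa_accepts("ab")
```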
Speedup Theorem
Theorem: for every constant $c > 0$, $\mathrm{SPACE}(c \cdot f) = \mathrm{SPACE}(f)$ (constant factors don't matter for space).
Proof:
Let $M$ recognize $L$ using at most $c \cdot f(|w|)$ working cells. Build a TM $M'$ whose working alphabet is $\Sigma^k$ for some $k \geq c$: one cell of $M'$ encodes a block of $k$ cells of $M$, and the position inside the current block is remembered in the states. $M'$ simulates $M$ step by step and uses at most $f(|w|)$ working cells.
Likewise: by also grouping several steps of $M$ into one step of $M'$, the same idea works for time.
Theorem: for every constant $c > 0$, $\mathrm{TIME}(c \cdot f) \subseteq \mathrm{TIME}(f + O(n))$ (the $O(n)$ accounts for reading and re-encoding the input).
Both of these theorems are true for non-deterministic TMs as well (put an 'N' before each class).
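The block encoding behind the proof can be pictured in a few lines. This is only my illustration of the idea, not the lecture's construction: $k$ cells of the original tape are packed into one cell over the alphabet $\Sigma^k$, so a tape of length $c \cdot f(n)$ fits into at most $f(n)$ cells once $k \geq c$.

```python
# Illustration of the block encoding used in the compression/speedup argument:
# k original cells become one cell (a k-tuple) over the enlarged alphabet.
BLANK = "□"

def compress(tape, k):
    """Group the tape into blocks of k symbols, padding the last block with blanks."""
    padded = tape + [BLANK] * (-len(tape) % k)
    return [tuple(padded[i:i + k]) for i in range(0, len(padded), k)]

def decompress(blocks):
    return [symbol for block in blocks for symbol in block]

tape = list("0110101")
blocks = compress(tape, 3)                    # 7 cells -> 3 cells over a bigger alphabet
assert decompress(blocks)[:len(tape)] == tape
```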
In logarithmic space, one can:
- count the length of the input
- recognize, for instance, a language like $\{a^n b^n \mid n \geq 0\}$, a priori in $c \cdot \log n$ space for some constant $c$ (but with the speedup theorem, there's no problem)
- solve reachability (a.k.a. GAP, the Graph Accessibility Problem): is there a path from one node to another in a directed graph? Non-deterministically, this fits in $\log n$ space, where $n$ is the size of the graph.
But: we have to
- use pointers to refer to nodes
- use a decrementing counter of the number of edges to avoid cycles
- use non-determinism to guess a path (see the sketch below)
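Here is a rough sketch of those three ingredients, assuming the graph is given as a set of directed edges; the "guess" is simulated with a random choice, so this only shows what a single non-deterministic run looks like (the encoding and names are mine):

```python
# One non-deterministic run for reachability: the working memory is just a pointer to
# the current node and a decrementing counter, i.e. O(log n) bits.
import random

def guess_path(edges, s, t):
    current = s
    counter = len(edges)                       # bound on the length of a simple path
    while counter > 0:
        if current == t:
            return True                        # this run accepts
        successors = [v for (u, v) in edges if u == current]
        if not successors:
            return False                       # dead end: this run rejects
        current = random.choice(successors)    # the non-deterministic "guess"
        counter -= 1
    return current == t
```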
We can even do that deterministically, in $\mathrm{SPACE}((\log n)^2)$:
```
# Is there a path from s to t of length ≤ 2^p?
reach(s, t, p) = | yes                                                  if s = t or s → t
                 | no                                                   if p = 0
                 | OR_{v node} ( reach(s, v, p-1) ∧ reach(v, t, p-1) )  otherwise

call reach(s, t, ⌈log₂(number of edges)⌉)
```
The recursion depth is logarithmic, and each level of the recursion only stores a constant number of pointers and counters, i.e. logarithmic space, hence $O((\log n)^2)$ space overall.
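For reference, a directly runnable version of this recursion, with the graph encoded as a set of edges (my encoding); on a TM, each level of the recursion keeps only the three parameters s, t, p, which is where the $O((\log n)^2)$ bound comes from.

```python
# Runnable version of reach(s, t, p): is there a path from s to t of length <= 2**p?
from math import ceil, log2

def reach(edges, s, t, p):
    if s == t or (s, t) in edges:
        return True
    if p == 0:
        return False
    nodes = {u for edge in edges for u in edge}
    return any(reach(edges, s, v, p - 1) and reach(edges, v, t, p - 1) for v in nodes)

edges = {(1, 2), (2, 3), (3, 4)}
assert reach(edges, 1, 4, ceil(log2(len(edges))))        # 1 -> 2 -> 3 -> 4
assert not reach(edges, 4, 1, ceil(log2(len(edges))))
```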