Computational Geometry Learning: Introduction
Teacher: Marc Glisse
- Computational geometry and topology
- Triangulations, simplicial complexes
- Algorithms in high dimensions
- Shape reconstruction
- Geometric inference
- Topological data analysis
Convex sets
- Stable under:
  - intersection
  - increasing union
- Existence of supporting lines (hyperplanes) at every point of the boundary
Convex hull (CH) of a set of points: the set of barycenters of those points with non-negative weights, or equivalently the intersection of all convex sets containing those points
Helly theorem
We’re in $ℝ^d$.
Radon: Consider $d+2$ points. There exists a partition $P \sqcup Q$ of these points such that \(CH(P) ∩ CH(Q) ≠ ∅\)
Proof: $d+2 > d+1$, so the lifted points $(p_i, 1)$ (each point with an extra coordinate equal to $1$ appended) are linearly dependent in $ℝ^{d+1}$:
\[\sum\limits_{ i } a_i p_i = 0\\ \sum\limits_{ i } a_i = 0\]The $a_i$ are not all zero and sum to $0$, so both signs occur; separating them:
\[\sum\limits_{ a_i > 0 } a_i p_i = \sum\limits_{ a_i < 0 } (-a_i) p_i\\ \sum\limits_{ a_i > 0 } a_i = \sum\limits_{ a_i < 0 } (-a_i)\\\]So
\[\frac{\sum\limits_{ a_i > 0 } a_i p_i}{\sum\limits_{ a_i > 0 } a_i} = \frac{\sum\limits_{ a_i < 0 } (-a_i) p_i}{\sum\limits_{ a_i < 0 } (-a_i)}\]and this point is a barycenter of each of the two sets of points (the one on the left and the one on the right), so it lies in the convex hull of both sets.
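As a quick sanity check, here is a minimal Python sketch (assuming NumPy; `radon_partition` is an illustrative name) that finds such a partition by computing a non-trivial dependency of the lifted points:

```python
import numpy as np

def radon_partition(points):
    """Given d+2 points in R^d (rows of `points`), return two disjoint
    subsets P, Q and a point c lying in CH(P) ∩ CH(Q)."""
    pts = np.asarray(points, dtype=float)
    n, d = pts.shape
    assert n == d + 2
    # Lift: append a coordinate equal to 1, then find a non-trivial
    # dependency  sum_i a_i (p_i, 1) = 0  via the null space (SVD).
    lifted = np.hstack([pts, np.ones((n, 1))])        # shape (d+2, d+1)
    _, _, vt = np.linalg.svd(lifted.T)
    a = vt[-1]                                        # null-space vector
    pos, neg = a > 1e-12, a < -1e-12
    c = pts[pos].T @ a[pos] / a[pos].sum()            # common barycenter
    return pts[pos], pts[neg], c

# Example in the plane (d = 2, so 4 points):
P, Q, c = radon_partition([[0, 0], [2, 0], [0, 2], [1, 1]])
print(P, Q, c)   # e.g. {(2,0), (0,2)} vs {(1,1)}, meeting at c = (1, 1)
```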
Helly theorem: Consider $n>d$ convex sets in $ℝ^d$.
If every $d+1$ of them have a common point, then all $n$ of them have a common point.
Proof (case $n = d+2$; the general case follows by induction on $n$): we have $d+2$ convex sets $S_1, …, S_{d+2}$
For all $i$, pick a
\[p_i ∈ \bigcap\limits_{j≠i} S_j\](it exists by hypothesis)
By Radon applied to $p_1, …, p_{d+2}$, there is a partition $P \sqcup Q$ of these points and a point
\[c ∈ CH(P) ∩ CH(Q)\]We will show that $c ∈ S_i \quad ∀ i$
For all $i$:
\[∀ j ≠ i, \qquad p_j ∈ S_i\]So if $p_i ∈ Q$, then $∀ p ∈ P, p ≠ p_i$, so $p ∈ S_i$, and
\[P ⊆ S_i\]so
\[CH(P) ⊆ S_i\]so $c ∈ CH(P) ⊆ S_i$. (If instead $p_i ∈ P$, the symmetric argument gives $Q ⊆ S_i$ and $c ∈ CH(Q) ⊆ S_i$.)
Carathéodory:
\[p ∈ CH(P) ⟹ p ∈ CH(d+1 \text{ points of } P)\]
Graham scan algorithm: to compute the upper part of the convex hull of a set of points in 2D (the lower part is symmetric).
Complexity: $O(n \log n)$ (sorting the points) + $O(n)$ (scan) + $O(n)$ (amortized backtracking)
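A minimal Python sketch of the scan for the upper hull (the function name and the sample points are just for the example):

```python
def upper_hull(points):
    """Upper convex hull of 2D points, listed from left to right."""
    pts = sorted(set(map(tuple, points)))              # O(n log n) sort by x, then y
    def cross(o, a, b):                                # z-component of (a-o) x (b-o)
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    hull = []
    for p in pts:
        # Backtrack: pop while the last three points do not make a right turn.
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    return hull

print(upper_hull([(0, 0), (1, 2), (2, 1), (3, 3), (4, 0)]))
# -> [(0, 0), (1, 2), (3, 3), (4, 0)]
```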
Lower bound for CH computation: $Ω(n \log n)$, because if you know how to compute the convex hull, you know how to sort numbers: consider the CH of the points $(x_i, x^2_i)$ on a parabola; reading its vertices in order gives the $x_i$ sorted.
Jarvis algorithm/gift wrapping: in $O(nh)$, where $h$ is the number of hull vertices, so nearly linear when $h$ is small; otherwise the Graham scan may be more advantageous
- In a $d$-dimensional ball, pick $n$ points uniformly at random: how many points land on the CH? ⟶ about $n^{\frac{d-1}{d+1}}$ in expectation
- Same in a $d$-dimensional cube ⟶ about $\log^{d-1} n$ in expectation
Divide and conquer: $O(n \log n)$
Quickhull: $O(n^2)$ in worst case
Timothy Chan’s algorithm: $(n + h \frac n m) \log m$ ⟶ with $m = h$: $2n \log h$
Linear programming: in $O(n)$ in dimension $d$, but warning: the constant is exponential in the dimension
Ambient isotopy
Ambient isotopy: stronger notion than homeomorphism
Given $X’, Y’ ⊆ X$ and a continuous map
\[F: X × [0, 1] ⟶ X\]such that
- $F_t ≝ F(\cdot, t) : X ⟶ X$ is a homeomorphism for each $t$
- $F_0 = Id$
- $F_1(X’) = Y’$
then $X’$ and $Y’$ are ambient isotopic (and thus homeomorphic)
⟶ this notion depends a lot on the ambient space (which is not the case for homeomorphism)
Homotopy
Weaker notion: homotopy equivalence
- A point is homotopy equivalent to a filled circle (a disk):
\[\lbrace 0 \rbrace \sim_h \lbrace (x, y) \; \mid \; x^2 + y^2 ≤ 1\rbrace\]- via the homotopy $H(t, x) = tx$ for $0 ≤ t ≤ 1$
- A circle is homotopy equivalent to a solid (filled, “donut”-shaped) torus
Deformation retract
$Y ⊆ X$, and $H: [0, 1] × X ⟶ X$ continuous with
- $H(0, x) = x$ for all $x ∈ X$
- $H(1, X) ⊆ Y$
- $H(t, y) = y$ for all $y ∈ Y$ and all $t$
This is a special case in which we know we have a homotopy equivalence between $X$ and $Y$.
Isometry
Bijection that preserves distances
Metric spaces
Euclidean distance vs. geodesic distance
If I move my arms, the Euclidean distance varies a lot, but the geodesic distance stays the same.
Hausdorff distance
- Hausdorff distance:
- \[A, B ⊆ ℝ^d \qquad d_H(A, B) \; ≝ \; \max(\sup_{a ∈ A} d(a, B), \sup_{b ∈ B} d(b, A))\]
NB: if you define the distance functions $d_A, d_B: ℝ^d ⟶ ℝ$ (with $d_A(x) ≝ d(x, A)$), the Hausdorff distance is equal to \(\Vert d_A - d_B \Vert_∞\)
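For finite point sets the definition can be evaluated directly; a minimal sketch assuming NumPy (`hausdorff` is an illustrative name):

```python
import numpy as np

def hausdorff(A, B):
    """Hausdorff distance between two finite point sets in R^d."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise distances
    d_A_to_B = D.min(axis=1).max()   # sup_{a in A} d(a, B)
    d_B_to_A = D.min(axis=0).max()   # sup_{b in B} d(b, A)
    return max(d_A_to_B, d_B_to_A)

print(hausdorff([[0, 0], [1, 0]], [[0, 0], [3, 0]]))   # -> 2.0
```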
Gromov-Hausdorff distance
Given two metric spaces $A$ and $B$: for every metric space $C$ and all isometric embeddings $f: A ⟶ C$, $g: B ⟶ C$, we can take the infimum of all the Hausdorff distances
\[d_H(f(A), g(B))\]Leading to the definition:
\[d_{GH} = \inf_{C, f, g} d_H(f(A), g(B))\]But this is cumbersome (and delicate: we are quantifying over all metric spaces $C$, which do not form a set). Another way to define it: we will define a notion of correspondence.
- Correspondence: a relation $C ⊆ A × B$ such that
  - for all $a ∈ A$, there exists $b ∈ B$ such that $(a, b) ∈ C$
  - for all $b ∈ B$, there exists $a ∈ A$ such that $(a, b) ∈ C$
- $ε$-correspondence: a correspondence $C$ such that whenever $(a, b), (a’, b’) ∈ C$,
\[\vert d(a, a') - d(b, b') \vert ≤ ε\]
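The smallest $ε$ for which a given correspondence between two finite metric spaces is an $ε$-correspondence (its distortion) can be computed by brute force; a sketch with illustrative names:

```python
from itertools import product

def distortion(C, dA, dB):
    """Smallest eps such that the correspondence C (a collection of pairs (a, b))
    is an eps-correspondence, given the two distance functions dA and dB."""
    return max(abs(dA(a, a2) - dB(b, b2)) for (a, b), (a2, b2) in product(C, repeat=2))

# Two 3-point metric spaces on the real line, related by the obvious correspondence:
A, B = [0.0, 1.0, 2.0], [0.0, 1.0, 2.5]
C = list(zip(A, B))
print(distortion(C, lambda x, y: abs(x - y), lambda x, y: abs(x - y)))  # -> 0.5
```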
Good definition of Gromov-Hausdorff distance:
\[d_{GH} \; ≝ \; \inf \lbrace ε \; \mid \; ∃ ε \text{-correspondence}\rbrace\]
Geometric simplices
- $k$-simplex $σ$: the convex hull of $k+1$ points of $ℝ^d$ that are affinely independent
Abstract (non-geometric) definition:
$K$ set of subsets of a set of points $P$ such that
- $∀ p ∈ P, \lbrace p \rbrace ∈ K$
- it is stable under subsets
It can be realized geometrically
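On finite data, the two conditions of the abstract definition can be checked directly; a minimal Python sketch (names are illustrative):

```python
from itertools import combinations

def is_abstract_complex(K, P):
    """K: collection of frozensets over the vertex set P."""
    K = set(K)
    has_vertices = all(frozenset({p}) in K for p in P)
    # Stability under (non-empty) subsets: every face of a simplex is a simplex.
    closed = all(frozenset(f) in K
                 for s in K for k in range(1, len(s))
                 for f in combinations(s, k))
    return has_vertices and closed

P = {0, 1, 2}
K = [frozenset(s) for s in [{0}, {1}, {2}, {0, 1}, {1, 2}]]
print(is_abstract_complex(K, P))   # -> True
```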
Nerve of a finite cover $𝒰 = \lbrace U_1, …, U_n \rbrace$
- Nerve of $𝒰$: the simplicial complex $K(𝒰)$ defined by
\[σ \; ≝ \; [U_{i_0}, …, U_{i_k}] ∈ K(𝒰) ⟺ \bigcap\limits_{j=0}^k U_{i_j} ≠ ∅\]
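For a finite cover by finite sets, the nerve can be computed by brute force over index subsets; a minimal sketch (illustrative names):

```python
from itertools import combinations

def nerve(cover):
    """Nerve of a finite cover given as a list of finite sets:
    {i_0, ..., i_k} is a simplex iff the U_{i_j} have a common point."""
    simplices = []
    for k in range(1, len(cover) + 1):
        for idx in combinations(range(len(cover)), k):
            if set.intersection(*(cover[i] for i in idx)):
                simplices.append(frozenset(idx))
    return simplices

U = [{1, 2}, {2, 3}, {3, 1}]           # pairwise intersections, but no triple one
print(sorted(map(sorted, nerve(U))))   # three vertices and three edges, no triangle
```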
Theorem (Nerve theorem, good cover): for a finite open cover of $X$ such that all the intersections of its elements are empty or contractible (called a good cover), $K(𝒰)$ is homotopy equivalent to $X$
cf. picture
Very fundamental property: it allows us to go from a continuous space to a discrete one that has the same topology (they are homotopy equivalent)
Torus:
- $x^2 + y^2 = 1$
- $z^2 + t^2 = 1$
(product of two circles)
\[x^2 + y^2 + z^2 + t^2 = 2\]⟶ so the points are on a 3-sphere of radius $\sqrt 2$ (in 4D)
Let $K_1 ⊆ K_2 ⊆ ⋯$ be an increasing sequence of simplicial complexes (a filtration, possibly indexed by $ℝ$), let $K_∞$ be the largest of them, and define:
\[filt(σ) = \inf \lbrace r \; \mid \; σ ∈ K_r\rbrace\]- Čech complex $C(P, r)$: the nerve of the open balls of radius $r$ centered at the points $p ∈ P$
NB: computing it requires minimum enclosing balls (checking whether a set of balls of radius $r$ has a common point amounts to comparing $r$ with the radius of the minimum enclosing ball of their centers)
- Rips complex $R(P, r)$: build the graph whose edges are the pairs of points whose open balls (of radius $r$) intersect, then fill each clique with the corresponding simplex (e.g. a clique of 3 points is filled with a 2-simplex, a triangle)
For the inclusion $R(P, r) ⊆ C(P, 2r)$: let $p_1, …, p_k$ be the vertices of a simplex $σ ∈ R(P, r)$; they satisfy \(d(p_i, p_j) ≤ 2r\)
Then \(p_1 ∈ B(p_i, 2r)\) for every $i$, so the balls of radius $2r$ around the $p_i$ have a common point and $σ ∈ C(P, 2r)$
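A brute-force sketch of the Rips construction on a small point set, using the convention above (an edge when the two balls of radius $r$ intersect, i.e. distance at most $2r$), with cliques filled in; names are illustrative:

```python
from itertools import combinations
from math import dist

def rips(points, r):
    """Vietoris-Rips complex at scale r: a subset of points is a simplex
    iff its points are pairwise within distance 2r (a clique of the graph)."""
    n = len(points)
    simplices = []
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            if all(dist(points[i], points[j]) <= 2 * r
                   for i, j in combinations(idx, 2)):
                simplices.append(frozenset(idx))
    return simplices

pts = [(0, 0), (1, 0), (0, 1)]
# At r = 0.75 all pairwise distances are <= 1.5, so the triangle gets filled in.
print(sorted(map(sorted, rips(pts, 0.75))))
```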
Triangulations
Star of a vertex $v$: the simplices that contain $v$ (completing them with all their faces gives the closed star $\overline{Star}$)
Link of a vertex: $\overline{Star} \backslash Star$, i.e. the simplices of the closed star that do not contain $v$
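On an abstract complex stored as a set of frozensets, the closed star and the link of a vertex follow directly from these definitions; a minimal sketch (illustrative names):

```python
def closed_star(K, v):
    """All faces (in K) of the simplices of K that contain the vertex v."""
    cofaces = [s for s in K if v in s]
    return {f for s in cofaces for f in K if f <= s}

def link(K, v):
    """Closed star minus the simplices that contain v themselves."""
    return {s for s in closed_star(K, v) if v not in s}

# A triangle together with its edges and vertices:
K = {frozenset(s) for s in [{0}, {1}, {2}, {0, 1}, {1, 2}, {0, 2}, {0, 1, 2}]}
print(sorted(map(sorted, link(K, 0))))   # -> [[1], [1, 2], [2]]
```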
A pure $k$-complex: a complex made up of $k$-dimensional simplices and their faces
Triangulation: a pure $k$-complex such that each facet ($(k-1)$-face) belongs to 1 or 2 $k$-simplices and
- the link of any $v ∈ K \backslash \partial K$ is a triangulated $(k-1)$-sphere
- the link of any $v ∈ \partial K$ is a triangulated $(k-1)$-ball
Other definition:
- Triangulation of a set of points: a simplicial complex built on those points such that the underlying space is the convex hull of the points
- Triangulation of $X$: a simplicial complex $K$ (an abstract triangulation) such that \(\underbrace{\vert K \vert}_{\text{underlying space}} ≅ X\)
Polytope: convex hull of a finite set of points
Moment curve:
\[f(t) = (t, t^2, …, t^d)\]
Consider the points $f(t_1), …, f(t_n)$; then \(f(t_1), …, f(t_{d/2})\)
(or any other $d/2$ of them: there are about $n^{d/2}$ possibilities, so the hull can have that many faces) lie on a common face of the convex hull of all these points.
Indeed, consider the polynomial
\[0 ≤ (X-t_1)^2 ⋯ (X-t_{d/2})^2 = X^d + α_{d-1} X^{d-1} + ⋯ + α_0\]Evaluated at a point $f(t)$ of the curve (with coordinates $x_j = t^j$), this reads $α_0 + α_1 x_1 + ⋯ + α_{d-1} x_{d-1} + x_d ≥ 0$, with equality exactly for $t ∈ \lbrace t_1, …, t_{d/2}\rbrace$: the corresponding hyperplane supports the polytope and the face it cuts out contains exactly those points.
Crust algorithm, Cocone algorithm (estimate a direction by computing poles for each Voronoi cell, and then discard edges that make an angle too large with this direction)
Voronoi vertices approximate the medial axis (the set of points that have at least two closest points on the shape)
Reach = the smallest distance between the medial axis and the object (the smaller the reach, the more points are needed to reconstruct the shape)
(cf. pictures)
Biased Randomized Insertion Order (BRIO)
Classroom examples of robustness problems in geometric computations: https://people.mpi-inf.mpg.de/~mehlhorn/ftp/classroomExamplesNonrobustness.pdf