Computational Geometry Learning: Introduction

Teacher: Marc Glisse

  • Computational geometry and topology
  • Triangulations, simplicial complexes
  • Algorithms in high dimensions
  • Shape reconstruction
  • Geometric inference
  • Topological data analysis

Convex sets

  • Stable under

    • intersection
    • increasing union
  • Existence of a supporting line at every point of the boundary

Convex hull (CH) of a set of points: the set of barycenters with non-negative weights (convex combinations) of those points, or equivalently the intersection of all convex sets containing those points.

Helly theorem

We’re in $ℝ^d$.

Radon: Consider $d+2$ points. There exists a partition $P \sqcup Q$ of these points such that CH(P) ∩ CH(Q) ≠ ∅

Proof: $d+2 > d+1$, so the points where you add a new coordinate equal to $1$ (at the end) are linearly dependent: there exist coefficients $a_i$, not all zero, with

$$\sum\limits_{ i } a_i p_i = 0, \qquad \sum\limits_{ i } a_i = 0$$

Rewrite the previous lines:

$$\sum\limits_{ a_i > 0 } a_i p_i = \sum\limits_{ a_i < 0 } (-a_i) p_i, \qquad \sum\limits_{ a_i > 0 } a_i = \sum\limits_{ a_i < 0 } (-a_i)$$


$$\frac{\sum\limits_{ a_i > 0 } a_i p_i}{\sum\limits_{ a_i > 0 } a_i} = \frac{\sum\limits_{ a_i < 0 } (-a_i) p_i}{\sum\limits_{ a_i < 0 } (-a_i)}$$

and this point is a barycenter (with non-negative weights) of each of the two sets of points (on the left and on the right), so it's part of the convex hull of each of these two sets.
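The proof above is constructive: solve the linear system for the $a_i$, split the indices by sign, and normalize. A minimal sketch in the plane ($d = 2$, so $4$ points), using exact rational arithmetic and assuming the points are in general position; the function name `radon_partition` is made up for illustration:

```python
from fractions import Fraction

def radon_partition(pts):
    """Radon in the plane: given d+2 = 4 points, find coefficients a_i with
    sum a_i = 0 and sum a_i p_i = 0 (not all zero), split indices by sign,
    and return the common point c in CH(P) cap CH(Q).
    A sketch assuming rational input and a nonsingular 3x3 subsystem."""
    d = 2
    # (d+1) x (d+2) system: one row of ones (sum a_i = 0), one row per coordinate.
    rows = [[Fraction(1)] * (d + 2)] + \
           [[Fraction(p[k]) for p in pts] for k in range(d)]
    # Fix a_{d+1} = 1 and solve the remaining (d+1) x (d+1) system (Gauss-Jordan).
    n = d + 1
    M = [r[:n] + [-r[n]] for r in rows]          # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    a = [M[i][n] / M[i][i] for i in range(n)] + [Fraction(1)]
    P = [i for i in range(d + 2) if a[i] > 0]    # positive coefficients
    Q = [i for i in range(d + 2) if a[i] <= 0]   # the rest
    s = sum(a[i] for i in P)
    # Barycenter of the P side; by the identity above it equals the Q side's.
    c = tuple(sum(a[i] * Fraction(pts[i][k]) for i in P) / s for k in range(d))
    return c, P, Q
```

For the square $(0,0), (2,0), (0,2), (2,2)$ the partition pairs the two diagonals, which cross at $(1,1)$.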

Helly theorem: Consider $n>d$ convex sets in $ℝ^d$.

If every $d+1$ of them have a common point, then all $n$ of them have a common point.

Proof (for $n = d+2$; the general case follows by induction on $n$): We have $d+2$ convex sets $S_1, …, S_{d+2}$

For all $i$, pick a

$$p_i ∈ \bigcap\limits_{j≠i} S_j$$

(it exists by hypothesis)

By Radon applied to the $p_i$, there exists a partition $P \sqcup Q$ and a point

$$c ∈ CH(P) ∩ CH(Q)$$

We will show that $c ∈ S_i \quad ∀ i$

For all $i$:

$$∀ j ≠ i, \qquad p_j ∈ S_i$$

So if $p_i ∈ Q$, then every $p ∈ P$ satisfies $p ≠ p_i$, hence $p ∈ S_i$, and

$$P ⊆ S_i$$

$$CH(P) ⊆ S_i$$

so $c ∈ CH(P) ⊆ S_i$. (If instead $p_i ∈ P$, the symmetric argument gives $c ∈ CH(Q) ⊆ S_i$.)


Carathéodory's theorem:

$$p ∈ CH(P) ⟹ p ∈ CH(d+1 \text{ points of } P)$$
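In the plane this says any hull point lies in a triangle spanned by $3$ of the points. A brute-force sketch that exhibits such a triangle by testing all $O(n^3)$ triples with orientation signs; the helper name `caratheodory_triangle` is hypothetical:

```python
from itertools import combinations

def caratheodory_triangle(pts, c):
    """Caratheodory in 2D, brute force: return 3 of the points whose triangle
    contains c (one exists whenever c is in the convex hull of pts)."""
    def sign(o, a, b):
        # Orientation of the triple (o, a, b): >0 counterclockwise, <0 clockwise.
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    for tri in combinations(pts, 3):
        # c is inside (or on) the triangle iff it is on the same side
        # of all three edges.
        s = [sign(tri[i], tri[(i + 1) % 3], c) for i in range(3)]
        if all(x >= 0 for x in s) or all(x <= 0 for x in s):
            return tri
    return None   # c was not in the convex hull
```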

Scan/Graham algorithm (Graham scan): to compute the upper part of the convex hull of a set of points in 2D.

Complexity: $O(n \log n)$ (sorting the points) + $O(n)$ (sweeping through) + $O(n)$ (amortized backtracking)
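The scan can be sketched as follows: sort by $x$, sweep left to right, and pop the last hull point while it makes a non-right turn. A minimal version assuming exact (e.g. integer) coordinates; the function name `upper_hull` is made up:

```python
def upper_hull(points):
    """Scan for the upper convex hull in 2D. The O(n log n) sort dominates;
    each point is pushed and popped at most once, so the sweep is O(n) amortized."""
    pts = sorted(set(points))                    # lexicographic: by x, then y
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (ax, ay), (bx, by) = hull[-2], hull[-1]
            # Cross product of (b - a) and (p - a): >= 0 means the turn at b
            # is left or straight, so b cannot be on the upper hull.
            cross = (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax)
            if cross >= 0:
                hull.pop()                       # backtrack (amortized O(1))
            else:
                break
        hull.append(p)
    return hull
```

Running the lower-hull variant as well (flip the sign test, or negate the points) and concatenating gives the full hull.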

Lower bound for CH computation: $Ω(n \log n)$, because if you know how to compute the convex hull, you know how to sort numbers: consider the CH of the points $(x_i, x_i^2)$ on a parabola; the hull visits them in increasing order of $x_i$.

Jarvis algorithm/gift wrapping: in $O(nh)$, where $h$ is the number of hull vertices (so $O(n)$ per hull vertex), but the constant may be big, in which case the Graham scan algorithm may be more advantageous
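Gift wrapping can be sketched as: start from an extreme point and, for each hull vertex, scan all $n$ points for the next one, hence $O(n)$ work per hull vertex. A minimal version assuming distinct points, at least $3$ of them, and exact coordinates; the name `jarvis_hull` is made up:

```python
def jarvis_hull(points):
    """Gift wrapping (Jarvis march) in 2D, counterclockwise order.
    O(n * h) for h hull vertices: one full scan per hull vertex."""
    start = min(points, key=lambda p: (p[1], p[0]))   # lowest, then leftmost
    hull, p = [], start
    while True:
        hull.append(p)
        q = points[0] if points[0] != p else points[1]
        for r in points:
            # Replace the candidate q if r lies strictly to the right of the
            # ray p -> q (or is collinear but farther), i.e. wraps tighter.
            cross = (q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])
            if cross < 0 or (cross == 0 and
                             (r[0]-p[0])**2 + (r[1]-p[1])**2 >
                             (q[0]-p[0])**2 + (q[1]-p[1])**2):
                q = r
        p = q
        if p == start:                                # wrapped all the way around
            return hull
```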

  • In a $d$-dimensional ball: pick $n$ points uniformly at random: how many points land on the CH? ⟶ on the order of $n^{\frac{d-1}{d+1}}$ in expectation

  • Same in a $d$-dimensional cube ⟶ on the order of $\log^{d-1} n$

Divide and conquer: $O(n \log n)$

Quickhull: $O(n^2)$ in the worst case

Timothy Chan's algorithm: $(n + h \frac n m) \log m$ ⟶ with $m = h$: $2n \log h$, i.e. $O(n \log h)$

Linear programming: $O(n)$ in fixed dimension $d$, but warning: the constant is exponential in the dimension
