Exercises: Convex optimization

EX 1: Hahn-Banach theorem

Let $C, D$ be two disjoint convex, compact sets in $ℝ^d$.

Show that there exists a separating hyperplane $H$ between $C$ and $D$.

Tip: start by considering points $x ∈ C$ and $y ∈ D$ s.t. $(x, y) = \mathrm{argmin}_{(x, y) ∈ C × D} \Vert x - y \Vert^2$

(these points exist because of compactness)

We want to show that there exist $w ∈ ℝ^d ∖ \lbrace 0 \rbrace$ and $b ∈ ℝ$ s.t.

\begin{cases} C ⊆ \lbrace x ∈ ℝ^d \mid w^T x ≤ b \rbrace \\ D ⊆ \lbrace x ∈ ℝ^d \mid w^T x ≥ b \rbrace \\ \end{cases}

We’re looking for a hyperplane $H$, given by an affine function $f(\cdot, x, y)$, s.t.

  • $u ∈ H ⟺ f(u, x, y) = 0$
  • $∀u ∈ C, f(u, x, y) > 0$
  • $∀v ∈ D, f(v, x, y) < 0$

Let

H ≝ \lbrace u \mid (x - y)^T(u - a) = 0 \rbrace

where $a ≝ \frac{x + y}{2} ∉ (C ∪ D)$ (if $a$ belonged to $C$, then $\Vert a - y \Vert = \frac 1 2 \Vert x - y \Vert < \Vert x - y \Vert$ would contradict the minimality of $(x, y)$; symmetrically for $D$)

(the hyperplane orthogonal to $x-y$ that goes through $a$)
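This construction can be checked numerically on a toy instance (my own example, not from the exercise): $C$ the unit disk at the origin and $D$ the unit disk centered at $(3, 0)$, whose closest pair is $x = (1, 0)$, $y = (2, 0)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: C = unit disk at the origin, D = unit disk at (3, 0).
# Closest pair: x = (1, 0) in C, y = (2, 0) in D; midpoint a = (x + y) / 2.
x = np.array([1.0, 0.0])
y = np.array([2.0, 0.0])
a = (x + y) / 2
n = x - y  # normal of the hyperplane H = {u : n^T (u - a) = 0}

def sample_disk(center, radius, k):
    """Random samples inside a disk (polar sampling)."""
    r = radius * np.sqrt(rng.uniform(size=k))
    theta = rng.uniform(0, 2 * np.pi, size=k)
    return center + np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

C = sample_disk(np.array([0.0, 0.0]), 1.0, 1000)
D = sample_disk(np.array([3.0, 0.0]), 1.0, 1000)

# (x - y)^T (u - a) is > 0 on C and < 0 on D: H strictly separates the disks.
assert ((C - a) @ n > 0).all()
assert ((D - a) @ n < 0).all()
```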

Let’s show that

\begin{cases} ∀u∈C, (x - y)^T(u - a) > 0 \\ ∀u∈D, (x - y)^T(u - a) < 0 \\ \end{cases}

It suffices to prove the second assertion (as the other one is analogous).

By contradiction, assume that there exists $u^\ast ∈ D$ s.t. $(x - y)^T(u^\ast - \frac{x+y}{2}) ≥ 0$

Consider

f(t) ≝ \Vert x-y - t(u^\ast - y) \Vert^2
f'(t) = -2(u^\ast - y)^T(x - y- t(u^\ast - y))

Let’s prove that $f’(0) = -2(u^\ast - y)^T(x - y) < 0$

(u^\ast - y)^T(x - y) = (x - y)^T(u^\ast - y) = \underbrace{(x-y)^T(u^\ast - a)}_{ ≥ 0 \text{ (assumption)}} + \underbrace{(x-y)^T(a - y)}_{= \frac 1 2 \Vert x - y \Vert^2 \, > \, 0 \text{ since } x ≠ y}

So $f’(0) < 0$, and $f$ is locally strictly decreasing around $0$.

So there exists $t^\ast ∈ (0, 1]$ small enough s.t.

\Vert x - \underbrace{(y + t^\ast(u^\ast - y))}_{∈ D \text{ (convex combination of } y, u^\ast\text{)}} \Vert^2 = f(t^\ast) < f(0) = \Vert x - y \Vert^2

which contradicts the definition of $x$ and $y$.

EX 2: Convexity of usual functions

1. Show that $f(x, y) = \frac{x^2}{y}$ is convex

on a certain set to be indicated.

$f$ is defined on $(ℝ × ℝ^\ast_-) ∪ (ℝ × ℝ^\ast_+)$, which is *not* convex (union of two disjoint convex sets); we study $f$ on the convex set $ℝ × ℝ^\ast_+$.

Hessian(f)(x, y) ≝ \begin{pmatrix} \frac 2 y & - 2\frac{x}{y^2} \\ - 2\frac{x}{y^2} & 2 \frac{x^2}{y^3} \\ \end{pmatrix} ∈ 𝒮_2(ℝ)

Its determinant is $\frac 2 y ⋅ \frac{2x^2}{y^3} - \frac{4x^2}{y^4} = 0$: the Hessian is the rank-one Gram matrix $\frac 2 y \, v v^T$ with $v ≝ (1, -\frac x y)^T$.

So it is positive semidefinite if $y > 0$, and negative semidefinite otherwise (Gram matrix).

So $f$ is convex on $ℝ × ℝ^\ast_+$
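A quick numerical sanity check (an illustration, not a proof) of the Gram-matrix factorization and of positive semidefiniteness on $y > 0$:

```python
import numpy as np

rng = np.random.default_rng(1)

for _ in range(100):
    x = rng.normal()
    y = rng.uniform(0.1, 5.0)  # stay in the convex domain y > 0
    # Hessian of f(x, y) = x^2 / y
    H = np.array([[2 / y, -2 * x / y**2],
                  [-2 * x / y**2, 2 * x**2 / y**3]])
    v = np.array([1.0, -x / y])
    assert np.allclose(H, (2 / y) * np.outer(v, v))  # rank-one Gram form
    assert abs(np.linalg.det(H)) < 1e-9              # determinant is 0
    assert np.linalg.eigvalsh(H).min() >= -1e-12     # positive semidefinite
```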

2.

Let $C⊆ ℝ^d$ a convex set.

Convex indicator $I_C$:
I_C(x) ≝ \begin{cases} 0 \text{ if } x ∈ C \\ +∞ \text{ else} \end{cases}

Prove that it’s a convex function.

Check that

I_C(λx + (1-λ)y) ≤ λ I_C(x) + (1-λ) I_C(y) \quad ∀λ ∈ [0, 1]
  • if $(x, y) ∈ C × C$: $λx + (1-λ)y ∈ C$ (convexity of $C$), so the inequality reads $0 ≤ 0$
  • if $x ∉ C$ or $y ∉ C$: for $λ ∈ (0, 1)$ the right-hand side is $+∞$, so the inequality holds trivially; for $λ ∈ \lbrace 0, 1 \rbrace$ both sides coincide (convention $0 ⋅ (+∞) = 0$)
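The case analysis above can be exercised on a toy convex set, say $C = [0, 1]$ (my own example); note that $λ$ is kept in $(0, 1)$ so that Python's `0.0 * inf = nan` does not get in the way:

```python
from math import inf

# Indicator of the convex set C = [0, 1]
def I_C(x):
    return 0.0 if 0.0 <= x <= 1.0 else inf

pts = [-1.0, 0.0, 0.3, 0.8, 1.0, 2.5]
lambdas = [0.25, 0.5, 0.75]  # open interval (0, 1): avoids 0 * inf = nan
for x in pts:
    for y in pts:
        for lam in lambdas:
            # Convexity inequality: LHS is 0 when both points are in C
            # (their combination stays in C), and the RHS is +inf otherwise.
            assert I_C(lam * x + (1 - lam) * y) <= lam * I_C(x) + (1 - lam) * I_C(y)
```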

3. Show that the sup of convex functions remains convex.

$(f_i)_{i∈I}$ a family of convex functions; let $g ≝ \sup_{i∈I} f_i$.

For all $x, y$ and $λ ∈ [0, 1]$:

g(λx + (1-λ)y) = \sup_{i∈I} f_i(λx + (1-λ)y) ≤ \sup_{i∈I} \big(λ f_i(x) + (1-λ) f_i(y)\big) ≤ λ \, g(x) + (1-λ) \, g(y)

It follows from the facts that:

\sup_i (a_i + b_i) ≤ \sup_i (a_i) + \sup_i (b_i)

(since $a_i + b_i ≤ \sup_i a_i + \sup_i b_i$ for each $i$)

and

\sup_i (λ a_i) = λ \sup_i (a_i) \text{ for } λ ≥ 0

(since $a_i ≤ \sup_i a_i ⟹ λ a_i ≤ λ \sup_i a_i$)
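A numerical illustration (my own toy data): the pointwise max of a few affine functions — each convex — satisfies the convexity inequality at random pairs of points:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(5, 3))  # 5 affine functions on R^3
b = rng.normal(size=5)

def g(x):
    # Pointwise sup (here: max of finitely many affine functions)
    return (A @ x + b).max()

for _ in range(200):
    x, y = rng.normal(size=3), rng.normal(size=3)
    lam = rng.uniform()
    # Convexity inequality, up to floating-point tolerance
    assert g(lam * x + (1 - lam) * y) <= lam * g(x) + (1 - lam) * g(y) + 1e-12
```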

4.

Let $𝒮_n$ be the set of symmetric matrices of $𝔐_n(ℝ)$.

Show that the following function is convex:

f: \begin{cases} 𝒮_n ⟶ ℝ \\ M \mapsto \underbrace{λ_{\max}(M)}_{\text{largest eigenvalue}} \end{cases}

λ_{\max}(S) = \max_{u ∈ ℝ^n ∖ \lbrace 0 \rbrace} \frac{u^T S u}{u^T u} \quad \text{(Rayleigh quotient)}

and

g_u: S ⟼ \frac{u^T S u}{u^T u}

is a convex (indeed linear) function of $S$, for each fixed $u ≠ 0$.

So by 3): $λ_{\max} = \sup_{u ≠ 0} g_u$ is a convex function
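The convexity of $λ_{\max}$ can be checked numerically on random symmetric matrices (an illustration of the result, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(3)

def lam_max(S):
    # eigvalsh returns eigenvalues of a symmetric matrix in ascending order
    return np.linalg.eigvalsh(S)[-1]

def rand_sym(n):
    M = rng.normal(size=(n, n))
    return (M + M.T) / 2  # symmetrize

for _ in range(100):
    A, B = rand_sym(4), rand_sym(4)
    t = rng.uniform()
    # Convexity: lam_max(t A + (1-t) B) <= t lam_max(A) + (1-t) lam_max(B)
    assert lam_max(t * A + (1 - t) * B) <= t * lam_max(A) + (1 - t) * lam_max(B) + 1e-10
```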

EX 3: Duality

1.

Let $A ∈ ℝ^{m × n}$, $b ∈ ℝ^m$, $C ∈ ℝ^n$.

Consider the linear program (LP)

\min_{x ∈ ℝ^n} \underbrace{C^T x}_{f(x)}

s.t.

  • $Ax = b$
  • $x ≥ 0 ⟺ - x ≤ 0$

Write the dual LP.

Lagrangian:
ℒ(x, λ, μ) ≝ f(x) + λ^T (Ax - b) - μ^T x
\underbrace{\inf_{x ∈ D} \sup_{(λ, μ) ∈ ℝ^m × ℝ^n_+} ℒ(x, λ, μ)}_{\text{primal problem}} ≥ \underbrace{\sup_{(λ, μ) ∈ ℝ^m × ℝ^n_+} \inf_{x ∈ D} ℒ(x, λ, μ)}_{\text{dual problem}}

Dual problem:

\sup_{(λ, μ) ∈ ℝ^m × ℝ^n_+} \inf_{x ∈ D} \underbrace{C^T x + λ^T (Ax - b) - μ^T x}_{= \; x^T(C + A^T λ - μ) - λ^T b}
g(λ, μ) = \inf_x \; x^T(C + A^T λ - μ) - λ^T b = \begin{cases} -∞ \text{ if } C + A^T λ - μ ≠ 0 \\ -λ^T b \text{ else} \end{cases}

Dual problem:

\max_{λ, μ} g(λ, μ)
\max_{λ, μ} \; - λ^T b \;\text{ s.t. }\; C - μ + A^T λ = 0, \; μ ≥ 0 \quad ⟺ \quad \max_{λ} \; - λ^T b \;\text{ s.t. }\; C + A^T λ ≥ 0
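The primal/dual pair can be checked on a toy LP (my own data) with `scipy.optimize.linprog`; for feasible bounded LPs the two optimal values coincide (strong duality):

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP: min C^T x  s.t.  A x = b, x >= 0
C = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

primal = linprog(C, A_eq=A, b_eq=b, bounds=[(0, None)] * 2)

# Dual derived above:  max_λ  -b^T λ   s.t.  C + A^T λ >= 0,
# rewritten for linprog: min_λ  b^T λ  s.t.  -A^T λ <= C,  λ free.
dual = linprog(b, A_ub=-A.T, b_ub=C, bounds=[(None, None)])

assert primal.success and dual.success
# Strong duality: primal optimum = -(dual linprog optimum)
assert np.isclose(primal.fun, -dual.fun)
```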
