
Sparse Optimization

Wing-Kin (Ken) Ma

Department of Electronic Engineering, The Chinese University of Hong Kong, Hong Kong

ELEG5481, Lecture 13

Acknowledgment: Thanks to Jiaxian Pan and Xiao Fu for helping prepare the slides.


Introduction

• In signal processing, many problems involve finding a sparse solution.

– compressive sensing
– signal separation
– recommendation systems
– direction-of-arrival estimation
– robust face recognition
– background extraction
– text mining
– hyperspectral imaging
– MRI
– ...


Single Measurement Vector Problem

• Sparse single measurement vector (SMV) problem: given an observation vector y ∈ R^m and a matrix A ∈ R^{m×n}, find x ∈ R^n such that

y = Ax,

and x is sparsest, i.e., x has the fewest nonzero entries.

• We assume that m ≪ n, i.e., the number of observations is much smaller than the dimension of the source signal.


[Figure: the SMV model y = Ax: a few measurements y, a wide matrix A, and a sparse signal x with few nonzero entries.]


Sparse solution recovery

• We try to recover the sparsest solution by solving

min_x ‖x‖_0
s.t. y = Ax,

where ‖x‖_0 is the number of nonzero entries of x.

• In the literature, ‖x‖_0 is commonly called the “ℓ0-norm”, though it is not a norm.


Solving the SMV Problem

• The SMV problem is NP-hard in general.

• An exhaustive search method:

– Fix the support of x ∈ R^n, i.e., decide which entries of x are zero and which are nonzero.
– Check whether y = Ax has a solution for the corresponding x.
– By going through all 2^n possible supports, an optimal solution can be found.

• A better way is to use the branch-and-bound method, but it is still very time-consuming.

• It is natural to seek approximate solutions.
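To make the exhaustive search concrete, here is a minimal sketch (our own illustration, not the lecture's code; it is only feasible for tiny n, since all 2^n supports are enumerated):

```python
# A toy sketch of the exhaustive l0 search (our own illustration; only
# feasible for tiny n, since all 2^n supports are enumerated).
import itertools
import numpy as np

def l0_exhaustive(A, y, tol=1e-9):
    """Return a sparsest x with y = A x by enumerating supports of growing size."""
    m, n = A.shape
    for k in range(n + 1):
        for S in itertools.combinations(range(n), k):
            x = np.zeros(n)
            if k > 0:
                coef, *_ = np.linalg.lstsq(A[:, list(S)], y, rcond=None)
                x[list(S)] = coef
            if np.linalg.norm(A @ x - y) <= tol:
                return x      # the first feasible support found is a sparsest one
    return None

A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
y = np.array([2., 2.])        # y = A @ [0, 0, 2]
x_hat = l0_exhaustive(A, y)
```

Since the supports are visited in order of growing size, the first feasible one encountered has minimum ‖x‖_0.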


Greedy Pursuit

• Greedy pursuit generates an approximate solution to the SMV problem by recursively building an estimate x̂.

• At each iteration, greedy pursuit performs two essential operations:

– Element selection: determine the support I of x (i.e., which elements are nonzero).
– Coefficient update: update the coefficients x_i for i ∈ I.


Orthogonal Matching Pursuit (OMP)

• One of the oldest and simplest greedy pursuit algorithms is orthogonal matching pursuit (OMP).

• First, initialize the support I(0) = ∅ and the estimate x(0) = 0.

• For k = 1, 2, . . . do

– Element selection: determine an index j⋆ and add it to I(k−1):

r(k) = y − Ax(k−1)   (compute the residual r(k))

j⋆ = arg min_{j=1,...,n} min_x ‖r(k) − a_j x‖_2   (find the column that reduces the residual most)

I(k) = I(k−1) ∪ {j⋆}   (add j⋆ to the support)

– Coefficient update: with support I(k), minimize the estimation residual,

x(k) = arg min_{x: x_i = 0, i ∉ I(k)} ‖y − Ax‖_2.
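The two OMP steps can be sketched in a few lines of NumPy (our own illustration, not the lecture's code; for a well-conditioned random A and a sufficiently sparse signal, the true support is typically recovered):

```python
# A minimal OMP sketch in NumPy (our own illustration).
import numpy as np

def omp(A, y, k):
    """Greedily build a k-sparse estimate of x from y = A x."""
    n = A.shape[1]
    x = np.zeros(n)
    support = []
    for _ in range(k):
        r = y - A @ x                                   # current residual
        # arg min_j min_c ||r - a_j c||_2  ==  arg max_j |a_j^T r| / ||a_j||_2
        scores = np.abs(A.T @ r) / np.linalg.norm(A, axis=0)
        j = int(np.argmax(scores))
        if j not in support:
            support.append(j)
        # coefficient update: least squares restricted to the support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
    return x, sorted(support)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 17]] = [2.0, -1.5]
x_hat, support = omp(A, y=A @ x_true, k=2)
```

The element-selection rule uses the identity min_c ‖r − a_j c‖_2^2 = ‖r‖_2^2 − (a_j^T r)^2 / ‖a_j‖_2^2, so minimizing the residual over j is the same as maximizing the normalized correlation.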


ℓ1-norm heuristics

• Another method is to approximate the nonconvex ‖x‖_0 by a convex function:

min_x ‖x‖_1
s.t. y = Ax.

• The above problem is also known as basis pursuit in the literature.

• This problem is convex (an LP actually).
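Because the problem is an LP, it can be handed to a generic LP solver; a sketch using scipy.optimize.linprog (the variable splitting −t ≤ x ≤ t is standard; the data below are synthetic):

```python
# Basis pursuit as an LP (a sketch using scipy.optimize.linprog).
# min ||x||_1 s.t. y = Ax  is rewritten as  min 1^T t  s.t.  -t <= x <= t, Ax = y.
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize the sum of t
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0 and -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # A x = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[5, 20, 41]] = [1.0, -2.0, 1.5]
x_hat = basis_pursuit(A, A @ x_true)
```

For a random Gaussian A with m = 30 measurements, a 3-sparse signal is well inside the regime where ℓ1 minimization recovers the sparsest solution.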


Interpretation as convex relaxation

• Let us start with the original formulation (with a bound on x):

min_x ‖x‖_0
s.t. y = Ax, ‖x‖_∞ ≤ R.

• The above problem can be rewritten as a mixed Boolean convex problem:

min_{x,z} 1^T z
s.t. y = Ax,
|x_i| ≤ R z_i, i = 1, . . . , n,
z_i ∈ {0, 1}, i = 1, . . . , n.


• Relax z_i ∈ {0, 1} to z_i ∈ [0, 1] to obtain

min_{x,z} 1^T z
s.t. y = Ax,
|x_i| ≤ R z_i, i = 1, . . . , n,
0 ≤ z_i ≤ 1, i = 1, . . . , n.

• Observing that z_i = |x_i|/R at the optimum, the problem above is equivalent to

min_x ‖x‖_1 / R
s.t. y = Ax,

which is the ℓ1-norm heuristic.

• The optimal value of the above problem is a lower bound on that of the original problem.


Interpretation via convex envelope

• Given a function f with domain C, the convex envelope f_env is the largest possible convex underestimate of f over C, i.e.,

f_env(x) = sup{ g(x) | g is convex, g(x′) ≤ f(x′) for all x′ ∈ C }.

• When x is a scalar, |x| is the convex envelope of ‖x‖_0 on [−1, 1].

• When x is a vector, ‖x‖_1 / R is the convex envelope of ‖x‖_0 on C = {x | ‖x‖_∞ ≤ R}.


ℓ1-norm geometry

[Figures (A) and (B): the ℓ1 ball and the ℓ1 recovery geometry in R^2.]

• Fig. A shows the ℓ1 ball of some radius r in R^2. Note that the ℓ1 ball is “pointy” along the axes.

• Fig. B shows the ℓ1 recovery problem. The point x is a “sparse” vector; the line H is the set of x that shares the same measurement y.


ℓ1-norm geometry

• The ℓ1 recovery problem is to pick out a point in H that has the minimum ℓ1 norm. We can see that x is such a point.

• Fig. C shows the geometry when the ℓ2 norm is used instead of the ℓ1 norm. We can see that the solution x may not be sparse.


Recovery guarantee of ℓ1-norm minimization

• When is ℓ1-norm minimization equivalent to ℓ0-norm minimization?

• Sufficient conditions are provided by characterizing the structure of A and the sparsity of the desired x.

– Example: let

μ(A) = max_{i≠j} |a_i^T a_j| / (‖a_i‖_2 ‖a_j‖_2),

which is called the mutual coherence. If there exists an x such that y = Ax and

μ(A) ≤ 1 / (2‖x‖_0 − 1),

then x is the unique solution of ℓ1-norm minimization. It is also the solution of the corresponding ℓ0-norm minimization.

– This mutual coherence condition means that a sparser x and a “more orthonormal” A give a better chance of perfect recovery by ℓ1-norm minimization.

• Other conditions: the restricted isometry property (RIP), the null space property, ...


Variations of ℓ1-norm minimization

There are several other variations.

• Basis pursuit denoising:

min ‖x‖_1
s.t. ‖y − Ax‖_2 ≤ ε.

• Penalized least squares:

min ‖Ax − b‖_2^2 + λ‖x‖_1.

• LASSO problem:

min ‖Ax − b‖_2^2
s.t. ‖x‖_1 ≤ τ.
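The penalized least-squares form is often solved with accelerated proximal-gradient (FISTA-type) iterations: a gradient step on the quadratic term followed by entrywise soft-thresholding. A minimal NumPy sketch (our own illustration; λ and the data are arbitrary choices):

```python
# FISTA sketch for min_x ||Ax - b||_2^2 + lam * ||x||_1 :
# a gradient step on the smooth term, then soft-thresholding (prox of lam*||.||_1).
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=3000):
    L = 2.0 * np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ z - b)
        x_new = soft_threshold(z - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[5, 20, 41]] = [2.0, -2.0, 1.5]
x_hat = fista(A, A @ x_true, lam=1.0)
```

With noise-free data and a moderate λ, the solution is close to the true sparse signal up to a small shrinkage bias on the nonzero entries.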


Application: Sparse signal reconstruction

• Sparse signal x ∈ Rn with n = 2000 and ‖x‖0 = 50.

• m = 400 noise-free observations of y = Ax, where Aij ∼ N (0, 1).

[Figure: the sparse source signal (left) and its perfect recovery by ℓ1-norm minimization (right).]


[Figure: the sparse source signal (left) and the estimate by ℓ2-norm minimization (right), which is not sparse.]


• Sparse signal x ∈ Rn with n = 2000 and ‖x‖0 = 50.

• m = 400 noisy observations of y = Ax + ν, where A_ij ∼ N(0, 1) and ν_i ∼ N(0, δ²).

• Basis pursuit denoising is used.

• δ² = 100 and ε = √(mδ²).

[Figure: the sparse source signal (left) and the estimate by ℓ1-norm minimization (right).]


[Figure: the sparse source signal (left) and the estimate by ℓ2-norm minimization (right).]


Application: Compressive sensing (CS)

• Consider a signal x ∈ R^n that has a sparse representation x̄ ∈ R^n in the domain of Ψ ∈ R^{n×n} (e.g., FFT and wavelet), i.e.,

x = Ψx̄,

where x̄ is sparse.

[Figure: the pirate image x and its wavelet transform x̄.]


• To acquire information about the signal x, we use a sensing matrix Φ ∈ R^{m×n} to observe x:

y = Φx = ΦΨx̄.

Here we have m ≪ n, i.e., we obtain only very few observations compared to the dimension of x.

• Such a y is good for compression, transmission, and storage.

• x is recovered by recovering x̄:

min ‖x̄‖_0
s.t. y = Ax̄,

where A = ΦΨ.


Application: Total Variation-based Denoising

• Scenario:

– We want to estimate x ∈ R^n from a noisy measurement xcor = x + ν.

– x is known to be piecewise linear, i.e., for most i we have

x_i − x_{i−1} = x_{i+1} − x_i  ⟺  −x_{i−1} + 2x_i − x_{i+1} = 0.

– Equivalently, Dx is sparse, where

D = [ −1  2 −1  0  · · ·
       0 −1  2 −1  · · ·
       ⋮
       · · ·  −1  2 −1 ].

• Problem formulation: x̂ = arg min_x ‖xcor − x‖_2 + λ‖Dx‖_0.

• Heuristic: change ‖Dx‖0 to ‖Dx‖1.
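To make the sparsity of Dx concrete, a short NumPy check (our own construction) builds the second-difference matrix and applies it to a piecewise-linear signal with a single kink:

```python
# Build the second-difference matrix D and verify that D x is sparse for a
# piecewise-linear signal: only the kink location produces a nonzero entry.
import numpy as np

def second_difference_matrix(n):
    """(n-2) x n matrix with the stencil [-1, 2, -1] on each row."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [-1.0, 2.0, -1.0]
    return D

# piecewise-linear signal: slope +1 up to index 49, slope -1 afterwards
n = 100
t = np.arange(n, dtype=float)
x = np.where(t < 50, t, 100.0 - t)
D = second_difference_matrix(n)
z = D @ x                       # nonzero only at the row covering the kink
```

Every row whose three-sample window lies in a linear region evaluates to zero, so ‖Dx‖_0 counts the kinks of x.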


[Figure: the original source x (top) and the noise-corrupted xcor (bottom).]


[Figure: denoised signals x̂ with λ = 0.1, 1, and 10, obtained by x̂ = arg min_x ‖xcor − x‖_2 + λ‖Dx‖_1.]


[Figure: denoised signals x̂ with λ = 0.1, 1, and 10, obtained by x̂ = arg min_x ‖xcor − x‖_2 + λ‖Dx‖_2.]


Matrix Sparsity

The notion of sparsity for a matrix X can have several different meanings.

• Element-wise sparsity: ‖vec(X)‖0 is small.

• Row sparsity: X only has a few nonzero rows.

• Rank sparsity: rank(X) is small.


Row sparsity

• Let X = [x1, . . . , xp]. Row sparsity means that the columns x_i share a common support.

[Figure: the MMV model Y = AX: measurements Y, the matrix A, and a row-sparse signal X with only a few nonzero rows.]


Row sparsity

• Multiple measurement vector (MMV) problem:

min_X ‖X‖_row-0
s.t. Y = AX,

where ‖X‖_row-0 denotes the number of nonzero rows.

• Empirically, MMV works (much) better than SMV in many applications.


• Mixed-norm relaxation approach:

min_X ‖X‖_{q,p}^p
s.t. Y = AX,

where ‖X‖_{q,p} = ( Σ_{i=1}^n ‖x_i‖_q^p )^{1/p} and x_i denotes the ith row of X.

• For q ∈ [1, ∞] and p = 1, this is a convex problem.

• For (p, q) = (1, 2), the problem can be formulated as an SOCP:

min_{t,X} Σ_{i=1}^n t_i
s.t. Y = AX,
‖x_i‖_2 ≤ t_i, i = 1, . . . , n.
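The mixed norm itself is a one-liner; a small check (our own illustration) on a row-sparse matrix:

```python
# Mixed norm ||X||_{q,p} = ( sum_i ||x_i||_q^p )^(1/p) over the rows x_i of X.
import numpy as np

def mixed_norm(X, q, p):
    row_norms = np.linalg.norm(X, ord=q, axis=1)   # ||x_i||_q for each row
    return float(np.sum(row_norms ** p) ** (1.0 / p))

X = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [0.0, 5.0]])
val = mixed_norm(X, q=2, p=1)   # row 2-norms are 5, 0, 5
```

With p = 1 the row norms are summed, so a zero row contributes nothing: exactly the group-level analogue of the ℓ1 norm that promotes row sparsity.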


• Some variations:

min ‖X‖_{2,1}
s.t. ‖Y − AX‖_F ≤ ε.

min ‖AX − Y‖_F^2 + λ‖X‖_{2,1}.

min ‖AX − Y‖_F^2
s.t. ‖X‖_{2,1} ≤ τ.

• Other algorithms: simultaneous orthogonal matching pursuit (SOMP), compressive multiple signal classification (compressive MUSIC), nonconvex mixed-norm approaches (choosing 0 < p < 1), ...


Application: Direction-of-Arrival (DOA) estimation

[Figure: a uniform linear array of m sensors with spacing d, receiving source signals S_{1,t}, . . . , S_{k,t} from angles θ1, . . . , θk; the sensor outputs are Y_{1,t}, . . . , Y_{m,t}.]


Application: Direction-of-Arrival (DOA) estimation

• Considering t = 1, . . . , p, the signal model is

Y = A(θ)S + N,

where

A(θ) = [ 1                          · · ·  1
         e^{−j2πd sin(θ1)/γ}        · · ·  e^{−j2πd sin(θk)/γ}
         ⋮                                 ⋮
         e^{−j2π(m−1)d sin(θ1)/γ}   · · ·  e^{−j2π(m−1)d sin(θk)/γ} ],

Y ∈ C^{m×p} are the received signals, S ∈ C^{k×p} the sources, N ∈ C^{m×p} the noise, m and k the numbers of receivers and sources, d the sensor spacing, and γ the wavelength.

• Objective: estimate θ = [θ1, . . . , θk]^T, where θi ∈ [−90°, 90°] for i = 1, . . . , k.


• Construct

A = [ a(−90°), a(−89°), a(−88°), . . . , a(88°), a(89°), a(90°) ],

where a(θ) = [ 1, e^{−j2πd sin(θ)/γ}, . . . , e^{−j2π(m−1)d sin(θ)/γ} ]^T.

• By this construction, A(θ) = [ a(θ1), . . . , a(θk) ] is approximately a submatrix of A.

• DOA estimation is then approximately equivalent to finding the columns of A(θ) in A.

• Discretizing [−90°, 90°] into a denser grid may increase the estimation accuracy, but requires more computational resources.
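The gridding step can be sketched as follows (a NumPy illustration; the spacing-to-wavelength ratio d/γ = 0.5 is our assumption, not the lecture's):

```python
# Build the over-complete steering matrix A on a 0.5-degree angle grid.
import numpy as np

def steering_vector(theta_deg, m, d_over_gamma=0.5):
    """a(theta) for a uniform linear array with m sensors."""
    k = np.arange(m)
    return np.exp(-2j * np.pi * d_over_gamma * k * np.sin(np.deg2rad(theta_deg)))

grid = np.arange(-90.0, 90.0 + 0.5, 0.5)   # -90, -89.5, ..., 90
A = np.column_stack([steering_vector(th, m=8) for th in grid])
```

Each column is a unit-modulus complex exponential, and a 0.5° grid on [−90°, 90°] yields 361 columns.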


• Example: k = 3, θ = [−90°, −88°, 88°].

[Figure: Y = AX, where only the rows of X aligned with the columns a(−90°), a(−88°), a(88°) are nonzero; these rows equal [S_{1,1}, . . . , S_{1,p}], [S_{2,1}, . . . , S_{2,p}], and [S_{3,1}, . . . , S_{3,p}].]

• Locating the “active columns” of A is equivalent to finding a row-sparse X.

• Problem formulation:

min_X ‖Y − AX‖_F^2 + λ‖X‖_{2,1}.


Simulation: k = 3, p = 100, n = 8, and SNR = 30 dB; the three sources come from −65°, −20°, and 42°, respectively; A = [ a(−90°), a(−89.5°), . . . , a(90°) ] ∈ C^{8×361}.

[Figure: left, the 2-norms of the rows of the estimated X versus angle, peaking at the true angles θ = [−65°, −20°, 42°]^T; right, the values of |X|.]


Application: Library-based Hyperspectral Image Separation


• Consider a hyperspectral image (HSI) captured by a remote sensor (satellite, aircraft, etc.).

• Each pixel of an HSI is an m-dimensional vector, corresponding to spectral information over m bands.

• The spectral shape can be used for classifying materials on the ground.

• During image capture, the spectra of different materials may be mixed within pixels.


• Signal model:

Y = BS + N,

where Y ∈ R^{m×p} is the HSI with p pixels, B = [b1, . . . , bk] ∈ R^{m×k} contains the spectra of the materials, S ∈ R_+^{k×p} with S_{i,j} representing the amount of material i in pixel j, and N is the noise.

• To know what materials are in the pixels, we need to estimate B and S.


• There are spectral libraries providing spectra of more than a thousand materials.

[Figure: reflectance (%) versus spectral bands for minerals in the U.S.G.S. library: Alunite, Ammonium Smectite, Andradite, Buddingtonite, Chalcedony, Desert Varnish, Dumor, Kaolinite 1 and 2, Montmorillonite 1 and 2, Muscovite, Nontronite 1 through 4, Paragonite, and Pyrope.]

Some recorded spectra of minerals in the U.S.G.S. library.

• In many cases, an HSI pixel can be considered as a mixture of 3 to 5 spectra in a known library, which records hundreds of spectra.


• Example: suppose that B = [b1, b2, b3] is a submatrix of a known library A. Again, we have

[Figure: Y = AX, where only the rows of X aligned with the library columns b1, b2, b3 are nonzero; these rows equal [S_{1,1}, . . . , S_{1,p}], [S_{2,1}, . . . , S_{2,p}], and [S_{3,1}, . . . , S_{3,p}].]

• Estimation of B and S can be done by finding the row-sparse X.

• Problem formulation:

min_{X≥0} ‖Y − AX‖_F^2 + λ‖X‖_{2,1},

where the non-negativity of X is added for physical consistency (the elements of S represent amounts of materials in a pixel).


Simulation: we employ the pruned U.S. Geological Survey (U.S.G.S.) library with n = 342 spectral vectors; each spectral vector has m = 224 elements; the synthetic HSI consists of k = 4 materials selected from the same library; the number of pixels is p = 1000; SNR = 40 dB.

[Figure: left, the 2-norms of the rows of the estimated X versus library index, peaking at the ground-truth material indices [8, 56, 67, 258]; right, the values of the elements of the estimated |X|.]


Rank sparsity

• Rank minimization problem:

min_X rank(X)
s.t. A(X) = Y,

where A is a linear operator (i.e., A(X) = Y can be written as Ā vec(X) = vec(Y) for some matrix Ā).

• When X is restricted to be diagonal, rank(X) = ‖diag(X)‖_0 and the rank minimization problem reduces to the SMV problem.

• Therefore, the rank minimization problem is more general (and more difficult) than the SMV problem.


• The nuclear norm ‖X‖_* is defined as the sum of the singular values, i.e.,

‖X‖_* = Σ_{i=1}^r σ_i,

where σ1, . . . , σr are the singular values of X.

• The nuclear norm is the convex envelope of the rank function on the convex set {X | ‖X‖_2 ≤ 1}.

• This motivates us to use the nuclear norm to approximate the rank function:

min_X ‖X‖_*
s.t. A(X) = Y.

• Perfect recovery is guaranteed if certain properties hold for A.
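In practice the nuclear norm is evaluated directly from the singular values; a small NumPy check (our own illustration) on a rank-one matrix, for which ‖uv^T‖_* = ‖u‖_2 ‖v‖_2:

```python
# The nuclear norm is the sum of singular values, computable via the SVD.
import numpy as np

def nuclear_norm(X):
    return float(np.linalg.svd(X, compute_uv=False).sum())

u = np.array([1.0, 2.0, 2.0])   # ||u||_2 = 3
v = np.array([3.0, 4.0])        # ||v||_2 = 5
X = np.outer(u, v)              # rank-1, single singular value 15
val = nuclear_norm(X)
```

For a rank-one matrix the only nonzero singular value is ‖u‖_2 ‖v‖_2, so the nuclear norm equals 15 here.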


• It can be shown that the nuclear norm ‖X‖_* can be computed by an SDP:

‖X‖_* = min_{Z1,Z2} (1/2) tr(Z1 + Z2)
s.t. [ Z1   X
       X^T  Z2 ] ⪰ 0.

• Therefore, the nuclear norm approximation can be turned into an SDP:

min_{X,Z1,Z2} (1/2) tr(Z1 + Z2)
s.t. Y = A(X),
[ Z1   X
  X^T  Z2 ] ⪰ 0.


Application: Matrix Completion Problem

• Recommendation system: recommend new movies to users based on their previous preferences.

• Consider a preference matrix Y with y_ij representing how much user i likes movie j.

• Some y_ij are unknown, since no one watches all movies.

• We would like to predict how users will like new movies.

• Y is assumed to be of low rank, as research shows that only a few factors affect users' preferences.

        movies
Y = [ 2 3 1 ? ? 5 5
      1 ? 4 2 ? ? ?
      ? 3 1 ? 2 2 2    users
      ? ? ? 3 ? 1 5
      2 ? 4 ? ? 5 3 ]


• Low-rank matrix completion:

min rank(X)
s.t. x_ij = y_ij for (i, j) ∈ Ω,

where Ω is the set of observed entries.

• Nuclear norm approximation:

min ‖X‖_*
s.t. x_ij = y_ij for (i, j) ∈ Ω.
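At scale, the nuclear-norm problem is usually attacked with iterative singular-value shrinkage rather than a generic SDP solver. The sketch below is a SoftImpute-style proximal-gradient heuristic (our own illustration, not the lecture's algorithm; the shrinkage parameter is an arbitrary choice):

```python
# SoftImpute-style sketch: fill the missing entries with the current estimate,
# then shrink the singular values (the prox operator of the nuclear norm).
import numpy as np

def soft_impute(Y, mask, tau=0.1, n_iter=500):
    X = np.zeros_like(Y)
    for _ in range(n_iter):
        Z = np.where(mask, Y, X)                       # impute missing entries
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt        # singular-value shrinkage
    return X

rng = np.random.default_rng(2)
Y_full = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))  # rank 2
mask = rng.random(Y_full.shape) < 0.8                  # observe ~80% of entries
X_hat = soft_impute(Y_full, mask)
rel_err = np.linalg.norm(X_hat - Y_full) / np.linalg.norm(Y_full)
```

The iteration is exactly proximal gradient on (1/2)‖P_Ω(X − Y)‖_F^2 + τ‖X‖_* with unit step, so it is guaranteed to converge; with a genuinely low-rank matrix and enough observed entries, the missing entries are filled in accurately.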


Low-rank matrix + element-wise sparse corruption

• Consider the signal model

Y = X + E,

where Y is the observation, X a low-rank signal, and E a sparse corruption with |E_ij| possibly arbitrarily large.

[Figure: a low-rank matrix plus a sparse corruption matrix with a few arbitrarily large entries.]

• The objective is to separate X from E given Y.


• Simultaneous rank and element-wise sparse recovery:

min rank(X) + γ‖vec(E)‖_0
s.t. Y = X + E,

where γ ≥ 0 balances rank sparsity and element-wise sparsity.

• Replacing rank(X) by ‖X‖_* and ‖vec(E)‖_0 by ‖vec(E)‖_1, we have a convex problem:

min ‖X‖_* + γ‖vec(E)‖_1
s.t. Y = X + E.

• A theoretical result indicates that when X is of low rank and E is sparse enough, exact recovery happens with very high probability.
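The convex problem above can be solved with a standard ADMM splitting, alternating singular-value thresholding for X and soft-thresholding for E. The sketch below is our own illustration; the parameter choices (γ = 1/√max(m, n) and the μ formula) follow common practice and are assumptions, not the lecture's prescription:

```python
# ADMM sketch for  min ||X||_* + gamma*||vec(E)||_1  s.t.  Y = X + E.
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: prox of tau*||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Entrywise soft-thresholding: prox of tau*||vec(.)||_1."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca(Y, gamma=None, n_iter=500):
    m, n = Y.shape
    if gamma is None:
        gamma = 1.0 / np.sqrt(max(m, n))       # a common default choice
    mu = m * n / (4.0 * np.abs(Y).sum())       # a common penalty choice
    X = np.zeros_like(Y); E = np.zeros_like(Y); Z = np.zeros_like(Y)
    for _ in range(n_iter):
        X = svt(Y - E + Z / mu, 1.0 / mu)
        E = soft(Y - X + Z / mu, gamma / mu)
        Z = Z + mu * (Y - X - E)               # dual update for Y = X + E
    return X, E

rng = np.random.default_rng(3)
L0 = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))   # low rank
S0 = np.zeros((30, 30))
idx = rng.random((30, 30)) < 0.05                                  # ~5% corrupted
S0[idx] = 10.0 * rng.standard_normal(idx.sum())                    # large entries
X_hat, E_hat = rpca(L0 + S0)
```

With a rank-2 component and 5% corruption, this regime is well inside the range where the convex program separates the two parts accurately.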


Application: Background extraction

• Suppose that we are given a video sequence of frames F_i, i = 1, . . . , p.

• Our objective is to extract the background of the video sequence.

• The background is of low rank, as the background is static within a short period of time.

• The foreground is sparse, as activities in the foreground occupy only a small fraction of the scene.


• Stacking the frames as Y = [vec(F1), . . . , vec(Fp)], we have

Y = X + E,

where X represents the low-rank background and E the sparse foreground.

• Nuclear norm and ℓ1-norm approximation:

min ‖X‖_* + γ‖vec(E)‖_1
s.t. Y = X + E.


• 500 images, image size 160 × 128, γ = 1/√(160 × 128).

• Row 1: the original video sequences.

• Row 2: the extracted low-rank background.

• Row 3: the extracted sparse foreground.


Low-rank matrix + sparse corruption + dense noise

• A more general model:

Y = A(X + E) + V,

where X is low-rank, E a sparse corruption, V dense but small noise, and A(·) a linear operator.

• Simultaneous rank and element-wise sparse recovery with denoising:

min_{X,E,V} rank(X) + γ‖vec(E)‖_0 + λ‖V‖_F
s.t. Y = A(X + E) + V.

• Convex approximation:

min_{X,E,V} ‖X‖_* + γ‖vec(E)‖_1 + λ‖V‖_F
s.t. Y = A(X + E) + V.

• A final remark: in sparse optimization, the problem dimension is usually very large. You will probably need fast custom-made algorithms instead of relying on CVX.


References

S. Boyd, “ℓ1-norm methods for convex-cardinality problems”, lecture notes of EE364b, Stanford University.

J. Romberg, “Imaging via compressive sampling”, IEEE Signal Processing Magazine, 2008.

Y. Ma, A. Yang, and J. Wright, “Sparse representation and low-rank representation”, tutorial at the European Conference on Computer Vision, 2012.

Y. C. Eldar and G. Kutyniok, Compressed Sensing: Theory and Applications, Cambridge University Press, 2012.

J. Chen and X. Huo, “Theoretical results on sparse representations of multiple-measurement vectors”, IEEE Trans. Signal Process., 2006.

D. Malioutov, M. Cetin, and A. Willsky, “A sparse signal reconstruction perspective for source localization with sensor arrays”, IEEE Trans. Signal Process., 2005.

M.-D. Iordache, J. Bioucas-Dias, and A. Plaza, “Collaborative sparse regression for hyperspectral unmixing”, IEEE Trans. Geosci. Remote Sens., 2013.
