
Page 1

Sensing via Dimensionality Reduction: Structured Sparsity Models

Volkan Cevher, volkan@rice.edu

Page 2

Sensors

[figure: example sensors — 160 MP, 200,000 fps, 192,000 Hz; 1977: 5 hours → 2009: real time]

Page 3

Digital Data Acquisition

Foundation: Shannon/Nyquist sampling theorem

[figures: sampling in time and space]

“if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data”

Page 4

Major Trends in Sensing

higher resolution / denser sampling

large numbers of sensors

increasing # of modalities / mobility

Page 5

Major Trends in Sensing

Motivation: solve bigger / more important problems

decrease acquisition times / costs

entertainment…

Page 6

Problems of the Current Paradigm

• Sampling at Nyquist rate

– expensive / difficult

• Data deluge

– communications / storage

• Sample then compress

– not future proof

Page 7

Approaches

• Do nothing / ignore: be content with where we are…

– generalizes well

– robust

Page 8

Approaches

• Finite Rate of Innovation

Sketching / Streaming

Compressive Sensing

[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

Page 9

Approaches

• Finite Rate of Innovation

Sketching / Streaming

Compressive Sensing

[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

SPARSITY

Page 10

Agenda

• A short review of compressive sensing

• Beyond sparsity

– Structure incorporated…

• Source localization application

• Conclusions

Page 11

Compressive Sensing 101

• Goal: recover a sparse or compressible signal $x \in \mathbb{R}^N$ from measurements $y = \Phi x$, where $\Phi$ is an $M \times N$ random projection with $M < N$

• Problem: the random projection $\Phi$ is not full rank

• Solution: exploit the sparsity/compressibility geometry of the acquired signal

Page 12

• Goal: recover a sparse or compressible signal from measurements

• Problem: the random projection is not full rank, but it satisfies the Restricted Isometry Property (RIP)

• Solution: exploit the sparsity/compressibility geometry of the acquired signal

– iid Gaussian
– iid Bernoulli
– …

Compressive Sensing 101
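To ground the notation in the two slides above, here is a minimal numpy sketch of the sensing model; the sizes N, K, and M below are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 1024, 20, 240            # ambient dimension, sparsity, measurements

# K-sparse signal: only K of the N coordinates are nonzero
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Random projection with iid Gaussian entries (iid Bernoulli +-1 also works);
# the 1/sqrt(M) scaling makes Phi approximately norm-preserving on sparse x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

y = Phi @ x                        # M linear measurements, M << N
```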

Page 13

• Goal: recover a sparse or compressible signal from measurements

• Problem: the random projection is not full rank

• Solution: exploit the model geometry of the acquired signal

Compressive Sensing 101

Page 14

• Sparse signal: only K out of N coordinates nonzero

– model: union of K-dimensional subspaces aligned with the coordinate axes

Concise Signal Structure

[figure: coefficient magnitude vs. sorted index]

Page 15

• Sparse signal: only K out of N coordinates nonzero

– model: union of K-dimensional subspaces

• Compressible signal: sorted coordinates decay rapidly to zero

– model: weak $\ell_p$ ball

Concise Signal Structure

[figure: sorted coefficient magnitudes exhibit power-law decay]

Page 16

• Sparse signal: only K out of N coordinates nonzero

– model: union of K-dimensional subspaces

• Compressible signal: sorted coordinates decay rapidly to zero

– well approximated by a K-sparse signal (simply by thresholding)

[figure: coefficient magnitude vs. sorted index]

Concise Signal Structure
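The thresholding claim is easy to verify numerically; a small sketch, assuming a power-law decay exponent of 1.2 purely for illustration:

```python
import numpy as np

def best_k_term(x, K):
    """Best K-term approximation: keep the K largest-magnitude entries."""
    xk = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-K:]
    xk[keep] = x[keep]
    return xk

rng = np.random.default_rng(1)
N = 1024
# compressible signal: sorted coefficients decay like i^(-1.2), random signs/order
x = np.sign(rng.standard_normal(N)) * np.arange(1, N + 1) ** -1.2
x = x[rng.permutation(N)]

rel_err = np.linalg.norm(x - best_k_term(x, 50)) / np.linalg.norm(x)
print(f"relative error of the 50-term approximation: {rel_err:.3f}")
```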

Page 17

Restricted Isometry Property (RIP)

• Preserve the structure of sparse/compressible signals

• RIP of order 2K implies: for all K-sparse $x_1$ and $x_2$,
$(1 - \delta_{2K})\|x_1 - x_2\|_2^2 \le \|\Phi(x_1 - x_2)\|_2^2 \le (1 + \delta_{2K})\|x_1 - x_2\|_2^2$

[figure: union of K-planes]
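Verifying the RIP exactly is computationally intractable, but the near-isometry is easy to observe empirically on random 2K-sparse differences; a Monte Carlo sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, M = 512, 10, 160
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# x1 - x2 for K-sparse x1, x2 is at most 2K-sparse, so probe 2K-sparse vectors
ratios = []
for _ in range(1000):
    d = np.zeros(N)
    idx = rng.choice(N, size=2 * K, replace=False)
    d[idx] = rng.standard_normal(2 * K)
    ratios.append(np.linalg.norm(Phi @ d) / np.linalg.norm(d))

print(f"norm ratios in [{min(ratios):.3f}, {max(ratios):.3f}]")  # clustered near 1
```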

Page 18

Restricted Isometry Property (RIP)

• Preserve the structure of sparse/compressible signals

• A random sub-Gaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability if $M = O(K \log(N/K))$

[figure: union of K-planes]

Page 19

Recovery Algorithms

• Goal: given $y = \Phi x$, recover $x$

• $\ell_1$ and convex optimization formulations
– basis pursuit, Dantzig selector, Lasso, …

• Greedy algorithms
– orthogonal matching pursuit (OMP), iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP)
– at their core: iterative sparse approximation (see the OMP sketch below)
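As a concrete instance of the greedy family, a compact orthogonal matching pursuit (OMP) sketch, assuming noiseless measurements $y = \Phi x$:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit: greedily grow a K-atom support."""
    N = Phi.shape[1]
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(K):
        # select the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # least-squares fit on the selected columns, then update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat
```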

Page 20

Performance of Recovery

• Using $\ell_1$ methods, IT, CoSaMP

• Sparse signals

– noise-free measurements: exact recovery
– noisy measurements: stable recovery

• Compressible signals

– recovery as good as K-sparse approximation

CS recovery error ≤ C1 · (signal K-term approx error) + C2 · (noise)

Page 21

From Sparsity to Structured Sparsity

Page 22

Sparse Models

wavelets: natural images
Gabor atoms: chirps/tones
pixels: background-subtracted images

Page 23

Sparse Models

• Sparse/compressible signal model captures simplistic primary structure

[figure: sparse image]

Page 24

Beyond Sparse Models

• Sparse/compressible signal model captures simplistic primary structure

• Modern compression/processing algorithms capture richer secondary coefficient structure

wavelets: natural images
Gabor atoms: chirps/tones
pixels: background-subtracted images

Page 25

Sparse Signals

• Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces

Page 26

Model-Sparse Signals

• Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces

Page 27

Model-Sparse Signals

• Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces

• Structured subspaces

<> fewer subspaces

<> relaxed RIP

<> fewer measurements

Page 28

Model-Sparse Signals

• Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces

• Structured subspaces

<> increased signal discrimination

<> improved recovery perf.

<> faster recovery

Page 29

Model-based CS

Running Example: Tree-Sparse Signals

[Baraniuk, VC, Duarte, Hegde]

Page 30

Wavelet Sparse

• Typical of wavelet transforms of natural signals and images (piecewise smooth)

[figure: 1-D signals and their 1-D wavelet transforms; axes: time / coefficients vs. scale, amplitude]

Page 31

Tree-Sparse

• Model: K-sparse coefficients + significant coefficients lie on a rooted subtree

• Typical of wavelet transforms of natural signals and images (piecewise smooth)

Page 32

Tree-Sparse

• Model: K-sparse coefficients + significant coefficients lie on a rooted subtree

• Sparse approx: find the best set of coefficients
– sorting / hard thresholding

• Tree-sparse approx: find the best rooted subtree of coefficients (a greedy sketch follows below)
– CSSA [Baraniuk]
– dynamic programming [Donoho]
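For intuition, here is a simple greedy rooted-subtree approximation; it is a heuristic stand-in for CSSA or the exact dynamic program, and it assumes a binary tree on indices 0..N−1 with parent(i) = (i−1)//2 and root 0:

```python
import heapq
import numpy as np

def greedy_tree_approx(w, K):
    """Heuristic K-term approximation whose support is a rooted subtree."""
    N = len(w)
    chosen = {0}                                   # always keep the root
    # max-heap (via negated magnitudes) over children of the current subtree
    frontier = [(-abs(w[c]), c) for c in (1, 2) if c < N]
    heapq.heapify(frontier)
    while len(chosen) < K and frontier:
        _, i = heapq.heappop(frontier)             # largest frontier coefficient
        chosen.add(i)
        for c in (2 * i + 1, 2 * i + 2):           # expose i's children
            if c < N:
                heapq.heappush(frontier, (-abs(w[c]), c))
    wk = np.zeros_like(w)
    idx = sorted(chosen)
    wk[idx] = w[idx]
    return wk
```

By construction every selected node's parent is also selected, so the support is a rooted subtree, as the tree-sparse model requires.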

Page 33

• Model: K-sparse coefficients

• RIP: stable embedding when $M = O(K \log(N/K))$

Sparse

[figure: union of K-planes]

Page 34

Tree-Sparse

• Model: K-sparse coefficients + significant coefficients lie on a rooted subtree

• Tree-RIP: stable embedding when $M = O(K)$

[figure: reduced union of K-planes]

Page 35

Tree-Sparse

• Model: K-sparse coefficients + significant coefficients lie on a rooted subtree

• Tree-RIP: stable embedding when $M = O(K)$

• Recovery: new model-based algorithms [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]

Page 36

• Iterative Thresholding

Standard CS Recovery

[Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …]

Iterate (sketched in code below):
– update the signal estimate
– prune the signal estimate (best K-term approx)
– update the residual
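A minimal iterative hard thresholding (IHT) sketch of this loop; the unit step size assumes $\Phi$ is scaled to be near-isometric on sparse vectors, as in the earlier sketches:

```python
import numpy as np

def iht(Phi, y, K, iters=100):
    """Iterative hard thresholding for y = Phi @ x with x (nearly) K-sparse."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x                  # update the residual
        x = x + Phi.T @ r                # update the signal estimate (gradient step)
        drop = np.argsort(np.abs(x))[:-K]
        x[drop] = 0.0                    # prune: best K-term approximation
    return x
```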

Page 37

• Iterative Model Thresholding

Model-based CS Recovery

[VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]

Iterate (see the model-based sketch below):
– update the signal estimate
– prune the signal estimate (best K-term model approx)
– update the residual
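Model-based recovery changes a single line: the pruning step calls a model-approximation oracle instead of plain hard thresholding. A sketch that reuses greedy_tree_approx from the tree-sparse slide (any model approximation, e.g. block or cluster, can be plugged in):

```python
import numpy as np

def model_iht(Phi, y, K, model_approx, iters=100):
    """Model-based IHT: same loop as iht(), with model-based pruning."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x                          # update the residual
        x = model_approx(x + Phi.T @ r, K)       # prune: best K-term *model* approx
    return x

# e.g.: x_hat = model_iht(Phi, y, K, greedy_tree_approx)
```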

Page 38

Tree-Sparse Signal Recovery

[figure: tree-sparse signal recovery, N = 1024, M = 80]
– CoSaMP: MSE = 1.12
– L1-minimization: MSE = 0.751
– tree-sparse CoSaMP: MSE = 0.037

Page 39

Compressible Signals

• Real-world signals are compressible, not sparse

• Recall: compressible <> well approximated by sparse

– compressible signals lie close to a union of subspaces
– i.e., the approximation error decays rapidly as K grows

• If $\Phi$ has the RIP, then both sparse and compressible signals are stably recoverable

[figure: coefficient magnitude vs. sorted index]

Page 40

Model-Compressible Signals

• Model-compressible <> well approximated by model-sparse

– model-compressible signals lie close to a reduced union of subspaces

– i.e., the model-approximation error decays rapidly as K grows

Page 41

Model-Compressible Signals

• Model-compressible <> well approximated by model-sparse

– model-compressible signals lie close to a reduced union of subspaces

– i.e., the model-approximation error decays rapidly as K grows

• While the model-RIP enables stable model-sparse recovery, the model-RIP is not sufficient for stable model-compressible recovery at $M = O(K)$!

Page 42

Stable Recovery

• Stable model-compressible signal recovery at $M = O(K)$ requires that $\Phi$ have both: RIP + Restricted Amplification Property (RAmP)

• RAmP: controls the nonisometry of $\Phi$ on the approximation's residual subspaces

[diagram: approximation residual subspaces
– optimal K-term model recovery (error controlled by RIP)
– optimal 2K-term model recovery (error controlled by RIP)
– residual subspace (error not controlled by RIP)]

Page 43

Tree-RIP, Tree-RAmP

Theorem: An M×N iid sub-Gaussian random matrix has the Tree(K)-RIP if $M = O(K)$.

Theorem: An M×N iid sub-Gaussian random matrix has the Tree(K)-RAmP if $M = O(K)$.

Page 44

Simulation

• Number of samples M needed for correct recovery

• Piecewise cubic signals + wavelets

• Models/algorithms:
– compressible (CoSaMP)
– tree-compressible (tree-CoSaMP)

Page 45

Performance of Recovery

• Using model-based IT, CoSaMP with RIP and RAmP

• Model-sparse signals
– noise-free measurements: exact recovery
– noisy measurements: stable recovery

• Model-compressible signals

– recovery as good as K-model-sparse approximation

CS recovery error ≤ C1 · (signal K-term model approx error) + C2 · (noise)

[Baraniuk, VC, Duarte, Hegde]

Page 46

Other Useful Models

• When the model-based framework makes sense:
– a model with a fast approximation algorithm
– a sensing matrix with the model-RIP / model-RAmP

• Ex: block sparsity / signal ensembles [Tropp, Gilbert, Strauss], [Stojnic, Parvaresh, Hassibi], [Eldar, Mishali], [Baron, Duarte et al], [Baraniuk, VC, Duarte, Hegde]

• Ex: clustered signals [VC, Duarte, Hegde, Baraniuk], [VC, Indyk, Hegde, Baraniuk]

• Ex: neuronal spike trains [Hegde, Duarte, VC] – Best paper award

Page 47

Block-Sparse Signal

[figure: block-sparse signal recovery; blocks are pre-specified]
– CoSaMP: MSE = 0.723
– block-sparse model recovery: MSE = 0.015
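With pre-specified contiguous blocks, the model-approximation step is plain block thresholding; a sketch (the block length B is an assumed parameter) that can be passed to the model_iht sketch above:

```python
import numpy as np

def block_approx(x, K, B=8):
    """Best K-block approximation: keep the K most energetic length-B blocks."""
    N = len(x)
    assert N % B == 0, "sketch assumes the block length divides N"
    blocks = x.reshape(N // B, B)
    energy = (blocks ** 2).sum(axis=1)
    keep = np.argsort(energy)[-K:]       # indices of the K most energetic blocks
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(N)
```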

Page 48

Block-Compressible Signal

[figure: block-compressible signal recovery]
– CoSaMP: MSE = 0.711
– block-sparse recovery: MSE = 0.195
– best 5-block approximation: MSE = 0.116

Page 49

Clustered Sparsity

• (K, C)-sparse signals (1-D)
– K-sparse within at most C clusters

• For stable recovery (model-RIP + RAmP)

• Model approximation using dynamic programming

• Includes block sparsity as a special case

[VC, Indyk, Hegde, Baraniuk]

Page 50

Clustered Sparsity

[figure: target; Ising-model recovery; CoSaMP recovery; LP (FPC) recovery]

• Model clustering of significant pixels in the space domain using a graphical model (MRF)

• Ising-model approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]

Page 51

Neuronal Spike Trains

• Model the firing process of a single neuron via a 1-D Poisson process with spike trains

- Exploit the refractory period of neurons

• Model approximation problem:

– Find a K-sparse signal whose nonzero coefficients are separated by at least $\Delta$ (the refractory period)

Page 52

Neuronal Spike Trains

• Model the firing process of a single neuron via a 1-D Poisson process with spike trains

– Stable recovery

• Model approximation solution:
– integer program
– efficient & provable solution due to the total unimodularity of the linear constraints (a greedy heuristic sketch follows below)

[Hegde, Duarte, VC; Best Student Paper Award at SPARS'09]
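The talk's exact solver is an integer program made tractable by total unimodularity; as a simple illustrative stand-in (not the provable method), a greedy heuristic that enforces the minimum separation $\Delta$:

```python
import numpy as np

def refractory_approx(x, K, delta):
    """Greedy K-sparse approximation with spikes >= delta samples apart."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    blocked = np.zeros(len(x), dtype=bool)
    picked = 0
    for i in np.argsort(np.abs(x))[::-1]:        # largest magnitudes first
        if picked == K:
            break
        if not blocked[i]:
            out[i] = x[i]
            picked += 1
            lo, hi = max(0, i - delta + 1), min(len(x), i + delta)
            blocked[lo:hi] = True                # enforce the refractory period
    return out
```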

Page 53

Page 54

Signal recovery is not always required.

ELVIS:

Enhanced Localization via Incoherence and Sparsity

Page 55

Localization Problem

• Goal: localize targets by fusing measurements from a network of sensors

[VC, Duarte, Baraniuk; Model and Zibulevsky; VC, Gurbuz, McClellan, Chellappa; Malioutov, Cetin, and Willsky; Chen et al.]

Page 56

Localization Problem

• Goal: localize targets by fusing measurements from a network of sensors
– collect time-signal data
– communicate signals across the network
– solve an optimization problem

Page 57

Bottlenecks

• Goal: localize targets by fusing measurements from a network of sensors
– collect time-signal data → requires potentially high-rate (Nyquist) sampling
– communicate signals across the network → potentially large communication burden
– solve an optimization problem

Need compression

Page 58

An Important Detail

• Solve two entangled problems for localization

– Estimate source locations

– Estimate source signals

Page 59

ELVIS

• Instead, solve one localization problem
– estimate source locations by exploiting random projections of the observed signals
– estimate source signals

Page 60

ELVIS

• Instead, solve one localization problem
– estimate source locations by exploiting random projections of the observed signals
– estimate source signals

• Bayesian model order selection & MAP estimation results in a decentralized sparse approximation framework that leverages

– Source sparsity

– Incoherence of sources

– Spatial sparsity of sources

[VC, Boufounos, Baraniuk, Gilbert, Strauss]

Page 61

ELVIS

• Use random projections of the observed signals in two ways:
– create local sensor dictionaries that sparsify source locations
– create intersensor communication messages

[diagram: ELVIS dictionary for K targets on an N-dim grid, populated using recovered signals and random iid entries]

Page 62

ELVIS

• Use random projections of the observed signals in two ways:
– create local sensor dictionaries that sparsify source locations → sample at the source sparsity
– create intersensor communication messages → communicate at the spatial sparsity; robust to (i) quantization, (ii) packet drops

No Signal Reconstruction

ELVIS Dictionary

Page 63

ELVIS

• Use random projections of the observed signals in two ways:
– create local sensor dictionaries that sparsify source locations → sample at the source sparsity
– create intersensor communication messages → communicate at the spatial sparsity; robust to (i) quantization, (ii) packet drops

• Provable greedy estimation for ELVIS dictionaries: bearing pursuit

No Signal Reconstruction

Page 64

Field Data Results

• 5-vehicle convoy

• >100× sub-Nyquist

Page 65

Conclusions

• Why CS works: stable embedding for signals with concise geometric structure

• Sparse signals >> model-sparse signals
• Compressible signals >> model-compressible signals
• Greedy model-based signal recovery algorithms
– upshot: provably fewer measurements, more stable recovery
– new concept: RIP >> RAmP

• ELVIS: localization via dimensionality reduction

Page 66

Volkan Cevher / volkan@rice.edu