ENGG5781 Matrix Analysis and Computations Lecture 0: Overview Wing-Kin (Ken) Ma 2019–2020 Term 1 Department of Electronic Engineering The Chinese University of Hong Kong


ENGG5781 Matrix Analysis and ComputationsLecture 0: Overview

Wing-Kin (Ken) Ma

2019–2020 Term 1

Department of Electronic EngineeringThe Chinese University of Hong Kong


Course Information

W.-K. Ma, ENGG5781 Matrix Analysis and Computations, CUHK, 2019–2020 Term 1.


General Information

• Instructor: Wing-Kin Ma

– office: SHB 323
– e-mail: [email protected]

• Lecture hours and venue:

– Monday 11:30am–12:15pm, Lee Shau Kee Architecture Building G03

– Tuesday 4:30pm–6:15pm, Wu Ho Man Yuen Building 507

• Class website: http://www.ee.cuhk.edu.hk/~wkma/engg5781


Course Contents

• This is a foundation course on matrix analysis and computations, which are widely used in many different fields, e.g.,

– machine learning, computer vision,
– systems and control, signal and image processing, communications, networks,
– optimization, and many more...

• Aim: to cover matrix analysis and computations at an advanced or research level.

• Scope:

– basic matrix concepts, subspaces, norms
– linear least squares
– eigendecomposition, singular value decomposition, positive semidefinite matrices
– linear systems of equations, LU decomposition, Cholesky decomposition
– pseudo-inverse, QR decomposition
– (advanced) tensor decomposition, advanced matrix calculus, compressive sensing, structured matrix factorization


Learning Resources

• Notes by the instructor will be provided.

• Recommended readings:

– Gene H. Golub and Charles F. van Loan, Matrix Computations (Fourth Edition), Johns Hopkins University Press, 2013.

– Roger A. Horn and Charles R. Johnson, Matrix Analysis (Second Edition), Cambridge University Press, 2012.

– Jan R. Magnus and Heinz Neudecker, Matrix Differential Calculus with Applications in Statistics and Econometrics (Third Edition), John Wiley and Sons, New York, 2007.

– Giuseppe Calafiore and Laurent El Ghaoui, Optimization Models, Cambridge University Press, 2014.

– ECE 712 Course Notes by Prof. Jim Reilly, McMaster University, Canada (friendly notes for engineers): http://www.ece.mcmaster.ca/faculty/reilly/ece712/course_notes.htm


Assessment and Academic Honesty

• Assessment:

– Assignments: 60%

∗ may contain MATLAB questions

∗ where to submit: I will put a submission box outside my office, SHB 323

∗ no late submissions will be accepted, except in exceptional cases.

– Final examination: 40%

• Academic honesty: Students are strongly advised to read

– our homework guidelines: http://www.ee.cuhk.edu.hk/~wkma/engg5781/hw/hw_guidelines.pdf

– the University’s guidelines on academic honesty: http://www.cuhk.edu.hk/policy/academichonesty

You will be assumed to understand the aspects described therein.


Additional Notice

• Sitting in is welcome; please send me your e-mail address so that I can keep you updated about the course.

• Course helpers whom you can consult:

– Zihao Huang, [email protected]

– Yuening Li, [email protected]
– Qiong Wu, [email protected]

• You can also get consultation from me; send me an e-mail for an appointment.

• Do check your CUHK Link e-mail address regularly; it is the only way we can reach you.

• CUHK Blackboard will be used to announce scores and for online homework submission.


A Glimpse of Topics


Least Squares (LS)

• Problem: given A ∈ R^{m×n} and y ∈ R^m, solve

min_{x ∈ R^n} ‖y − Ax‖_2^2,

where ‖·‖_2 is the Euclidean norm; i.e., ‖x‖_2 = (∑_{i=1}^n |x_i|^2)^{1/2}.

• widely used in science, engineering, and mathematics

• assuming a tall, full-rank A, the LS solution is uniquely given by

x_LS = (A^T A)^{-1} A^T y.
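The closed form above can be checked numerically. A minimal NumPy sketch with made-up data (the variable names are illustrative, not from the course):

```python
import numpy as np

# Solve min_x ||y - A x||_2^2 for a tall, full-rank A, and compare a
# numerical LS solver against the closed form x_LS = (A^T A)^{-1} A^T y.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))          # tall (m > n); full rank w.p. 1
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true                           # noiseless, so LS recovers x_true

x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)   # numerically preferred route
x_normal = np.linalg.solve(A.T @ A, A.T @ y)   # the closed form above
```

In practice one avoids forming A^T A explicitly (it squares the condition number); lstsq works from a factorization of A instead.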


Application Example: Linear Prediction (LP)

• let {y_t}_{t≥0} be a time series.

• Model (autoregressive (AR) model):

y_t = a_1 y_{t−1} + a_2 y_{t−2} + · · · + a_q y_{t−q} + v_t,   t = 0, 1, 2, . . . ,

for some coefficients {a_i}_{i=1}^q, where v_t is noise or modeling error.

• Problem: estimate {a_i}_{i=1}^q from {y_t}_{t≥0}; this can be formulated as LS

• Applications: time-series prediction, speech analysis and coding, spectral estimation, . . .
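The AR estimation step is an ordinary LS problem; here is a hedged NumPy sketch on synthetic data (the coefficients and noise level are made up for illustration):

```python
import numpy as np

# Simulate y_t = a_1 y_{t-1} + a_2 y_{t-2} + v_t, then estimate {a_i}
# by stacking rows [y_{t-1}, ..., y_{t-q}] and solving LS.
rng = np.random.default_rng(1)
a_true = np.array([0.6, 0.3])                  # q = 2, chosen for illustration
q, T = len(a_true), 500
y = np.zeros(T)
y[:q] = rng.standard_normal(q)
for t in range(q, T):
    y[t] = a_true @ y[t - q:t][::-1] + 0.01 * rng.standard_normal()

# Column i holds lag i+1; row t is [y_{t-1}, ..., y_{t-q}], target y_t.
Y = np.column_stack([y[q - 1 - i:T - 1 - i] for i in range(q)])
a_hat, *_ = np.linalg.lstsq(Y, y[q:], rcond=None)
```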


A Toy Demo: Predicting Hang Seng Index

[Figure: Hang Seng Index (×10⁴) versus day, over roughly 60 days, with the index, training-phase fit, and prediction overlaid.]

blue: the Hang Seng Index during a certain time period.

red: training phase; the line is ∑_{i=1}^q a_i y_{t−i}; a is obtained by LS; q = 10.

green: prediction phase; the line is y_t = ∑_{i=1}^q a_i y_{t−i}, with the same a as in the training phase.


A Real Example: Real-Time Prediction of Flu Activity

Tracking influenza outbreaks with ARGO, a model that combines the AR model with Google search data.

Source: [Yang-Santillana-Kou2015].


Eigenvalue Problem

• Problem: given A ∈ R^{n×n}, find a nonzero v ∈ R^n such that

Av = λv, for some λ.

• Eigendecomposition: let A ∈ R^{n×n} be symmetric; i.e., a_{ij} = a_{ji} for all i, j. Then A admits a decomposition

A = QΛQ^T,

where Q ∈ R^{n×n} is orthogonal, i.e., Q^T Q = I, and Λ = Diag(λ_1, . . . , λ_n).

• also widely used, either as an analysis tool or as a computational tool

• no closed form in general; can be numerically computed
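Numerically, the symmetric eigendecomposition above is a library call; a minimal NumPy sketch on a random symmetric matrix:

```python
import numpy as np

# Eigendecomposition A = Q Λ Q^T of a symmetric matrix, computed
# numerically (there is no closed form in general).
rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # symmetrize

lam, Q = np.linalg.eigh(A)             # eigenvalues ascending; Q orthogonal
```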


Application Example: PageRank

• PageRank is an algorithm used by Google to rank the pages of a search result.

• the idea is to use the counts of links among pages to determine each page’s importance.

Source: Wiki.


One-Page Explanation of How PageRank Works

• Model:

∑_{j∈L_i} v_j / c_j = v_i,   i = 1, . . . , n,

where c_j is the number of outgoing links from page j; L_i is the set of pages with a link to page i; v_i is the importance score of page i.

• as an example,

A = [ 0    1/2   1    1/3
      0    0     0    1/3
      0    1/2   0    1/3
      0    0     0    0   ],   v = (v_1, v_2, v_3, v_4)^T,   Av = v.

• finding v is an eigenvalue problem, with n on the order of millions!

• further reading: [Bryan-Tanya2006]
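As a sketch of the computation (on a made-up 4-page link graph, not the slide's example), power iteration finds the score vector v with Av = v; real PageRank additionally handles dangling pages and adds damping:

```python
import numpy as np

# Column j of A spreads page j's score evenly over its c_j outgoing
# links, so A is column-stochastic; power iteration converges to the
# importance scores v satisfying A v = v.
A = np.array([[0.0, 0.5, 1.0, 0.0],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.5, 0.0, 0.0]])

v = np.full(4, 0.25)                   # start uniform; entries sum to 1
for _ in range(200):
    v = A @ v                          # power iteration preserves the sum
```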


Low-Rank Matrix Approximation

• Problem: given Y ∈ R^{m×n} and an integer r < min{m, n}, find a pair (A, B) ∈ R^{m×r} × R^{r×n} such that either Y = AB or Y ≈ AB.

• Formulation:

min_{A∈R^{m×r}, B∈R^{r×n}} ‖Y − AB‖_F^2,

where ‖·‖_F is the Frobenius norm (the matrix Euclidean norm).

• Applications: dimensionality reduction, extracting meaningful features from data, low-rank modeling, . . .
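A best rank-r pair (A, B) can be read off the truncated SVD; a small NumPy sketch with random data:

```python
import numpy as np

# Rank-r approximation Y ≈ AB from the truncated SVD; the Frobenius
# error equals the energy in the discarded singular values.
rng = np.random.default_rng(3)
Y = rng.standard_normal((20, 12))
r = 4

U, s, Vt = np.linalg.svd(Y, full_matrices=False)
A = U[:, :r] * s[:r]                   # m x r factor
B = Vt[:r]                             # r x n factor

err = np.linalg.norm(Y - A @ B, 'fro')
tail = np.sqrt(np.sum(s[r:] ** 2))     # matches err
```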


Application Example: Image Compression

• let Y ∈ R^{m×n} be an image.

[Figure: (a) original image, size = 102 × 1347; (b) truncated SVD, k = 5; (c) truncated SVD, k = 10; (d) truncated SVD, k = 20.]

• store the low-rank factor pair (A, B), instead of Y.


Application Example: Principal Component Analysis (PCA)

• Aim: given a set of data points {y_1, y_2, . . . , y_n} ⊂ R^m and an integer k < min{m, n}, find a low-dimensional representation

y_i = Q c_i + μ + e_i,   i = 1, . . . , n,

where Q ∈ R^{m×k} is a basis; the c_i’s are coefficients; μ is a base; the e_i’s are errors.
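One common way to fit this model (a hedged sketch, not necessarily the course's derivation) is via the SVD of the mean-centered data: μ is the sample mean, Q holds the top-k left singular vectors, and c_i = Q^T(y_i − μ):

```python
import numpy as np

# PCA sketch: center the data, take the top-k left singular vectors as
# the basis Q, project to get coefficients; the leftover is the error.
rng = np.random.default_rng(4)
m, n, k = 10, 50, 3
Y = rng.standard_normal((m, n))        # columns are the data points y_i

mu = Y.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Y - mu, full_matrices=False)
Q = U[:, :k]                           # m x k orthonormal basis
C = Q.T @ (Y - mu)                     # k x n coefficients c_i
E = (Y - mu) - Q @ C                   # errors e_i
```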


Toy Demo: Dimensionality Reduction of a Face Image Dataset

A face image dataset. Image size = 112 × 92, number of face images = 400. Each y_i is the vectorization of one face image, leading to m = 112 × 92 = 10304, n = 400.


Toy Demo: Dimensionality Reduction of a Face Image Dataset

[Figure: the mean face, the 1st, 2nd and 3rd principal left singular vectors, and the 400th left singular vector. Plot: energy versus rank (log scale), showing the energy concentrating in the leading components.]


Singular Value Decomposition (SVD)

• SVD: any Y ∈ R^{m×n} can be decomposed into

Y = UΣV^T,

where U ∈ R^{m×m} and V ∈ R^{n×n} are orthogonal, and Σ ∈ R^{m×n} takes a diagonal form.

• also a widely used analysis and computational tool; can be numerically computed

• SVD can be used to solve the low-rank matrix approximation problem

min_{A∈R^{m×r}, B∈R^{r×n}} ‖Y − AB‖_F^2.
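A quick numerical check of this claim on random data (the competitor pair is arbitrary): by the Eckart–Young theorem, the truncated-SVD solution is never beaten by another rank-r factorization.

```python
import numpy as np

# Compare the Frobenius error of the rank-r truncated SVD against an
# arbitrary rank-r competitor (A, B); the SVD solution does no worse.
rng = np.random.default_rng(5)
Y = rng.standard_normal((15, 10))
r = 3

U, s, Vt = np.linalg.svd(Y, full_matrices=False)
svd_err = np.linalg.norm(Y - (U[:, :r] * s[:r]) @ Vt[:r], 'fro')

A_rand = rng.standard_normal((15, r))
B_rand = rng.standard_normal((r, 10))
rand_err = np.linalg.norm(Y - A_rand @ B_rand, 'fro')
```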


Linear System of Equations

• Problem: given A ∈ R^{n×n} and y ∈ R^n, solve

Ax = y.

• Question 1: How to solve it?

– don’t tell me answers like x = inv(A)*y or x = A\y in MATLAB!

– this is about matrix computations

• Question 2: How to solve it when n is very large?

– the generic trick x = A\y is too slow when n is very large

– a better understanding of matrix computations will enable you to exploit problem structure to build efficient solvers

• Question 3: How sensitive is the solution x when A and y contain errors?

– key to system analysis, or building robust solutions
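As one illustration of Question 2 (a hedged sketch, not the course's treatment): when A is symmetric positive definite, a Cholesky factorization A = L L^T reduces Ax = y to two triangular solves, each only O(n²):

```python
import numpy as np

# Hand-written triangular solves, to show the structure being exploited.
def forward_sub(L, b):                 # solve L z = b, L lower triangular
    z = np.zeros_like(b)
    for i in range(len(b)):
        z[i] = (b[i] - L[i, :i] @ z[:i]) / L[i, i]
    return z

def back_sub(U, b):                    # solve U x = b, U upper triangular
    x = np.zeros_like(b)
    for i in reversed(range(len(b))):
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

rng = np.random.default_rng(6)
n = 30
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)            # SPD by construction
y = rng.standard_normal(n)

L = np.linalg.cholesky(A)              # A = L L^T
x = back_sub(L.T, forward_sub(L, y))   # L z = y, then L^T x = z
```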


Why Are Matrix Analysis and Computations Important?

• as said, areas such as signal processing, image processing, machine learning, optimization, computer vision, control, communications, . . ., use matrix operations extensively

• it helps you build the foundations for understanding “hot” topics such as

– sparse recovery;

– structured low-rank matrix approximation; matrix completion.


The Sparse Recovery Problem

Problem: given y ∈ R^m and A ∈ R^{m×n} with m < n, find a sparsest x ∈ R^n such that

y = Ax.

[Figure: the equation y = Ax, annotated with y as the measurements and x as a sparse vector with few nonzero entries.]

• by sparsest, we mean that x should have as many zero elements as possible.
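Many algorithms attack this problem; as one hedged illustration (the slides do not fix an algorithm), here is a minimal orthogonal matching pursuit (OMP) sketch on synthetic data:

```python
import numpy as np

def omp(A, y, k):
    """Greedy sparse recovery: pick k columns of A one at a time by
    correlation with the residual, LS-fitting on the current support."""
    m, n = A.shape
    r, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))      # most correlated column
        support.append(j)
        xs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ xs
    x = np.zeros(n)
    x[support] = xs
    return x

rng = np.random.default_rng(7)
m, n, k = 40, 100, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 17, 60]] = [1.5, -2.0, 1.0]           # 3-sparse ground truth
x_hat = omp(A, A @ x_true, k)
```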


Application: Magnetic resonance imaging (MRI)

Problem: MRI image reconstruction.

Fig. (a) shows the original test image. Fig. (b) shows the sampling region in the frequency domain; Fourier coefficients are sampled along 22 approximately radial lines. Source: [Candes-Romberg-Tao2006].


Application: Magnetic resonance imaging (MRI)

Problem: MRI image reconstruction.

Fig. (c) is the recovery obtained by setting the unobserved Fourier coefficients to zero. Fig. (d) is the recovery by a sparse recovery solution. Source: [Candes-Romberg-Tao2006].


Low-Rank Matrix Completion

• Application: recommendation systems

– in 2009, Netflix awarded $1 million to the team that performed best in recommending new movies to users based on their previous preferences (www.netflixprize.com).

• let Z be a preference matrix, where z_{ij} records how user i likes movie j (rows are users, columns are movies):

Z = [ 2  3  1  ?  ?  5  5
      1  ?  4  2  ?  ?  ?
      ?  3  1  ?  2  2  2
      ?  ?  ?  3  ?  1  5 ]

– some entries z_{ij} are missing, since no one watches all the movies.

– Z is assumed to be of low rank; research shows that only a few factors affect users’ preferences.

• Aim: guess the unknown z_{ij}’s from the known ones.


Low-Rank Matrix Completion

• The 2009 Netflix Grand Prize winners used low-rank matrix approximations [Koren-Bell-Volinsky2009].

• Formulation (oversimplified):

min_{A∈R^{m×r}, B∈R^{r×n}} ∑_{(i,j)∈Ω} |z_{ij} − [AB]_{ij}|^2,

where Ω is an index set that indicates the known entries of Z.

• cannot be solved by SVD

• in the recommendation system application, it’s a large-scale problem

• alternating LS may be used


Toy Demonstration of Low-Rank Matrix Completion

Left: an incomplete image with 40% missing pixels. Right: the low-rank matrix completion result, with r = 120.


Nonnegative Matrix Factorization (NMF)

• Aim: we want the factors to be non-negative

• Formulation:

min_{A∈R^{m×r}, B∈R^{r×n}} ‖Y − AB‖_F^2   s.t. A ≥ 0, B ≥ 0,

where X ≥ 0 means that x_{ij} ≥ 0 for all i, j.

• arguably a topic in optimization, but worth noticing

• empirical studies have found it able to extract meaningful features

• numerous applications, e.g., in machine learning, signal processing, remote sensing
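A hedged sketch of one classical algorithm for this problem, the Lee–Seung multiplicative updates for the Frobenius objective (synthetic nonnegative data; eps guards against division by zero):

```python
import numpy as np

# Multiplicative updates keep A and B nonnegative, because every factor
# in the update ratio is nonnegative.
rng = np.random.default_rng(9)
m, n, r, eps = 10, 14, 3, 1e-12
Y = rng.random((m, r)) @ rng.random((r, n))    # nonnegative, rank <= r

A = rng.random((m, r))
B = rng.random((r, n))
for _ in range(500):
    B *= (A.T @ Y) / (A.T @ A @ B + eps)
    A *= (Y @ B.T) / (A @ B @ B.T + eps)

rel_err = np.linalg.norm(Y - A @ B, 'fro') / np.linalg.norm(Y, 'fro')
```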


NMF Examples

• Image Processing:

[Figure from [Lee-Seung1999]: NMF learns a parts-based representation of faces, whereas vector quantization (VQ) and principal component analysis (PCA) learn holistic representations; the NMF basis elements are sparse and extract facial features such as eyes, noses and lips. Source: [Lee-Seung1999].]


• Text Mining:

– basis elements allow us to recover the different topics;
– weights allow us to assign each text to its corresponding topics.


Toy Demonstration of NMF

A face image dataset. Image size = 101 × 101, number of face images = 13232. Each y_i is the vectorization of one face image, leading to m = 101 × 101 = 10201, n = 13232.


Toy Demonstration of NMF: NMF-Extracted Features

NMF settings: r = 49, Lee-Seung multiplicative update with 5000 iterations.


Toy Demonstration of NMF: Comparison with PCA

[Figure: the mean face, the 1st, 2nd and 3rd principal left singular vectors, and the last principal left singular vector. Plot: energy versus rank (log scale), showing the energy concentrating in the leading components.]


A Few More Words to Say

• things I hope you will learn

– how to read how people manipulate matrix operations, and how to manipulate them yourself (learn to use a tool);

– what applications we can do, or how to find new applications of our own (learn to apply a tool);

– deep analysis skills (why is this tool valid? can I invent new tools? this is key to some topics, and something you should go through at least once in your lifetime)

• feedback is welcome; closed-loop systems often work better than open-loop ones


References

[Yang-Santillana-Kou2015] S. Yang, M. Santillana, and S. C. Kou, “Accurate estimation of influenza epidemics using Google search data via ARGO,” Proceedings of the National Academy of Sciences, vol. 112, no. 47, pp. 14473–14478, 2015.

[Bryan-Tanya2006] K. Bryan and T. Leise, “The $25,000,000,000 eigenvector: The linear algebra behind Google,” SIAM Review, vol. 48, no. 3, pp. 569–581, 2006.

[Candes-Romberg-Tao2006] E. J. Candès, J. Romberg, and T. Tao, “Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information,” IEEE Trans. Information Theory, vol. 52, no. 2, pp. 489–509, 2006.

[Koren-Bell-Volinsky2009] Y. Koren, R. Bell, and C. Volinsky, “Matrix factorization techniques for recommender systems,” IEEE Computer, vol. 42, no. 8, pp. 30–37, 2009.

[Lee-Seung1999] D. D. Lee and H. S. Seung, “Learning the parts of objects by non-negative matrix factorization,” Nature, vol. 401, no. 6755, pp. 788–791, 1999.
