
Page 1

Linear Algebra Prerequisites

Jana Kosecka, kosecka@gmu.edu

Recommended text: Linear Algebra and Its Applications by G. Strang

Why do we need Linear Algebra?

• We will associate coordinates to:
  – 3D points in the scene
  – 2D points in the CCD array
  – 2D points in the image
• Coordinates/data points will be used to:
  – perform geometrical transformations
  – associate 3D with 2D points
• Images are matrices of numbers; we will find properties of these numbers.

Page 2

Matrices

Sum: $C_{m \times n} = A_{m \times n} + B_{m \times n}$, with entries $c_{ij} = a_{ij} + b_{ij}$

Example:

$\begin{bmatrix} 2 & 5 \\ 3 & 1 \end{bmatrix} + \begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix} = \begin{bmatrix} 8 & 7 \\ 4 & 6 \end{bmatrix}$

A and B must have the same dimensions.

Matrices

Product: $C_{n \times p} = A_{n \times m} B_{m \times p}$, with entries $c_{ij} = \sum_{k=1}^{m} a_{ik} b_{kj}$

Examples:

$\begin{bmatrix} 2 & 5 \\ 3 & 1 \end{bmatrix} \begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix} = \begin{bmatrix} 17 & 29 \\ 19 & 11 \end{bmatrix}$

$\begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix} \begin{bmatrix} 2 & 5 \\ 3 & 1 \end{bmatrix} = \begin{bmatrix} 18 & 32 \\ 17 & 10 \end{bmatrix}$

In general $A_{n \times n} B_{n \times n} \neq B_{n \times n} A_{n \times n}$.

A and B must have compatible dimensions.
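As a quick check, the examples above can be reproduced in MATLAB (a minimal sketch; the matrices are the ones from this slide):

A = [2 5; 3 1];
B = [6 2; 1 5];
A + B    % sum: [8 7; 4 6]
A * B    % product: [17 29; 19 11]
B * A    % [18 32; 17 10], so in general A*B ~= B*A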

Page 3

Matrices

Transpose: $C_{n \times m} = A_{m \times n}^{T}$, with entries $c_{ij} = a_{ji}$

$(AB)^{T} = B^{T}A^{T}$

$(A + B)^{T} = A^{T} + B^{T}$

If $A^{T} = A$, A is symmetric.

Examples:

$\begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix}^{T} = \begin{bmatrix} 6 & 1 \\ 2 & 5 \end{bmatrix}$

$\begin{bmatrix} 6 & 2 \\ 1 & 5 \\ 3 & 8 \end{bmatrix}^{T} = \begin{bmatrix} 6 & 1 & 3 \\ 2 & 5 & 8 \end{bmatrix}$

Matrices

Determinant: A must be square

$\det \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = a_{11}a_{22} - a_{12}a_{21}$

Example:

$\det \begin{bmatrix} 2 & 5 \\ 3 & 1 \end{bmatrix} = 2 - 15 = -13$

For a 3×3 matrix, expand along the first row:

$\det \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = a_{11}\det\begin{bmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{bmatrix} - a_{12}\det\begin{bmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{bmatrix} + a_{13}\det\begin{bmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{bmatrix}$
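In MATLAB the determinant is computed with det. A minimal sketch checking the 2×2 example above, plus the cofactor expansion on a 3×3 matrix (the 3×3 entries are arbitrary illustration values, not from the slide):

A = [2 5; 3 1];
det(A)                          % 2*1 - 5*3 = -13

M = [1 2 3; 4 5 6; 7 8 10];     % arbitrary 3x3 example
c = M(1,1)*det(M(2:3,2:3)) ...  % cofactor expansion along the first row
  - M(1,2)*det(M(2:3,[1 3])) ...
  + M(1,3)*det(M(2:3,1:2));
c - det(M)                      % 0, up to round-off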

Page 4

Matrices

Inverse: A must be square

$A_{n \times n}^{-1} A_{n \times n} = A_{n \times n} A_{n \times n}^{-1} = I$

$\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}^{-1} = \frac{1}{a_{11}a_{22} - a_{12}a_{21}} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$

Example:

$\begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix}^{-1} = \frac{1}{28} \begin{bmatrix} 5 & -2 \\ -1 & 6 \end{bmatrix}$

Check:

$\begin{bmatrix} 6 & 2 \\ 1 & 5 \end{bmatrix} \cdot \frac{1}{28}\begin{bmatrix} 5 & -2 \\ -1 & 6 \end{bmatrix} = \frac{1}{28}\begin{bmatrix} 28 & 0 \\ 0 & 28 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$
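The same example in MATLAB (a minimal sketch; inv is fine for a 2×2 demonstration, though for solving systems the backslash operator shown later is usually preferred):

A = [6 2; 1 5];
Ainv = inv(A);   % (1/28) * [5 -2; -1 6]
A * Ainv         % identity, up to round-off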

2D, 3D Vectors

[Figure: vector v from the origin to point P, with components x1, x2 and angle θ]

Magnitude: $\|v\| = \sqrt{x_1^2 + x_2^2}$

Orientation: $\theta = \tan^{-1}\!\left(\frac{x_2}{x_1}\right)$

$\left(\frac{x_1}{\|v\|}, \frac{x_2}{\|v\|}\right)$ is a unit vector.

If $\|v\| = 1$, v is a UNIT vector.
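In MATLAB (a minimal sketch with an arbitrary example vector; atan2 is used rather than tan⁻¹ so the quadrant is handled correctly):

v = [3; 4];
mag = norm(v);              % sqrt(3^2 + 4^2) = 5
theta = atan2(v(2), v(1));  % orientation in radians
u = v / norm(v);            % unit vector, norm(u) == 1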

Page 5

Vector Addition

[Figure: parallelogram construction of u + v from vectors u and v]

Vector Subtraction

[Figure: triangle construction of u − v from vectors u and v]

Page 6

Scalar Product

[Figure: vector v and its scalar multiple av]

Inner (dot) Product

[Figure: vectors u and v with angle α between them]

$u \cdot v = \|u\|\,\|v\|\cos\alpha$

The inner product is a SCALAR!

$\|v\| = \sqrt{v \cdot v}$ defines the norm of a vector.

Page 7

Vector (cross) Product

$u = v \times w$

The cross product is a VECTOR!

Magnitude: $\|u\| = \|v \times w\| = \|v\|\,\|w\|\sin\alpha$

Orientation: u is orthogonal to both v and w (right-hand rule).

Orthonormal Basis in 3D

Standard basis vectors: $e_1 = (1,0,0)^T$, $e_2 = (0,1,0)^T$, $e_3 = (0,0,1)^T$

Coordinates of a point in space: $P = x_1 e_1 + x_2 e_2 + x_3 e_3$

Page 8

Vector (Cross) Product Computation

$v \times w = \hat{v}\,w$, where $\hat{v}$ is the skew-symmetric matrix associated with the vector $v = (v_1, v_2, v_3)^T$:

$\hat{v} = \begin{bmatrix} 0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}$
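A minimal MATLAB sketch of this computation (v and w are arbitrary illustration vectors; v_hat is the skew-symmetric matrix built from v):

v = [1; 2; 3];
w = [4; 5; 6];
v_hat = [ 0     -v(3)  v(2);
          v(3)   0    -v(1);
         -v(2)   v(1)  0 ];   % skew-symmetric matrix of v
v_hat * w                     % same result as cross(v, w)
cross(v, w)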

Matrices – meaning

An n × m matrix A represents a linear transformation: y = Ax maps points x from m-dimensional space to points y in n-dimensional space.

Special case: the matrix is square. Example: a covariance matrix – the symmetric square matrix associated with the data points (after the mean has been subtracted), e.g. in 2D.

Page 9

Geometric interpretation

Row solution: each equation defines a line in 2D space; the equations are considered in isolation.

Column solution: the right-hand side is a linear combination of the column vectors in 2D. We already know how to multiply a vector by a scalar and add vectors.

Linear equations: when is the RHS a linear combination of the columns on the LHS?

Solving n linear equations with n unknowns: if the matrix is invertible (its columns are linearly independent), compute the inverse, $x = A^{-1}b$.

The same interpretation carries over to 3D.

Page 10

Linear equations

Not all matrices are invertible.

– The inverse of a 2×2 matrix exists when its determinant is non-zero.
– The inverse of a diagonal matrix is obtained by taking the reciprocal of each diagonal entry.

Computing the inverse: solve for the columns independently, or use the Gauss-Jordan method.
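In practice the inverse is rarely formed explicitly; MATLAB's backslash operator solves the system by elimination directly. A minimal sketch with an arbitrary invertible example:

A = [6 2; 1 5];
b = [10; 11];
x = A \ b        % solves A*x = b by Gaussian elimination; here x = [1; 2]
x2 = inv(A)*b    % same result, but slower and less accurate in general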

Vector spaces (informally)

• A vector space of n-dimensional columns with real entries.
• Operations of vector addition and scalar multiplication.
• Addition of vectors and multiplication of a vector by a scalar always produce vectors that lie in the space.
• Matrices also make up a vector space – e.g. consider all 3×3 matrices as elements of the space.

Page 11

Vector subspace

A subspace of a vector space is a non-empty set of vectors closed under vector addition and scalar multiplication.

Example: an over-constrained system, with more equations than unknowns. The solution exists if b is in the subspace spanned by the vectors u and v (the columns of the matrix).

Linear Systems – Nullspace

Three cases:
1. The matrix is square and invertible.
2. The matrix is square and non-invertible.
3. The matrix is non-square, with more constraints than unknowns.

A solution exists when b is in the column space of A.

Special case b = 0: all the vectors x which satisfy Ax = 0 lie in the NULLSPACE of matrix A (see later).

Page 12

Basis

An n × n matrix A is invertible if it is of full rank.

Rank of a matrix: the number of linearly independent rows (see the definition on the next page).

If the rows or columns of the matrix A are linearly independent, the null space of A contains only the 0 vector.

A set of linearly independent vectors that spans the space forms a basis of the vector space.

Given a basis, the representation of every vector is unique. The basis itself is not unique (examples).

Linear Equations

In general, for an m × n matrix A, consider the vector space spanned by the columns of A. Four basic subspaces:

• Column space of A: the dimension of C(A) is the number of linearly independent columns, r = rank(A).
• Row space of A: the dimension of R(A) is the number of linearly independent rows, r = rank(A^T).
• Null space of A: the dimension of N(A) is n − r.
• Left null space of A: the dimension of N(A^T) is m − r.

Maximal rank: min(m, n), the smaller of the two dimensions.

Page 13

Structure induced by a linear map A

[Figure: the domain X is decomposed into the row space Ra(A^T) and the null space Nu(A); the codomain X′ is decomposed into the column space Ra(A) and the left null space Nu(A^T). A maps the row space onto the column space and the null space to 0.]

Linear Equations

In general, for Ax = b with A an n × m matrix (n equations, m unknowns), suppose that A has full rank. Then:

• If n < m, the number of equations is less than the number of unknowns; the set of solutions is an (m − n)-dimensional vector subspace of R^m.
• If n = m, there is a unique solution.
• If n > m, the number of equations is more than the number of unknowns, and in general there is no solution.

Page 14

Linear Equations – Square Matrices

1. A is square and invertible: the columns are linearly independent, rank = n, and the system Ax = b has exactly one solution.
2. A is square and non-invertible: the columns are linearly dependent, rank < n, and the matrix is not invertible.

Linear Equations – Non-square Matrices

Long-thin matrix: an over-constrained system, with more equations than unknowns.

The solution exists only when b is aligned with the column vector, e.g. [2, 3, 4]^T. If not, we have to seek some approximation. Least squares approximation: minimize the squared error.

Least squares solution: find the value of x for which the error $\|Ax - b\|^2$ is minimized (take the derivative, set it to zero, and solve for x). In MATLAB, x = A\b computes such a solution.
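A minimal MATLAB sketch of the over-constrained case (the right-hand side is an arbitrary illustration value, deliberately not aligned with the column):

A = [2; 3; 4];           % one unknown, three equations
b = [5; 7; 9];           % not aligned with [2 3 4]', so no exact solution
x = A \ b                % least squares solution
x_ne = (A'*A) \ (A'*b)   % same value, from the normal equations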

Page 15

Linear equations – non-square matrices

• If A has linearly independent columns, A^T A is square, symmetric, and invertible; the least squares solution of Ax = b is $x = (A^T A)^{-1} A^T b$. Similarly when A is a short-wide matrix (under-constrained system).

$A^{+} = (A^T A)^{-1} A^T$ is the so-called pseudoinverse of matrix A. In MATLAB: pinv(A).

Eigenvalues and Eigenvectors

Solve the equation: $Ax = \lambda x$, i.e. $(A - \lambda I)x = 0$  (1)

x (an eigenvector) is in the null space of $A - \lambda I$; λ (an eigenvalue) is chosen such that $A - \lambda I$ has a non-trivial null space.

Computation of eigenvalues and eigenvectors (for dimensions 2, 3):
1. Compute the determinant of $A - \lambda I$.
2. Find the roots (eigenvalues) of the polynomial such that the determinant = 0.
3. For each eigenvalue, solve equation (1).

For larger matrices there are alternative ways of computation.

In MATLAB: [vec, val] = eig(A)
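For example (a minimal MATLAB sketch; the 2×2 symmetric matrix is an arbitrary illustration):

A = [2 1; 1 2];
[vec, val] = eig(A)                 % columns of vec are eigenvectors,
                                    % diag(val) holds eigenvalues 1 and 3
A*vec(:,1) - val(1,1)*vec(:,1)      % zero, up to round-off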

Page 16


Square Matrices – Eigenvalues and Eigenvectors

For the previous example we get special solutions to the ODE, of the form $x(t) = e^{\lambda_i t}v_i$.

Their linear combination is also a solution (due to the linearity of the equation).

In the context of differential equations, eigenvectors have a special meaning: any solution can be expressed as a linear combination of the individual solutions, which correspond to modes.

Page 17

Eigenvalues and Eigenvectors

• For square matrices.
• Motivated by the solution to differential equations: for scalar ODEs $\dot{x} = ax$, the solution is $x(t) = e^{at}x(0)$.

We therefore look for solutions of the exponential type, $x(t) = e^{\lambda t}v$.

Substituting back into the equation $\dot{x} = Ax$ gives $\lambda e^{\lambda t}v = A e^{\lambda t}v$, i.e. $Av = \lambda v$.

Eigenvalues and Eigenvectors – Diagonalization

• Given a square matrix A and its eigenvalues and eigenvectors, the matrix can be diagonalized: $A = S\Lambda S^{-1}$, where S is the matrix of eigenvectors and Λ is the diagonal matrix of eigenvalues.
• If some of the eigenvalues are the same, the eigenvectors are not independent.

Page 18

Eigenvalues and Eigenvectors – Diagonalization

• Given a square matrix A and its eigenvalues and eigenvectors, the matrix can be diagonalized as $A = S\Lambda S^{-1}$, with S the matrix of eigenvectors and Λ the diagonal matrix of eigenvalues.
• This diagonalization is useful for computing the inverse.
• General rule for the inverse: $(AB)^{-1} = B^{-1}A^{-1}$
• In this case: $A^{-1} = (S\Lambda S^{-1})^{-1} = S\Lambda^{-1}S^{-1}$
• The inverse of the diagonal matrix Λ is obtained by taking $1/x_i$ for each diagonal element $x_i$ of Λ (works for non-zero eigenvalues).

Diagonalization

• If there are no zero eigenvalues, the matrix is invertible.
• If there are no repeated eigenvalues, the matrix is diagonalizable.
• If all the eigenvalues are different, then the eigenvectors are linearly independent.
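A minimal MATLAB check of the diagonalization and of the inverse formula above (an arbitrary illustrative matrix with distinct eigenvalues):

A = [2 1; 1 2];                        % distinct eigenvalues 1 and 3
[S, D] = eig(A);                       % A = S*D*inv(S)
norm(A - S*D*inv(S))                   % ~0
Ainv = S * diag(1./diag(D)) * inv(S);  % A^-1 = S * D^-1 * S^-1
norm(Ainv - inv(A))                    % ~0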

For Symmetric Matrices

If A is symmetric, $A = V\Lambda V^{T}$, where V is an orthonormal matrix of eigenvectors and Λ is the diagonal matrix of eigenvalues – e.g. for a covariance matrix, or some matrix $B = A^T A$.

Page 19

Singular Value Decomposition

Previously: eigenvectors and eigenvalues, for square matrices.

Singular value decomposition: a factorization of a real or complex m × n matrix into the form

$A = U S V^{T}$

where U is an m × m matrix with the eigenvectors of AA*, V is an n × n matrix with the eigenvectors of A*A, and S is an m × n rectangular diagonal matrix of singular values. Here A* is the transpose for real-valued matrices, or the conjugate transpose for matrices with complex entries.

Relationship to the pseudoinverse: $A^{+} = V S^{+} U^{T}$, where $S^{+}$ is obtained by taking the reciprocals of the non-zero elements of the diagonal matrix S.

In MATLAB:

[m,n] = size(A);
[U,S,V] = svd(A);
r = rank(S);
SR = S(1:r,1:r);
SRc = [SR^-1 zeros(r,m-r); zeros(n-r,r) zeros(n-r,m-r)];
A_pseu = V*SRc*U.';
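For comparison, MATLAB's built-in pinv computes the same pseudoinverse via the SVD (a minimal sketch with an arbitrary full-rank 3×2 matrix):

A = [2 5; 3 1; 0 4];        % arbitrary full-column-rank example
A_pseu = pinv(A);           % built-in pseudoinverse (computed via SVD)
norm(A_pseu*A - eye(2))     % ~0: A+ acts as a left inverse here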

Page 20

Homogeneous Systems of Equations

Ax = 0

• When the matrix A is square and non-singular, there is only the unique trivial solution x = 0.
• If m ≥ n, there is a non-trivial solution when rank(A) < n.
• We need to impose some constraint to avoid the trivial solution, for example $\|x\|^2 = 1$.
• Find x such that $\|Ax\|^2$ is minimized, subject to $\|x\|^2 = 1$.

Solution: the eigenvector of $A^T A$ associated with the smallest eigenvalue.
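A minimal MATLAB sketch (the matrix is an arbitrary, nearly rank-deficient illustration); the constrained minimizer is the eigenvector of A^T A with the smallest eigenvalue:

A = [1 2; 2 4.1; 3 6];     % nearly rank-1, so ||A*x|| can be made small
[vec, val] = eig(A'*A);
[~, i] = min(diag(val));
x = vec(:, i);             % unit-norm minimizer of ||A*x||^2
norm(A*x)                  % small residual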

Linear regression – Least squares line fitting

• Data: (x1, y1), …, (xn, yn)
• Line equation: yi = m xi + b
• Find (m, b) to minimize

$E = \sum_{i=1}^{n}(y_i - m x_i - b)^2$

[Figure: data points (xi, yi) and the fitted line y = mx + b]

In matrix form, with

$Y = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}, \quad X = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix}, \quad B = \begin{bmatrix} m \\ b \end{bmatrix},$

$E = \|Y - XB\|^2 = (Y - XB)^T(Y - XB) = Y^T Y - 2(XB)^T Y + (XB)^T(XB)$

$\frac{dE}{dB} = 2X^T X B - 2X^T Y = 0$

Normal equations: $X^T X B = X^T Y$, the least squares solution to XB = Y.
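A minimal MATLAB sketch of the normal equations (the data points are arbitrary illustration values):

x = [0; 1; 2; 3];
y = [1.1; 2.9; 5.2; 6.8];   % roughly y = 2x + 1
X = [x ones(size(x))];
B = (X'*X) \ (X'*y)         % normal equations: B = [m; b]
% equivalently: B = X \ y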

Page 21

Problem with "vertical" least squares

• Not rotation-invariant
• Fails completely for vertical lines

Total least squares

• Distance between point (xi, yi) and line ax + by = d (with a² + b² = 1): |a xi + b yi − d|

$E = \sum_{i=1}^{n}(a x_i + b y_i - d)^2$

[Figure: point (xi, yi), line ax + by = d, unit normal N = (a, b)]

Page 22

Total least squares

• Distance between point (xi, yi) and line ax + by = d (with a² + b² = 1): |a xi + b yi − d|
• Find (a, b, d) to minimize the sum of squared perpendicular distances

$E = \sum_{i=1}^{n}(a x_i + b y_i - d)^2$

[Figure: point (xi, yi), line ax + by = d, unit normal N = (a, b)]

Setting the derivative with respect to d to zero:

$\frac{\partial E}{\partial d} = \sum_{i=1}^{n} -2(a x_i + b y_i - d) = 0 \quad\Rightarrow\quad d = \frac{a}{n}\sum_{i=1}^{n} x_i + \frac{b}{n}\sum_{i=1}^{n} y_i = a\bar{x} + b\bar{y}$

Substituting d back:

$E = \sum_{i=1}^{n}\bigl(a(x_i - \bar{x}) + b(y_i - \bar{y})\bigr)^2 = \left\| \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} \right\|^2 = (UN)^T(UN)$

$\frac{dE}{dN} = 2(U^T U)N = 0$

Solution to (U^T U)N = 0, subject to ||N||² = 1: the eigenvector of U^T U associated with the smallest eigenvalue (the least squares solution to the homogeneous linear system UN = 0).
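A minimal MATLAB sketch of the total least squares fit (the same arbitrary illustration data as the least squares sketch earlier):

x = [0; 1; 2; 3];
y = [1.1; 2.9; 5.2; 6.8];
U = [x - mean(x), y - mean(y)];
[vec, val] = eig(U'*U);
[~, i] = min(diag(val));
N = vec(:, i);                     % unit normal N = [a; b]
d = N(1)*mean(x) + N(2)*mean(y);   % d = a*xbar + b*ybar
% the fitted line is a*x + b*y = d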

Page 23

Total least squares

$U = \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix} \qquad U^T U = \begin{bmatrix} \sum_{i=1}^{n}(x_i - \bar{x})^2 & \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) \\ \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) & \sum_{i=1}^{n}(y_i - \bar{y})^2 \end{bmatrix}$

$U^T U$ is the second moment matrix of the points $(x_i - \bar{x},\, y_i - \bar{y})$.

[Figure: centered data points (xi − x̄, yi − ȳ) around the centroid (x̄, ȳ), with the fitted line and unit normal N = (a, b)]