
Lecture 11: Fundamental Theorems of Linear Algebra

Orthogonality and Projection

Shang-Hua Teng

The Whole Picture

• Rank(A) = m = n: Ax = b has a unique solution

• Rank(A) = m < n: Ax = b has an (n - m)-dimensional set of solutions

• Rank(A) = n < m: Ax = b has 0 or 1 solution

• Rank(A) < n and Rank(A) < m: Ax = b has either no solution or an (n - rank(A))-dimensional set of solutions
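A small numerical sketch (not part of the slides) of how these cases can be told apart: the helper classify below is a hypothetical name, and it compares rank(A) with rank([A | b]) and with n using numpy.

import numpy as np

def classify(A, b):
    # Compare rank(A) with rank([A | b]) and with n to classify the solution set of Ax = b.
    m, n = A.shape
    r = np.linalg.matrix_rank(A)
    r_aug = np.linalg.matrix_rank(np.column_stack([A, b]))
    if r_aug > r:
        return "no solution (b is not in C(A))"
    if r == n:
        return "unique solution"
    return "solution set of dimension n - rank(A) = %d" % (n - r)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])               # rank 1, m = 2 < n = 3
print(classify(A, np.array([1.0, 2.0])))      # 2-dimensional solution set
print(classify(A, np.array([1.0, 0.0])))      # no solution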

Basis and Dimension of a Vector Space

• A basis for a vector space is a sequence of vectors such that
– The vectors are linearly independent
– The vectors span the space: every vector in the space can be expressed as a linear combination of these vectors

Basis for 2D and n-D

• (1,0), (0,1)

• (1, 1), (-1, -2)

• The vectors v1, v2, …, vn are a basis for Rn if and only if they are the columns of an n by n invertible matrix
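A quick check of the last point (a sketch using numpy, not from the slide): the two vectors of the second bullet are stacked as columns and tested for invertibility via their rank.

import numpy as np

# Columns are the vectors (1, 1) and (-1, -2) from the slide.
V = np.array([[1.0, -1.0],
              [1.0, -2.0]])
print(np.linalg.matrix_rank(V) == 2)             # True: the columns form a basis of R^2
print(np.linalg.solve(V, np.array([3.0, 4.0])))  # unique coordinates of (3, 4) in that basis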

Column and Row Subspace

• C(A): the space spanned by the columns of A
– A subspace of Rm
– The pivot columns of A are a basis for its column space

• Row space: the space spanned by the rows of A
– A subspace of Rn
– The row space of A is the same as the column space of AT, C(AT)
– The pivot rows of A are a basis for its row space
– The pivot rows of its echelon matrix R are also a basis for its row space
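One way to locate the pivot columns and pivot rows in practice is row reduction; a minimal sketch with sympy's rref (the matrix A here is only an illustration, not from the lecture):

import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7],
               [1, 2, 4]])
R, pivot_cols = A.rref()   # reduced row echelon form and the pivot column indices
print(pivot_cols)          # (0, 2): columns 1 and 3 of A are a basis for C(A)
print(R)                   # the nonzero rows of R are a basis for the row space C(AT)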

Important Property I: Uniqueness of Combination

• If the vectors v1, v2, …, vn are a basis for a vector space V, then for every vector v in V there is a unique way to write v as a combination of v1, v2, …, vn. Suppose

v = a1 v1 + a2 v2 + … + an vn

v = b1 v1 + b2 v2 + … + bn vn

• So: 0 = (a1 - b1) v1 + (a2 - b2) v2 + … + (an - bn) vn

• Since v1, …, vn are linearly independent, every coefficient ai - bi is 0, so ai = bi: the combination is unique

Important Property II: Dimension and Size of Basis

• Suppose a vector space V has two bases
– v1, v2, …, vm, with V = [v1, v2, …, vm]
– w1, w2, …, wn, with W = [w1, w2, …, wn]

• Then m = n
– Proof: assume n > m and write W = VA
– A is m by n with n > m, so Ax = 0 has a non-zero solution x
– Then Wx = VAx = 0, contradicting the linear independence of w1, …, wn

• The dimension of a vector space is the number of vectors in every basis
– So the dimension of a vector space is well defined

Dimensions of the Four Subspaces
Fundamental Theorem of Linear Algebra, Part I

• Row space: C(AT) – dimension = rank(A)

• Column space: C(A) – dimension = rank(A)

• Nullspace: N(A) – dimension = n - rank(A)

• Left nullspace: N(AT) – dimension = m - rank(A)
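These four dimensions can be verified numerically; a sketch assuming scipy is available (null_space returns an orthonormal basis of the nullspace as columns):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])    # m = 3, n = 3, rank 2
m, n = A.shape
r = np.linalg.matrix_rank(A)
print(r)                           # 2 = dim C(A) = dim C(AT)
print(null_space(A).shape[1])      # 1 = n - r = dim N(A)
print(null_space(A.T).shape[1])    # 1 = m - r = dim N(AT)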

Orthogonality and Orthogonal Subspaces

• Two vectors v and w are orthogonal if vTw = wTv = 0

• Two vector subspaces V and W are orthogonal if vTw = 0 for all v in V and w in W

Example: Orthogonal Subspace in 5 Dimensions

• V is the column space of the matrix with columns (1, 1, 0, 0, 0) and (1, 0, 0, 0, 0): its vectors are nonzero only in the first two coordinates

• W is the column space of the matrix with columns (0, 0, 1, 1, 0), (0, 0, 0, 1, 1), and (0, 0, 0, 0, 1): its vectors are nonzero only in the last three coordinates

• Every v in V and w in W satisfy vTw = 0, since their nonzero coordinates never overlap

Together these two subspaces span all of R5: dim(V) + dim(W) = 2 + 3 = 5
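A numerical check of this example (a sketch; the basis vectors below are the ones read off the slide above):

import numpy as np

V = np.array([[1, 1, 0, 0, 0],
              [1, 0, 0, 0, 0]], dtype=float).T   # columns span V
W = np.array([[0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 0, 0, 1]], dtype=float).T   # columns span W
print(V.T @ W)                                   # the 2x3 zero matrix: V and W are orthogonal
print(np.linalg.matrix_rank(np.hstack([V, W])))  # 5: together the subspaces span R^5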

Orthogonal Complement

• Suppose V is a vector subspace of a vector space W

• The orthogonal complement of V is

V⊥ = { w in W : wTv = 0 for all v in V }

• The orthogonal complement is itself a vector subspace

• dim(V) + dim(V⊥) = dim(W)

Dimensions of the Four Subspaces
Fundamental Theorem of Linear Algebra, Part I

• Row space: C(AT) – dimension = rank(A)

• Column space: C(A) – dimension = rank(A)

• Nullspace: N(A) – dimension = n - rank(A)

• Left nullspace: N(AT) – dimension = m - rank(A)

Orthogonality of the Four Subspaces
Fundamental Theorem of Linear Algebra, Part II

• The nullspace is the orthogonal complement of the row space in Rn: N(A) = C(AT)⊥

• The left nullspace is the orthogonal complement of the column space in Rm: N(AT) = C(A)⊥
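A numerical illustration of Part II (a sketch, assuming scipy): the columns of null_space(A) span N(A), and every row of A is orthogonal to them.

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])         # rank 2, so dim N(A) = 3 - 2 = 1
N = null_space(A)                        # orthonormal basis of N(A)
print(np.allclose(A @ N, 0))             # True: the row space is orthogonal to N(A)
print(N.shape[1] == A.shape[1] - np.linalg.matrix_rank(A))   # dimensions match, so N(A) = C(AT)⊥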

Proof

• The nullspace is the orthogonal complement of the row space in Rn

N(A) = C(AT)⊥

• N(A) = { x : Ax = 0 } and C(AT) = { ATy : y in Rm }

• If Ax = 0, then for every y: (ATy)Tx = yTAx = yT0 = 0, so every x in N(A) is orthogonal to every vector ATy in C(AT)

• Since dim N(A) + dim C(AT) = (n - rank(A)) + rank(A) = n, N(A) is the whole orthogonal complement of C(AT) in Rn

The Whole Picture

[Diagram: Rn splits into the row space C(AT), of dimension r, and the nullspace N(A), of dimension n - r; Rm splits into the column space C(A), of dimension r, and the left nullspace N(AT), of dimension m - r. A solution x = xr + xn with xr in C(AT) and xn in N(A); A xn = 0 and A xr = b, so A x = b.]

Uniqueness of The Typical Solution

• Every vector in the column space comes from one and only one vector xr from the row space

• Proof: suppose there are two xr , yr from the row space such that Axr =A yr =b, then

Axr -A yr = A(xr -yr ) = 0

(xr - yr) is in both the row space and the nullspace, which intersect only in the zero vector, hence xr = yr

• This gives the one-to-one matching between the row space and the column space, consistent with their equal dimension r
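A sketch of this decomposition in numpy (the matrix and right-hand side are made up for illustration): for an underdetermined consistent system, lstsq returns the minimum-norm solution, which is exactly the row-space solution xr.

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])
b = np.array([6.0, 2.0])

xr, *_ = np.linalg.lstsq(A, b, rcond=None)   # minimum-norm solution: lies in C(AT)
N = null_space(A)                            # basis of N(A)
x = xr + 2.0 * N[:, 0]                       # any other solution: add a nullspace vector
print(np.allclose(A @ xr, b), np.allclose(A @ x, b))   # both solve Ax = b
print(np.allclose(N.T @ xr, 0))              # True: xr has no nullspace component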

Deep Secret of Linear Algebra: Pseudo-inverse

• Throwing away the two nullspaces, there is an r by r invertible matrix hiding inside A.

• In some sense, from the row space to the column space, A is invertible

• It maps an r-dimensional subspace of Rn (the row space) onto an r-dimensional subspace of Rm (the column space)
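This is what the Moore-Penrose pseudo-inverse computes; a brief numpy sketch with a rank-1 matrix (the numbers are only illustrative):

import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])            # rank 1: both nullspaces are nontrivial
A_pinv = np.linalg.pinv(A)            # Moore-Penrose pseudo-inverse
b = A @ np.array([1.0, 1.0])          # some vector in the column space C(A)
x = A_pinv @ b                        # the corresponding vector in the row space C(AT)
print(np.allclose(A @ x, b))          # True: from row space to column space, A acts invertibly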

Invertible Matrices

• Any n linearly independent vectors in Rn must span Rn, so they form a basis

• So Ax = b is always uniquely solvable

• A is invertible

Projection

• Projection onto an axis: the point (a, b) is projected onto the x axis, which is a vector subspace

Projection onto an Arbitrary Line Passing through 0

• The point (a, b) is projected onto a line through the origin
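The picture corresponds to the standard line-projection formula p = a (aTb) / (aTa), where a is a direction vector of the line; a short sketch (the slide does not spell the formula out, and the numbers below are only illustrative):

import numpy as np

def project_onto_line(a, b):
    # Projection of b onto the line through 0 with direction a: p = a * (a.b) / (a.a)
    return a * (a @ b) / (a @ a)

p = np.array([3.0, 4.0])                              # the point (a, b) = (3, 4) in the picture
print(project_onto_line(np.array([1.0, 0.0]), p))     # onto the x axis: [3. 0.]
print(project_onto_line(np.array([1.0, 1.0]), p))     # onto the line y = x: [3.5 3.5]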

Projection onto a Plane

Projection onto a Subspace

• Input:
1. A vector subspace V of Rm
2. A vector b in Rm

• Desirable Output:
– A vector x in V that is closest to b
– The projection x of b onto V
– A vector x in V such that (b - x) is orthogonal to V

How to Describe a Vector Subspace V in Rm

• If dim(V) = n, then V has n basis vectors
– a1, a2, …, an
– They are linearly independent

• V = C(A) where A = [a1, a2, …, an]

Projection onto a Subspace

• Input:
1. n independent vectors a1, a2, …, an in Rm
2. A vector b in Rm

• Desirable Output:
– A vector x in C([a1, a2, …, an]) that is closest to b
– The projection x of b onto C([a1, a2, …, an])
– A vector x in C([a1, a2, …, an]) such that (b - x) is orthogonal to C([a1, a2, …, an])
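The slides stop at the problem statement; one standard way to produce this output (a sketch, not necessarily the method developed next in the lecture) is to impose the orthogonality condition AT(b - Ax) = 0, i.e. solve the normal equations ATA x = ATb:

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])             # independent columns a1, a2 in R^3
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations: ATA x = ATb
p = A @ x_hat                               # the projection of b onto C(A)
print(p)                                    # [ 5.  2. -1.]
print(np.allclose(A.T @ (b - p), 0))        # True: the error b - p is orthogonal to C(A)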

Think about this Picture

[The same four-subspaces diagram as before: Rn splits into C(AT), dimension r, and N(A), dimension n - r; Rm splits into C(A), dimension r, and N(AT), dimension m - r; x = xr + xn with A xn = 0, A xr = b, and A x = b.]