MAC 2103 Module 11
Inner Product Spaces II
Rev.F09
Learning Objectives
Upon completing this module, you should be able to:
1. Construct an orthonormal set of vectors from an orthogonal set of vectors.
2. Find the coordinate vector with respect to a given orthonormal basis.
3. Construct an orthogonal basis from a nonstandard basis in ℝⁿ using the Gram-Schmidt process.
4. Find the least squares solution to a linear system Ax = b.
5. Find the orthogonal projection of b on col(A).
6. Obtain the best approximation.
http://faculty.valenciacc.edu/ashaw/ Click link to download other modules.
General Vector Spaces II
The major topics in this module:
Orthogonal Bases, Gram-Schmidt Process, Least Squares and Best Approximation
How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors?
We have learned from the previous module that two vectors u and v in an inner product space V are orthogonal to each other iff <u,v> = 0.
To obtain an orthonormal set, we will normalize each of the vectors in the orthogonal set.
How do we normalize the vectors? Divide each vector by its norm; this makes each of them a unit vector.
How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors? (Cont.)
Example 1: Find the orthonormal set of vectors from the following set of vectors: Let S ={v1, v2} where v1 = (5,0) and v2 = (0,-3).
Step 1: Verify that the set of vectors are mutually orthogonal with respect to the Euclidean inner product on ℝ².
Step 2: Find the norm for both vectors.
⟨v1, v2⟩ = v1 · v2 = (5)(0) + (0)(−3) = 0, so S is an orthogonal set.

‖v1‖ = √(5² + 0²) = 5, and ‖v2‖ = √(0² + (−3)²) = 3.
How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors? (Cont.)
Step 3: Normalize the vectors in the orthogonal set:

q1 = v1/‖v1‖ = (5/5, 0/5) = (1, 0),
q2 = v2/‖v2‖ = (0/3, −3/3) = (0, −1).

Step 4: Verify that the set {q1, q2} is orthonormal by showing that ⟨q1, q2⟩ = 0 and ‖q1‖ = ‖q2‖ = 1:

⟨q1, q2⟩ = q1 · q2 = (1)(0) + (0)(−1) = 0,
‖q1‖ = ⟨q1, q1⟩^(1/2) = (q1 · q1)^(1/2) = √(1² + 0²) = 1, and
‖q2‖ = ⟨q2, q2⟩^(1/2) = (q2 · q2)^(1/2) = √(0² + (−1)²) = 1.
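The computation in Example 1 can be checked with a short script. This is a minimal sketch in plain Python; the helper names `dot` and `norm` are ours, not from the slides.

```python
import math

def dot(u, v):
    # Euclidean inner product <u, v>
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

v1, v2 = (5, 0), (0, -3)

# Step 1: the set is orthogonal.
assert dot(v1, v2) == 0

# Step 3: normalize each vector.
q1 = tuple(x / norm(v1) for x in v1)
q2 = tuple(x / norm(v2) for x in v2)
print(q1, q2)  # (1.0, 0.0) (0.0, -1.0)

# Step 4: the set {q1, q2} is orthonormal.
assert dot(q1, q2) == 0 and norm(q1) == norm(q2) == 1
```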
Orthonormal Set, Orthonormal Basis, and Orthogonal Basis
Orthonormal Set: An orthogonal set in which each vector is a unit vector.
Orthonormal basis: A basis consisting of orthonormal vectors in an inner product space. Example: The standard basis for ℝⁿ.
Orthogonal basis: A basis consisting of orthogonal vectors in an inner product space.
Note that if S is an orthogonal set of nonzero vectors, then S is a linearly independent set.
Orthonormal Set, Orthonormal Basis, and Orthogonal Basis (Cont.)
If S = {v1, v2, …, vn} is an orthogonal basis of W, then for any w ∈ W,

w = ∑ᵢ₌₁ⁿ (⟨w, vᵢ⟩/⟨vᵢ, vᵢ⟩) vᵢ
  = (⟨w, v1⟩/⟨v1, v1⟩) v1 + (⟨w, v2⟩/⟨v2, v2⟩) v2 + … + (⟨w, vn⟩/⟨vn, vn⟩) vn,

where the scalars ⟨w, v1⟩/⟨v1, v1⟩, ⟨w, v2⟩/⟨v2, v2⟩, …, ⟨w, vn⟩/⟨vn, vn⟩ are called the Fourier coefficients. So the coordinate vector of w is

(w)S = ( ⟨w, v1⟩/⟨v1, v1⟩, ⟨w, v2⟩/⟨v2, v2⟩, …, ⟨w, vn⟩/⟨vn, vn⟩ ).
Orthonormal Set, Orthonormal Basis, and Orthogonal Basis (Cont.)
If S = {q1, q2, …, qn} is an orthonormal basis of W, then for any w ∈ W,

w = ∑ᵢ₌₁ⁿ (⟨w, qᵢ⟩/⟨qᵢ, qᵢ⟩) qᵢ = ∑ᵢ₌₁ⁿ ⟨w, qᵢ⟩ qᵢ
  = ⟨w, q1⟩ q1 + ⟨w, q2⟩ q2 + … + ⟨w, qn⟩ qn,

since ⟨qᵢ, qᵢ⟩ = ‖qᵢ‖² = 1 for i = 1, 2, …, n. The scalars ⟨w, q1⟩, ⟨w, q2⟩, …, ⟨w, qn⟩ are called the Fourier coefficients. So the coordinate vector of w is

(w)S = (⟨w, q1⟩, ⟨w, q2⟩, …, ⟨w, qn⟩).
How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis?
Example 2: Compute the coefficients and determine the coordinate vector of u = (10, 3) relative to the orthogonal basis in Example 1.
From Example 1, we have v1 = (5, 0), v2 = (0, −3), ‖v1‖ = 5, and ‖v2‖ = 3.

In this case, the coefficients are:

⟨u, v1⟩/⟨v1, v1⟩ = (u · v1)/‖v1‖² = ((10)(5) + (3)(0))/5² = 50/25 = 2,
⟨u, v2⟩/⟨v2, v2⟩ = (u · v2)/‖v2‖² = ((10)(0) + (3)(−3))/3² = −9/9 = −1.
How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis? (Cont.)
So the coordinate vector of u is

(u)S = ( ⟨u, v1⟩/⟨v1, v1⟩, ⟨u, v2⟩/⟨v2, v2⟩ ) = (2, −1).

We can see that a nice advantage of working with an orthogonal basis is that the coefficients in the basis representation of a vector are immediately known; they are called the Fourier coefficients.
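The Fourier-coefficient computation in Example 2 can be sketched in plain Python; `fourier_coefficient` and `dot` are hypothetical helper names, not from the slides.

```python
def dot(u, v):
    # Euclidean inner product <u, v>
    return sum(a * b for a, b in zip(u, v))

def fourier_coefficient(w, v):
    # Coefficient of w along a nonzero basis vector v: <w, v> / <v, v>
    return dot(w, v) / dot(v, v)

u = (10, 3)
basis = [(5, 0), (0, -3)]   # the orthogonal basis from Example 1
coords = tuple(fourier_coefficient(u, v) for v in basis)
print(coords)  # (2.0, -1.0)
```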
How to Find the Coordinate Vector with Respect to a Given Orthonormal Basis?
Example 3: Find the coordinates of w = (2, 3) relative to the orthonormal basis B = {v1, v2} for ℝ², where v1 = (−1/√2, −1/√2) and v2 = (1/√2, −1/√2).

Since B is orthonormal, we have

⟨w, v1⟩ = w · v1 = (2, 3) · (−1/√2, −1/√2) = (2)(−1/√2) + (3)(−1/√2) = −5/√2,
⟨w, v2⟩ = w · v2 = (2, 3) · (1/√2, −1/√2) = (2)(1/√2) + (3)(−1/√2) = −1/√2,

and (w)B = (⟨w, v1⟩, ⟨w, v2⟩) = (−5/√2, −1/√2).
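Example 3 can be checked the same way; for an orthonormal basis, the coordinates reduce to plain inner products. A minimal sketch (helper name `dot` is ours):

```python
import math

def dot(u, v):
    # Euclidean inner product <u, v>
    return sum(a * b for a, b in zip(u, v))

s = 1 / math.sqrt(2)
w = (2, 3)
v1, v2 = (-s, -s), (s, -s)   # the orthonormal basis B from Example 3

# For an orthonormal basis, the coordinates are just <w, vi>.
coords = (dot(w, v1), dot(w, v2))
print(coords)  # approximately (-5/sqrt(2), -1/sqrt(2))
```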
The Gram-Schmidt Process
Let S = {u1, u2, …, um} with nonzero uᵢ ∈ ℝⁿ for i = 1, 2, …, m. S does not have to be a linearly independent set. It might be that A = [u1 u2 … um] is an n × m matrix and the source of S.
The Gram-Schmidt Algorithm: S = {u1, u2, …, um}

1. Let v1 = u1.
2. For k = 2, 3, …, m, let

   vk = uk − ∑ᵢ₌₁ᵏ⁻¹ (⟨uk, vᵢ⟩/⟨vᵢ, vᵢ⟩) vᵢ.

   If vk = 0, we discard it, since it is linearly dependent on the earlier vectors.
The Gram-Schmidt Process (Cont.)
3. We then have r ≤ m orthogonal, linearly independent vectors in B = {v1, v2, …, vr}. Then span(S) = span(B) = W, an r-dimensional subspace of ℝⁿ, and B is an orthogonal basis for W. W = col(A) if A = [u1 u2 … um].

An orthonormal basis for W is {q1, q2, …, qr}, where qᵢ = vᵢ/‖vᵢ‖ for i = 1, 2, …, r.
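The steps above can be sketched in plain Python (classical Gram-Schmidt, following the slide's formula vk = uk − Σ (⟨uk, vᵢ⟩/⟨vᵢ, vᵢ⟩) vᵢ). The function names and the tolerance used to detect zero vectors are our choices; the usage example is the data from Example 4 below.

```python
import math

def dot(u, v):
    # Euclidean inner product <u, v>
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors, tol=1e-12):
    # Classical Gram-Schmidt: v_k = u_k - sum_i (<u_k, v_i>/<v_i, v_i>) v_i.
    # Vectors whose residual is (numerically) zero are discarded.
    basis = []
    for u in vectors:
        v = list(u)
        for b in basis:
            c = dot(u, b) / dot(b, b)
            v = [vj - c * bj for vj, bj in zip(v, b)]
        if math.sqrt(dot(v, v)) > tol:
            basis.append(v)
    return basis

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

# Data from Example 4: a nonstandard basis of R^3.
V = gram_schmidt([(-1, 1, 0), (1, 2, 1), (3, 1, 1)])
# V is [[-1, 1, 0], [3/2, 3/2, 1], [1/11, 1/11, -3/11]] up to rounding
Q = [normalize(v) for v in V]   # orthonormal basis, as in Step 2 below
```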
How to Construct an Orthonormal Basis from a Nonstandard Basis in ℝⁿ using the Gram-Schmidt Process?
Let ℝⁿ be the usual Euclidean inner product space of dimension n. Let {u1, u2, …, un} be a nonstandard basis in ℝⁿ.
Step 1: Use the Gram-Schmidt method to construct an orthogonal basis {v1, v2, …, vn} from the basis vectors {u1, u2, …, un}.
Step 2: Normalize the orthogonal basis vectors to obtain the orthonormal basis {q1, q2,…, qn}.
How to Construct an Orthonormal Basis from a Nonstandard Basis in ℝⁿ using the Gram-Schmidt Process? (Cont.)
Example 4: The set {u1, u2, u3} = {(−1, 1, 0), (1, 2, 1), (3, 1, 1)} is a nonstandard basis in ℝ³.

Step 1:

v1 = u1 = (−1, 1, 0), ‖v1‖² = 2, W1 = span({v1}), dim(W1) = 1. Projections onto W1 have one component.

v2 = u2 − proj_W1(u2) = (1, 2, 1) − (⟨u2, v1⟩/‖v1‖²) v1
   = (1, 2, 1) − (1/2)(−1, 1, 0) = (3/2, 3/2, 1),

‖v2‖² = 11/2, W2 = span({v1, v2}), dim(W2) = 2. Projections onto W2 have two components.
How to Construct an Orthonormal Basis from a Nonstandard Basis in ℝⁿ using the Gram-Schmidt Process? (Cont.)
v3 = u3 − proj_W2(u3) = (3, 1, 1) − (⟨u3, v1⟩/‖v1‖²) v1 − (⟨u3, v2⟩/‖v2‖²) v2
   = (3, 1, 1) − ((−2)/2)(−1, 1, 0) − (7/(11/2))(3/2, 3/2, 1)
   = (3, 1, 1) + (−1, 1, 0) − (14/11)(3/2, 3/2, 1)
   = (3, 1, 1) + (−1, 1, 0) − (21/11, 21/11, 14/11) = (1/11, 1/11, −3/11).

Thus, v1 = (−1, 1, 0), v2 = (3/2, 3/2, 1), and v3 = (1/11, 1/11, −3/11) form an orthogonal basis for ℝ³.
How to Construct an Orthonormal Basis from a Nonstandard Basis in ℝⁿ using the Gram-Schmidt Process? (Cont.)
Step 2: Normalize the orthogonal basis vectors to obtain the orthonormal basis B = {q1, q2, q3}.
The norms of these vectors are:
‖v1‖ = √2, ‖v2‖ = √(11/2), and ‖v3‖ = √(1/11).

So an orthonormal basis is B = {q1, q2, q3}, where

q1 = v1/‖v1‖ = (1/√2)(−1, 1, 0) = (−1/√2, 1/√2, 0),
q2 = v2/‖v2‖ = √(2/11) (3/2, 3/2, 1) = (3√22/22, 3√22/22, √22/11), and
q3 = v3/‖v3‖ = √11 (1/11, 1/11, −3/11) = (√11/11, √11/11, −3√11/11).
How to Find the Least Squares Solution?
The linear system Ax = b always has the associated normal system AᵀAx = Aᵀb, which is consistent and has one or more solutions. Any solution of the normal system is a least squares solution of Ax = b. Moreover, the orthogonal projection of b on W = col(A) is Ax = proj_W(b), where x is a least squares solution.

Example 5: Find the least squares solution of the linear system Ax = b given by

A = ⎡ 1   0⎤       b = ⎡−1⎤
    ⎢−2  −1⎥,          ⎢−2⎥
    ⎣ 1   4⎦           ⎣ 7⎦

Observe that A has two linearly independent column vectors, so AᵀA is invertible, and there is a unique solution to AᵀAx = Aᵀb, which will be our least squares solution to Ax = b.
How to Find the Least Squares Solution? (Cont.)
We have

AᵀA = ⎡1 −2 1⎤ ⎡ 1   0⎤ = ⎡6   6⎤
      ⎣0 −1 4⎦ ⎢−2  −1⎥   ⎣6  17⎦
               ⎣ 1   4⎦

and

Aᵀb = ⎡1 −2 1⎤ ⎡−1⎤ = ⎡10⎤
      ⎣0 −1 4⎦ ⎢−2⎥   ⎣30⎦
               ⎣ 7⎦

So the normal system AᵀAx = Aᵀb in this case is

⎡6   6⎤ ⎡x1⎤ = ⎡10⎤
⎣6  17⎦ ⎣x2⎦   ⎣30⎦
How to Find the Least Squares Solution? (Cont.)
By Gaussian elimination, the row-echelon form of

[AᵀA | Aᵀb] = ⎡6   6 | 10⎤
              ⎣6  17 | 30⎦

is

⎡1  0 | −5/33⎤
⎣0  1 | 20/11⎦

So the solution to the normal system AᵀAx = Aᵀb in this case is x1 = −5/33 and x2 = 20/11, and

x = (x1, x2) = (−5/33, 20/11)

is our unique least squares solution to Ax = b.
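The normal-equations computation of Example 5 can be sketched in plain Python; since AᵀA is 2 × 2, Cramer's rule suffices for the solve (the variable names are ours).

```python
# Example 5: least squares via the normal system A^T A x = A^T b.
A = [[1, 0], [-2, -1], [1, 4]]
b = [-1, -2, 7]

AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]
print(AtA, Atb)  # [[6, 6], [6, 17]] [10, 30]

# Solve the 2x2 normal system by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x1 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
x2 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
# x1 = -5/33, x2 = 20/11
```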
How to Find the Orthogonal Projection and Obtain the Best Approximation?
Ax = proj_W(b) is the orthogonal projection of b on W = col(A). The projection proj_W(b) is the best approximation to b in W, since the distance between b and proj_W(b) is the minimum over all vectors in W:

‖b − proj_W(b)‖ ≤ ‖b − w‖ for all w ∈ W.

Example 6: From the previous example, the orthogonal projection of b on W = col(A) is

proj_W(b) = Ax = ⎡ 1   0⎤ ⎡−5/33⎤ = ⎡ −5/33⎤
                 ⎢−2  −1⎥ ⎣20/11⎦   ⎢−50/33⎥
                 ⎣ 1   4⎦           ⎣235/33⎦

Thus, we have obtained the best approximation to b in W.
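The projection in Example 6 can be verified exactly with Python's `fractions` module; the residual b − proj_W(b) should be orthogonal to every column of A, which is what makes proj_W(b) the best approximation to b in W.

```python
from fractions import Fraction as F

A = [[1, 0], [-2, -1], [1, 4]]
b = [F(-1), F(-2), F(7)]
x = [F(-5, 33), F(20, 11)]   # the least squares solution from Example 5

# proj_W(b) = A x, the orthogonal projection of b on W = col(A)
proj = [A[k][0] * x[0] + A[k][1] * x[1] for k in range(3)]
print(proj)  # [Fraction(-5, 33), Fraction(-50, 33), Fraction(235, 33)]

# The residual b - proj_W(b) is orthogonal to each column of A.
residual = [b[k] - proj[k] for k in range(3)]
for j in range(2):
    assert sum(residual[k] * A[k][j] for k in range(3)) == 0
```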
What have we learned?
We have learned to:
• Construct an orthonormal set of vectors from an orthogonal set of vectors.
• Find the coordinate vector with respect to a given orthonormal basis.
• Construct an orthogonal basis from a nonstandard basis in ℝⁿ using the Gram-Schmidt process.
• Find the least squares solution to a linear system Ax = b.
• Find the orthogonal projection of b on col(A).
• Obtain the best approximation.
Credit
Some of these slides have been adapted/modified in part/whole from the following textbook:
• Anton, Howard: Elementary Linear Algebra with Applications, 9th Edition