Lecture 18 Notes



MTH 309 Y LECTURE 18.1 TUE 2014.04.08

Recall:

1) A least square solution of a matrix equation Ax = b is a solution of the equation

   Ax = proj_Col(A) b

2) If Ax = b is a consistent equation then b ∈ Col(A), and so proj_Col(A) b = b. In this case we have:

   least square solutions of Ax = b  =  the usual solutions of Ax = b

   If Ax = b is inconsistent then least square solutions are the best possible substitute for the (nonexistent) solutions of this equation.

3) If {w_1, ..., w_k} is an orthogonal basis of a subspace V ⊆ R^n then for u ∈ R^n we have

   proj_V u = (u·w_1)/(w_1·w_1) w_1 + (u·w_2)/(w_2·w_2) w_2 + ··· + (u·w_k)/(w_k·w_k) w_k

4) If {v_1, ..., v_k} is any basis of a subspace V ⊆ R^n then using the Gram-Schmidt process we can compute an orthogonal basis of V.
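To make the projection formula in 3) concrete, here is a minimal sketch in Python with NumPy (the function name and the sample vectors are illustrative, not from the lecture):

```python
import numpy as np

def proj(u, orthogonal_basis):
    """Projection of u onto V = span(orthogonal_basis), assuming the basis vectors are orthogonal."""
    p = np.zeros_like(u, dtype=float)
    for w in orthogonal_basis:
        p += (u @ w) / (w @ w) * w   # the term (u·w_i)/(w_i·w_i) w_i
    return p

# Example: V spanned by the orthogonal vectors (1, 1, 0) and (1, -1, 0) in R^3
W = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]
u = np.array([2.0, 3.0, 4.0])
print(proj(u, W))   # [2. 3. 0.]
```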


MTH 309 Y LECTURE 18.2 TUE 2014.04.08

How to compute least square solutions of Ax = b (version 1.0):

1. Calculate a basis for Col(A).
2. Use the Gram-Schmidt process to compute an orthogonal basis of Col(A).
3. Use the orthogonal basis to compute proj_Col(A) b.
4. Compute solutions of the equation Ax = proj_Col(A) b.
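A minimal Python/NumPy sketch of version 1.0 (illustrative code, not from the lecture; it assumes the columns of A are linearly independent, so they already form a basis of Col(A)):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthogonal list with the same span."""
    basis = []
    for v in vectors:
        w = v - sum((v @ u) / (u @ u) * u for u in basis)   # subtract projections onto earlier vectors
        basis.append(w)
    return basis

def least_squares_v1(A, b):
    W = gram_schmidt(list(A.T))                   # steps 1-2: orthogonal basis of Col(A)
    p = sum((b @ w) / (w @ w) * w for w in W)     # step 3: proj_Col(A) b
    return np.linalg.lstsq(A, p, rcond=None)[0]   # step 4: solve the (now consistent) system Ax = p

A = np.array([[1.0, 1.0], [0.0, 2.0], [0.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])
print(least_squares_v1(A, b))   # ≈ [0. 1.]
```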

Next goal: Simplify this.


MTH 309 Y LECTURE 18.3 TUE 2014.04.08

Definition. If V is a subspace of R^n then the orthogonal complement of V is the set V^⊥ of all vectors orthogonal to V:

   V^⊥ = {w ∈ R^n | w · v = 0 for all v ∈ V}

[Figure: a subspace V and its orthogonal complement V^⊥.]

Proposition. If V is a subspace of R^n then V^⊥ is also a subspace of R^n.
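For example, in R^3: if V = span{(1, 1, 0)} then V^⊥ = {w ∈ R^3 | w_1 + w_2 = 0} = span{(1, −1, 0), (0, 0, 1)}, which is indeed a subspace (a plane through the origin).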


MTH 309 Y LECTURE 18.4 TUE 2014.04.08

Recall: If A is an m × n matrix then Row(A) is the subspace of R^n spanned by the rows of A.

Proposition. If A is an m × n matrix then

   Row(A)^⊥ = Nul(A)
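One way to see this: the entries of Av are the dot products of the rows of A with v, so Av = 0 exactly when v is orthogonal to every row of A, and hence to every vector in their span:

$$v \in \operatorname{Nul}(A) \iff Av = 0 \iff r_i \cdot v = 0 \text{ for every row } r_i \text{ of } A \iff v \in \operatorname{Row}(A)^{\perp}$$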


MTH 309 Y LECTURE 18.5 TUE 2014.04.08

Corollary. If A is an m × n matrix then

   Col(A)^⊥ = Nul(A^T)
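This follows by applying the previous proposition to A^T, since Col(A) = Row(A^T):

$$\operatorname{Col}(A)^{\perp} = \operatorname{Row}(A^{T})^{\perp} = \operatorname{Nul}(A^{T})$$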


MTH 309 Y LECTURE 18.6 TUE 2014.04.08

Definition. The equation

   (A^T A)x = A^T b

is called the normal equation of Ax = b.

Back to least square solutions:

Theorem. A vector x̂ is a least square solution of the equation Ax = b iff x̂ is an ordinary solution of the equation (A^T A)x = A^T b.
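A sketch of why the theorem holds: Ax̂ equals proj_Col(A) b exactly when the residual b − Ax̂ is orthogonal to Col(A), and by the corollary above Col(A)^⊥ = Nul(A^T), so

$$A\hat{x} = \operatorname{proj}_{\operatorname{Col}(A)} b \iff b - A\hat{x} \in \operatorname{Col}(A)^{\perp} = \operatorname{Nul}(A^{T}) \iff A^{T}(b - A\hat{x}) = 0 \iff (A^{T}A)\hat{x} = A^{T}b$$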


MTH 309 Y LECTURE 18.7 TUE 2014.04.08

Example. Compute least square solutions of the following equation:

$$\begin{bmatrix} 1 & 1 \\ 0 & 2 \\ 0 & 0 \end{bmatrix} \cdot \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$$

How to compute least square solutions of Ax = b (version 2.0):

1. Compute A^T A and A^T b.
2. Solve the normal equation (A^T A)x = A^T b.
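Carrying out these two steps for the example above:

$$A^{T}A = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 2 & 0 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 2 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 5 \end{bmatrix}, \qquad A^{T}b = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 2 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 1 \\ 5 \end{bmatrix}$$

Subtracting the first equation of the normal equation from the second gives 4x_2 = 4, so x_2 = 1 and then x_1 = 0. Thus x̂ = (0, 1) is the least square solution.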


MTH 309 Y LECTURE 18.8 TUE 2014.04.08

Application: least square lines

Definition. If (x_1, y_1), ..., (x_n, y_n) are points on the plane then the least square line for these points is the line given by the equation f(x) = ax + b such that the number

$$\operatorname{dist}\left( \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix},\ \begin{bmatrix} f(x_1) \\ \vdots \\ f(x_n) \end{bmatrix} \right) = \sqrt{(y_1 - f(x_1))^2 + \cdots + (y_n - f(x_n))^2}$$

is as small as possible.

[Figure: data points (x_1, y_1), ..., (x_4, y_4) and the line f(x) = ax + b, with the values f(x_i) marked on the line.]


MTH 309 Y LECTURE 18.9 TUE 2014.04.08

Proposition. The line f(x) = ax + b is the least square line for the points (x_1, y_1), ..., (x_n, y_n) if the vector $\begin{bmatrix} a \\ b \end{bmatrix}$ is a least square solution of the equation

$$\begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix} \cdot \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$$
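The reason: for the candidate line f(x) = ax + b the vector of predicted values is

$$\begin{bmatrix} f(x_1) \\ \vdots \\ f(x_n) \end{bmatrix} = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix}$$

so minimizing the distance in the definition is the same as making this matrix-vector product as close as possible to (y_1, ..., y_n), i.e. finding a least square solution of the equation above.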

Upshot:


MTH 309 Y LECTURE 18.10 TUE 2014.04.08

Example. Find the equation of the least square line for the points (0, 0), (1, 1), (3, 1), (5, 3).
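A quick numerical check of this example, as a minimal sketch in Python with NumPy (the variable names are illustrative):

```python
import numpy as np

# Data points: (0,0), (1,1), (3,1), (5,3)
xs = np.array([0.0, 1.0, 3.0, 5.0])
ys = np.array([0.0, 1.0, 1.0, 3.0])

# Matrix with rows [x_i, 1], as in the proposition above
A = np.column_stack([xs, np.ones_like(xs)])

# Least square solution of A [a, b]^T = ys via the normal equation
a, b = np.linalg.solve(A.T @ A, A.T @ ys)
print(a, b)   # a = 31/59 ≈ 0.525, b = 4/59 ≈ 0.068
```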


MTH 309 Y LECTURE 18.11 TUE 2014.04.08

Least square curves

The above procedure can be used to determine curves other than lines that fit a set of points in the least square sense. Example: least square parabolas.

Definition. If (x_1, y_1), ..., (x_n, y_n) are points on the plane then the least square parabola for these points is the parabola given by the equation

   f(x) = ax^2 + bx + c

such that the number

$$\operatorname{dist}\left( \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix},\ \begin{bmatrix} f(x_1) \\ \vdots \\ f(x_n) \end{bmatrix} \right) = \sqrt{(y_1 - f(x_1))^2 + \cdots + (y_n - f(x_n))^2}$$

is as small as possible.

[Figure: data points (x_i, y_i) and the least square parabola f(x) = ax^2 + bx + c.]


MTH 309 Y LECTURE 18.12 TUE 2014.04.08

Proposition. The parabola f(x) = ax^2 + bx + c is the least square parabola for the points (x_1, y_1), ..., (x_n, y_n) if the vector $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$ is a least square solution of the equation

$$\begin{bmatrix} x_1^2 & x_1 & 1 \\ \vdots & \vdots & \vdots \\ x_n^2 & x_n & 1 \end{bmatrix} \cdot \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$$

Upshot:


MTH 309 Y LECTURE 18.13 TUE 2014.04.08

Example. Find the least square parabola f(x) = ax^2 + bx + c for the following set of points:

   (−2, 2), (0, 0), (1, 1), (2, 3)
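As with the line example, a minimal Python/NumPy sketch of the computation for these points (illustrative, for checking the answer):

```python
import numpy as np

# Data points: (-2,2), (0,0), (1,1), (2,3)
xs = np.array([-2.0, 0.0, 1.0, 2.0])
ys = np.array([2.0, 0.0, 1.0, 3.0])

# Matrix with rows [x_i^2, x_i, 1], as in the proposition for least square parabolas
A = np.column_stack([xs**2, xs, np.ones_like(xs)])

# Least square solution of A [a, b, c]^T = ys via the normal equation
a, b, c = np.linalg.solve(A.T @ A, A.T @ ys)
print(a, b, c)   # a = 27/44 ≈ 0.614, b = 57/220 ≈ 0.259, c = 3/55 ≈ 0.055
```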