
Simple Linear Regression and the Method of Least Squares

Author: Nicholas G Reich, Jeff Goldsmith

This material is part of the statsTeachR project

Made available under the Creative Commons Attribution-ShareAlike 3.0 Unported License: http://creativecommons.org/licenses/by-sa/3.0/deed.en_US


Figure acknowledgements to Hadley Wickham.


Which data show a stronger association?

[Figure: two scatterplots of y versus x (panels 1 and 2), plotted on the same axes for comparing the strength of association.]


Goals for this class

You should be able to...

- interpret regression coefficients.
- derive estimators for SLR coefficients.
- implement an SLR from scratch (i.e., not using lm()).
- explain why some points have more influence than others on the fitted line.


Regression modeling

- Want to use predictors to learn about the outcome distribution, particularly the conditional expected value.
- Formulate the problem parametrically:

  $E(y \mid x) = f(x; \beta) = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots$

- (Note that other useful quantities, like covariance and correlation, tell you about the joint distribution of y and x.)


Brief Detour: Covariance and Correlation

$\mathrm{cov}(x, y) = E[(x - \mu_x)(y - \mu_y)]$

$\mathrm{cor}(x, y) = \frac{\mathrm{cov}(x, y)}{\sqrt{\mathrm{var}(x)\,\mathrm{var}(y)}}$
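As a quick check of these definitions, here is a minimal R sketch (the data are made up purely for illustration) that computes the covariance and correlation by hand and compares them with the built-in cov() and cor(); note that the sample versions divide by n - 1 rather than taking a true expectation.

# Small made-up sample, for illustration only
x <- c(1.2, 2.4, 3.1, 4.8, 5.5)
y <- c(2.0, 2.9, 3.8, 6.1, 6.4)

# Sample analogues of the formulas above (dividing by n - 1, as R's cov/var do)
n <- length(x)
cov_xy <- sum((x - mean(x)) * (y - mean(y))) / (n - 1)
cor_xy <- cov_xy / sqrt(var(x) * var(y))

c(cov_xy, cov(x, y))   # the two covariance values should match
c(cor_xy, cor(x, y))   # the two correlation values should match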


Simple linear regression

- Linear models are a special case of all regression models; simple linear regression is the simplest place to start.
- Only one predictor:

  $E(y \mid x) = f(x; \beta) = \beta_0 + \beta_1 x_1$

- Useful to note that $x_0 = 1$ (implicit definition).
- Somehow, estimate $\beta_0, \beta_1$ using observed data.


Coefficient interpretation

In the model $E(y \mid x) = \beta_0 + \beta_1 x$, the intercept $\beta_0$ is the expected outcome when $x = 0$, and the slope $\beta_1$ is the expected change in the outcome associated with a one-unit increase in $x$.



Step 1: Always look at the data!

- Plot the data using, e.g., the plot() or qplot() functions (a short plotting sketch follows below).
- Do the data look like the assumed model?
- Should you be concerned about outliers?
- Define what you expect to see before fitting any model.
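A minimal plotting sketch, assuming a data frame named dat with columns x and y (the name and columns are hypothetical here):

# Base R scatterplot of the raw data
plot(y ~ x, data = dat, main = "Step 1: look at the data")

# Or, with ggplot2 loaded, the qplot() shortcut
library(ggplot2)
qplot(x, y, data = dat)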


[Figure: four example scatterplots of y versus x (panels 1-4), showing different data patterns to inspect before fitting a model.]


Least squares estimation

- Observe data $(y_i, x_i)$ for subjects $i = 1, \dots, I$. Want to estimate $\beta_0, \beta_1$ in the model

  $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i; \quad \varepsilon_i \overset{iid}{\sim} (0, \sigma^2)$

- Recall the assumptions:
  - A2: $E(\varepsilon \mid x) = E(\varepsilon) = 0$
  - A3: Uncorrelated errors
  - A4: Constant variance
  - A5: A normal distribution is not needed for least squares, but is needed for inference.
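As a sketch of what data generated under this model and these assumptions look like, the following simulates one sample in R; the values of the coefficients and error standard deviation are arbitrary choices for illustration:

set.seed(1)                            # reproducible simulation
n     <- 100
beta0 <- 1                             # illustrative true intercept
beta1 <- 2                             # illustrative true slope
sigma <- 0.5                           # illustrative error standard deviation

x   <- runif(n)                        # predictor values
eps <- rnorm(n, mean = 0, sd = sigma)  # iid errors: mean zero, constant variance
y   <- beta0 + beta1 * x + eps         # outcomes from the assumed model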


Circle of Life

[Diagram: the cycle linking Population, Sample, Sample Statistic, and Population Parameter.]


Least squares estimation

- Recall that for a single sample $y_i$, $i \in 1, \dots, N$, the sum of squared deviations

  $\mathrm{RSS}(\mu_y) = \sum_{i=1}^{N} (y_i - \mu_y)^2$

  is minimized over $\mu_y$ by the sample mean.
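A quick numerical check of this fact in R, using an arbitrary made-up sample: minimizing RSS over the candidate center recovers the sample mean.

y <- c(3.1, 4.7, 2.2, 5.9, 4.4)        # any numeric sample
rss <- function(mu) sum((y - mu)^2)    # RSS as a function of the candidate center mu

optimize(rss, interval = range(y))$minimum  # numerical minimizer of RSS
mean(y)                                     # the sample mean; should agree up to tolerance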


Least squares estimation

Find $\beta_0$ and $\beta_1$ by minimizing the RSS with respect to each parameter:

$\mathrm{RSS}(\beta_0, \beta_1) = \sum_{i=1}^{N} (y_i - E[y_i \mid x_i])^2$

We obtain

$\hat\beta_0 = b_0 = \bar y - b_1 \bar x$

$\hat\beta_1 = b_1 = \frac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2}$
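These two formulas are all that is needed to implement SLR from scratch (one of the goals above). A minimal sketch, using simulated data so the true coefficients are known; the true values 1 and 2 are arbitrary:

set.seed(2)
x <- runif(50)
y <- 1 + 2 * x + rnorm(50, sd = 0.3)   # simulated data; true intercept 1, slope 2

# Closed-form least squares estimates
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)

c(b0, b1)         # hand-computed estimates
coef(lm(y ~ x))   # lm() should give the same values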


Notes about LSE

Relationship between correlation and slope

$\rho = \frac{\mathrm{cov}(x, y)}{\sqrt{\mathrm{var}(x)\,\mathrm{var}(y)}}; \qquad \beta_1 = \frac{\mathrm{cov}(x, y)}{\mathrm{var}(x)}$

Why we need to keep watch for outliers

$\beta_1 = \frac{\sum_i (y_i - \bar y)(x_i - \bar x)}{\sum_i (x_i - \bar x)^2} = \frac{\sum_i \frac{y_i - \bar y}{x_i - \bar x}(x_i - \bar x)^2}{\sum_i (x_i - \bar x)^2} = \sum_i \frac{y_i - \bar y}{x_i - \bar x}\,\omega_i,$

where $\omega_i = (x_i - \bar x)^2 / \sum_j (x_j - \bar x)^2$. Note that the weight $\omega_i$ increases as $x_i$ gets further away from $\bar x$.
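A small sketch makes this concrete: for a made-up x with one value far from the mean, the weights show how much that point dominates the fitted slope.

x <- c(0.1, 0.4, 0.5, 0.6, 3.0)                # last value is far from mean(x)
w <- (x - mean(x))^2 / sum((x - mean(x))^2)    # weights omega_i from the derivation above
round(w, 3)                                     # the outlying x carries most of the weight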


Geometric interpretation of least squares

Least squares minimizes the sum of squared vertical distances between observed and estimated y's:

$\min_{\beta_0, \beta_1} \sum_{i=1}^{I} (y_i - (\beta_0 + \beta_1 x_i))^2$
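This criterion can also be minimized directly with a generic optimizer, which makes the statement concrete; a sketch with simulated data (the true coefficients 1 and 2 are arbitrary):

set.seed(3)
x <- runif(40)
y <- 1 + 2 * x + rnorm(40, sd = 0.3)                        # illustrative data

rss <- function(beta) sum((y - (beta[1] + beta[2] * x))^2)  # the objective above

optim(c(0, 0), rss)$par   # numerical minimizer (intercept, slope)
coef(lm(y ~ x))           # closed-form least squares, for comparison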


Least squares foreshadowing

- Didn't have to choose to minimize squares; we could minimize absolute values instead, for instance (a sketch follows below).
- Least squares estimates turn out to be a "good idea": they are unbiased and BLUE (best linear unbiased estimators).
- Later we'll see about maximum likelihood as well.
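As a sketch of the absolute-value alternative, the same optim() approach with a sum-of-absolute-deviations criterion gives a least-absolute-deviations fit, which is pulled less than least squares by a single outlier (the data and the injected outlier are made up for illustration):

set.seed(4)
x <- runif(40)
y <- 1 + 2 * x + rnorm(40, sd = 0.3)
y[1] <- y[1] + 10                                            # inject one large outlier

sad <- function(beta) sum(abs(y - (beta[1] + beta[2] * x)))  # sum of absolute deviations

optim(c(0, 0), sad)$par   # least-absolute-deviations estimates
coef(lm(y ~ x))           # least squares estimates, shifted more by the outlier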