Classical Regression

Econ 140, Lecture 8

TRANSCRIPT

Page 1: Classical Regression

Lecture 9 1

Econ 140

Classical Regression

Lecture 8

Page 2: Classical Regression

Today's Plan

• For the next few lectures we'll be talking about the classical regression model
  – Looking at both estimators for a and b
  – Inferences on what a and b actually tell us

• Today: how to operationalize the model
  – Looking at BLUE for the bivariate model
  – Inference and hypothesis tests using the t, F, and χ² distributions
  – Examples of linear regressions using Excel

Page 3: Classical Regression

Estimating coefficients

• Our model: Y = a + bX + e
• Two things to keep in mind about this model:

1) It is linear in both variables and parameters

• Examples of non-linearity in variables:

Y = a + bX²  or  Y = a + be^X

• Example of non-linearity in parameters: Y = a + b²X

• OLS can cope with non-linearity in variables but not in parameters
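Not part of the original slides: a minimal Python sketch, with made-up data, of how OLS copes with non-linearity in variables. We fit Y = a + bX² by first transforming the regressor to Z = X² and then applying the usual bivariate OLS formulas to (Z, Y).

```python
# Illustrative sketch (invented data): fit Y = a + b*X^2 by regressing
# Y on the transformed variable Z = X^2 with the standard OLS formulas.
n = 5
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [3.1, 6.9, 13.2, 21.0, 31.1]   # made-up data, roughly Y = 2 + 1.2*X^2

Z = [x * x for x in X]             # transformed regressor Z = X^2
Zbar = sum(Z) / n
Ybar = sum(Y) / n

# b_hat = (sum ZY - n*Zbar*Ybar) / (sum Z^2 - n*Zbar^2)
b_hat = (sum(z * y for z, y in zip(Z, Y)) - n * Zbar * Ybar) / \
        (sum(z * z for z in Z) - n * Zbar ** 2)
a_hat = Ybar - b_hat * Zbar        # intercept from the sample means
print(a_hat, b_hat)
```

The same trick works for any transformation of X (logs, exponentials, powers) because the model stays linear in the parameters a and b.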

Page 4: Classical Regression

Estimating coefficients (3)

2) Notation: we're not estimating a and b anymore
• We are estimating coefficients, which are estimates of the parameters a and b
• We will denote the coefficients as â and b̂

• We are dealing with a sample of size n
  – For each sample we will get a different â and b̂ pair

Page 5: Classical Regression

Estimating coefficients (4)

• In the same way that you can take a sample to get an estimate of μY, you can take a sample to get an estimate of the regression line, that is, of â and b̂

[Figure: Y plotted against X, showing the population regression line together with two different sample regression lines, Sample 1 and Sample 2]

Page 6: Classical Regression

The independent variable

• We also have a given variable X whose values are known
  – This is called the independent variable

• Again, the expectation of Y given X is E(Y | X) = a + bX

• With constant variance V(Y) = σ²

Page 7: Classical Regression

A graph of the model

[Figure: a data point (Y1, X1) plotted against the fitted line Ŷ = â + b̂X; the residual e = Y − Ŷ is the vertical distance between the point and the line]

Page 8: Classical Regression

What does the error term do?

• The error term gives us the test statistics and tells us how well the model Y = a + bX + e fits the data

e = Y − Ŷ

• The error term represents:

1) Given that Y is a random variable, e is also random, since e is a function of Y
2) Variables not included in the model
3) Random behavior of people
4) Measurement error
5) It enables a model to remain parsimonious: you don't want all possible variables in the model if some have little or no influence

Page 9: Classical Regression

Rewriting beta

• Our complete model is Y = a + bX + e
• We will never know the true value of the error e, so we will estimate the following equation:

Ŷ = â + b̂X

• For our known values of X we have estimates Ŷ, â, and b̂

• So how do we know that our OLS estimators give us the BLUE estimate?
  – To determine this we want to know the expected value of b̂ as an estimator of b, which is the population parameter

Page 10: Classical Regression

Rewriting beta (2)

• To operationalize, we want to think of what we know
• We know from lecture 2 that there should be no correlation between the errors and the independent variable:

E(ei) = 0,  E(ei | Xi) = 0,  Cov(Xi, ei) = 0

• We also know V(X) = σX² and V(Y) = σY²

• Now we have that E(Y | X) = a + bX + E(e | X)
• The variance of Y given X is V(Y) = σ², so V(e | X) = σ²

Page 11: Classical Regression

Rewriting beta (3)

• Rewriting b̂
  – In lecture 2 we found the following estimator for b:

b̂ = (ΣXY − nX̄Ȳ) / (ΣX² − nX̄²)

• Using some definitions we can show: E(b̂) = b, with

V(b̂) = σ² / Σx²

Page 12: Classical Regression

Rewriting beta (4)

• We have definitions that we can use:

yi = (Yi − Ȳ)  and  xi = (Xi − X̄),  so that  Σxi² = Σ(Xi − X̄)²

• Using the definitions for yi and xi we can rewrite b̂ as

b̂ = Σxy / Σx²

• We can also write â = Ȳ − b̂X̄

Page 13: Classical Regression

Rewriting beta (5)

• We can rewrite b̂ as b̂ = ΣciYi, where

ci = xi / Σxi²

• The properties of ci:

1. Σci = 0, since Σxi = Σ(Xi − X̄) = 0
2. Σci² = 1 / Σxi²
3. Σcixi = 1
4. ΣciXi = 1
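Not part of the slides: a quick numerical check of the properties of the weights ci, using illustrative X values.

```python
# Verify the properties of the OLS weights c_i = x_i / sum(x_i^2)
# on invented X values.
X = [1.0, 3.0, 5.0, 7.0, 9.0]
n = len(X)
Xbar = sum(X) / n
x = [Xi - Xbar for Xi in X]        # deviations from the mean
Sxx = sum(xi * xi for xi in x)
c = [xi / Sxx for xi in x]

print(sum(c))                                  # property 1: sum c_i = 0
print(sum(ci * ci for ci in c), 1 / Sxx)       # property 2: sum c_i^2 = 1/Sxx
print(sum(ci * xi for ci, xi in zip(c, x)))    # property 3: sum c_i x_i = 1
print(sum(ci * Xi for ci, Xi in zip(c, X)))    # property 4: sum c_i X_i = 1
```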

Page 14: Classical Regression

Showing unbiasedness

• What do we know about the expected value of b̂?

b̂ = ΣciYi,  so  E(b̂) = Σci E(Yi)

• We can rewrite this as E(b̂) = Σci(a + bXi)

• Multiplying the brackets out we get:

E(b̂) = Σ(cia + cibXi)

• Since b is constant, E(b̂) = aΣci + bΣciXi

Page 15: Classical Regression

Showing unbiasedness (2)

• Looking back at the properties for ci we know that

Σci = 0  and  ΣciXi = 1

• Now we can write this as E(b̂) = a(0) + b(1) = b

• We can conclude that the expected value of b̂ is b, and that b̂ is an unbiased estimator of b
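Not from the slides: a Monte Carlo sketch of unbiasedness on simulated data, with true parameters a = 2 and b = 3 chosen here for illustration. Averaging b̂ over many samples drawn from the same fixed X should come close to the true b.

```python
# Monte Carlo check of E(b_hat) = b on simulated data
# (a = 2, b = 3, sigma = 1 are illustrative choices).
import random

random.seed(0)
a_true, b_true, sigma = 2.0, 3.0, 1.0
X = [float(i) for i in range(1, 11)]    # fixed (nonrandom) regressor
n = len(X)
Xbar = sum(X) / n
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)

b_hats = []
for _ in range(5000):
    # draw a fresh sample: Y_i = a + b*X_i + e_i with e_i ~ N(0, sigma^2)
    Y = [a_true + b_true * Xi + random.gauss(0, sigma) for Xi in X]
    Ybar = sum(Y) / n
    b_hat = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y)) / Sxx
    b_hats.append(b_hat)

print(sum(b_hats) / len(b_hats))        # close to the true b = 3
```

Each sample gives a different b̂, exactly as the earlier slide on sampling suggested, but the average across samples settles on b.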

Page 16: Classical Regression

Gauss-Markov Theorem

• We can now ask: is b̂ an efficient estimator?
• The variance of b̂, where b̂ = ΣciYi, is

V(b̂) = Σci² V(Y) = σ² / Σxi²

• How do we know that OLS is the most efficient estimator?
  – The Gauss-Markov Theorem

Page 17: Classical Regression

Gauss-Markov Theorem (2)

• Similar to our proof for the estimator of μY
• Suppose we use a new weight c′i = ci + di, giving a new estimator b̃ = Σc′iYi

• We can take the expected value of b̃:

E(b̃) = Σc′i E(Yi)
     = Σ(ci + di)(a + bXi)
     = aΣci + aΣdi + bΣciXi + bΣdiXi

Page 18: Classical Regression

Gauss-Markov Theorem (3)

• We know that Σci = 0 and ΣciXi = 1, so

E(b̃) = a(0) + aΣdi + b(1) + bΣdiXi = b + aΣdi + bΣdiXi

– For b̃ to be unbiased, the following must be true:

Σdi = 0  and  ΣdiXi = 0

Page 19: Classical Regression

Gauss-Markov Theorem (4)

• Efficiency (best)?
• We have b̃ = Σc′iYi where c′i = ci + di

• Therefore the variance of this new b̃ is

V(b̃) = σ²Σc′i² = σ²(Σci² + Σdi² + 2Σcidi)

• The unbiasedness conditions Σdi = 0 and ΣdiXi = 0 imply Σcidi = 0, so

V(b̃) = σ²Σci² + σ²Σdi² = V(b̂) + σ²Σdi²

• If any di ≠ 0, so that c′i ≠ ci, then Σdi² > 0

• So when we use the weights c′i we have an inefficient estimator: V(b̃) > V(b̂) = σ² / Σxi²
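Not part of the slides: a numerical sketch of this comparison, with invented numbers. We perturb the OLS weights ci by di chosen so that Σdi = 0 and ΣdiXi = 0 (keeping the new estimator unbiased), and compare the two weight variances σ²Σci² and σ²Σc′i².

```python
# Gauss-Markov comparison on invented data: any unbiased perturbation
# of the OLS weights raises the estimator's variance.
X = [1.0, 2.0, 3.0, 4.0]
Xbar = sum(X) / len(X)
x = [Xi - Xbar for Xi in X]
Sxx = sum(xi * xi for xi in x)
c = [xi / Sxx for xi in x]                # OLS weights c_i

d = [0.1, -0.1, -0.1, 0.1]                # sum d_i = 0 and sum d_i*X_i = 0
c_new = [ci + di for ci, di in zip(c, d)] # perturbed weights c'_i

sigma2 = 1.0                              # V(Y_i) = sigma^2, set to 1
var_ols = sigma2 * sum(ci * ci for ci in c)
var_new = sigma2 * sum(ci * ci for ci in c_new)
print(var_ols, var_new)                   # var_new exceeds var_ols
```

The gap var_new − var_ols equals σ²Σdi², matching the derivation above.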

Page 20: Classical Regression

Gauss-Markov Theorem (5)

• We can conclude that

b̂ = (ΣXY − nX̄Ȳ) / (ΣX² − nX̄²)

• is BLUE

Page 21: Classical Regression

Wrap up

• What did we cover today?
• Introduced the classical linear regression model (CLRM)
• Assumptions under the CLRM:

1) Xi is nonrandom (it's given)
2) E(ei) = E(ei | Xi) = 0
3) V(ei) = V(ei | Xi) = σ²
4) Cov(ei, ej) = 0 for i ≠ j

• Talked about estimating coefficients
• Defined the properties of the error term
• Proof by contradiction for the Gauss-Markov Theorem
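Not part of the slides: a closing numerical check, on invented data, that OLS residuals satisfy Σe = 0 and ΣeX = 0, the sample analogues of assumption 2 above.

```python
# OLS residuals sum to zero and are uncorrelated with X by construction
# (invented data for illustration).
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)
b_hat = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y)) / Sxx
a_hat = Ybar - b_hat * Xbar

e = [Yi - (a_hat + b_hat * Xi) for Xi, Yi in zip(X, Y)]  # residuals
print(sum(e))                                # ~0
print(sum(ei * Xi for ei, Xi in zip(e, X)))  # ~0
```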