# Introductory Econometrics: Chapter 2 Slides

Posted on 06-Jan-2016

Slide summaries of chapter 2

REVISION MULTIPLE REGRESSION

Econometrics I - Exercise 2

Lubomír Cingl

March 4, 2014

REVISION

- Simple regression model:

  y = β0 + β1x + u    (1)

- y: dependent / explained variable
- x: independent / explanatory / control variable
- u: error term
- β0: intercept
- β1: slope
- we want to know the relationship between x and y

ASSUMPTIONS

- Average value of u in the population is zero: E(u) = 0
  - not very restrictive
- Zero conditional mean:
  - knowing something about x gives no information about u
  - E(u|x) = E(u) = 0, which implies E(y|x) = β0 + β1x

[Figure: E(y|x) as a linear function of x; the distribution of y is centered around its expected value]

ESTIMATION FROM A SAMPLE

- we have a random sample of observations
- for each observation the following holds:

  yi = β0 + β1xi + ui    (2)

- sample regression line: ŷ = β̂0 + β̂1x
- we want the best estimates of the parameters β0, β1
- 3 ways to find them: MoM, OLS, ML

FORMULAS TO KNOW

- intercept:

  β̂0 = ȳ − β̂1x̄    (3)

- slope (sums run over i = 1, ..., n):

  β̂1 = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²    (4)

  provided Σ (xi − x̄)² > 0

- β̂1 is the sample covariance between x and y divided by the sample variance of x
- if x and y are positively correlated, the slope is positive
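
Formulas (3) and (4) are easy to check directly in NumPy; a minimal sketch, using made-up sample data:

```python
import numpy as np

# Made-up sample data, for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Slope (4): sample covariance of x and y divided by sample variance of x
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Intercept (3): beta0-hat = ybar - beta1-hat * xbar
b0 = y.mean() - b1 * x.mean()

print(b0, b1)  # approximately 0.14 and 1.96
```

The same numbers come out of `np.polyfit(x, y, 1)`, which fits the identical least-squares line.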

SAMPLE REGRESSION LINE

- e.g. fitted regression line: inc = 0.33 + 0.56·educ

R SQUARED

- each observation can be decomposed into an explained and an unexplained part:
  - yi = ŷi + ûi
- we can define the following:
  - Σ (yi − ȳ)²: total sum of squares, SST (variation of y)
  - Σ (ŷi − ȳ)²: explained sum of squares, SSE
  - Σ ûi²: residual sum of squares, SSR
- SST = SSE + SSR
- R² = SSE/SST = 1 − SSR/SST

PROPERTIES OF OLS ESTIMATOR

- Unbiased:
  - the expected value of the estimator equals the true value
  - β̂1 = β1 + Σ (xi − x̄)ui / Σ (xi − x̄)², so E(β̂1) = β1
- Variance:
  - assume homoskedasticity: Var(u|x) = σ², where σ² is the error variance
  - Var(β̂1) = σ² / Σ (xi − x̄)² = σ²/SSTx
- we also have to estimate σ²:
  - σ̂² = Σ ûi² / (n − 2) = SSR/(n − 2)
  - the estimated variance is then σ̂² / Σ (xi − x̄)² = σ̂²/SSTx
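
The σ̂² and slope standard-error formulas above can be sketched directly; data are made up for illustration:

```python
import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([1.2, 2.9, 4.1, 5.2, 6.8, 7.9, 9.3, 10.1])
n = len(x)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)                  # residuals

SSTx = np.sum((x - x.mean()) ** 2)
sigma2_hat = np.sum(u_hat ** 2) / (n - 2)  # sigma^2-hat = SSR / (n - 2)
var_b1 = sigma2_hat / SSTx                 # estimated Var(beta1-hat)
se_b1 = np.sqrt(var_b1)                    # standard error of the slope

print(b1, se_b1)
```

Note the n − 2 degrees-of-freedom correction: two parameters (intercept and slope) were estimated.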

EXAMPLE 2.7

Consider the savings function

  sav = β0 + β1inc + u,  u = √inc · e    (5)

where e is a random variable with E(e) = 0 and Var(e) = σₑ². Assume that e is independent of inc.

1. Show that E(u|inc) = 0, so that the key zero conditional mean assumption is satisfied. [Hint: if e is independent of inc, then E(e|inc) = E(e) and Var(e|inc) = Var(e).]

2. Show that Var(u|inc) = σₑ²·inc.

3. Provide a discussion that supports the assumption that the variance of savings increases with family income.

INTERPRETATION

- linear function:
  - y = β0 + β1x
  - β1 is the marginal effect of x on y
- natural log:
  - y = log(x), x > 0
  - good to know: log(1 + x) ≈ x for x ≈ 0, and
  - log(x1) − log(x0) ≈ (x1 − x0)/x0 = Δx/x0
  - therefore 100·Δlog(x) ≈ %Δx
- constant elasticity model:
  - log(y) = β0 + β1·log(x), with y, x > 0
  - β1 is the elasticity of y w.r.t. x
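
The two log approximations above are easy to see numerically; a short sketch with arbitrary values:

```python
import numpy as np

# log(1 + x) is close to x for small x
for x in [0.01, 0.05, 0.10]:
    print(x, np.log(1 + x))   # the two columns nearly agree

# 100 * (log(x1) - log(x0)) approximates the percentage change in x
x0, x1 = 50.0, 52.5                        # made-up values: a 5% increase
pct_exact = 100 * (x1 - x0) / x0           # exactly 5.0
pct_log = 100 * (np.log(x1) - np.log(x0))  # close to 5, slightly below
print(pct_exact, pct_log)
```

The approximation deteriorates as the change grows, which is why the log form is only read as a percentage change for small Δx.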

EXAMPLE 2.6

Using data from 1988 for houses sold near a garbage incinerator, the following equation relates housing price to the distance from the incinerator:

  log(price) = 9.4 + 0.312·log(dist),  n = 135,  R² = 0.162    (6)

1. Interpret the coefficient on log(dist). Is the sign of this estimate what you expect it to be?

2. Do you think simple regression provides an unbiased estimator of the ceteris paribus elasticity of price with respect to dist? (Think about the city's decision on where to put the incinerator.)

3. What other factors about a house affect its price? Might these be correlated with distance from the incinerator?

REGRESSION THROUGH ORIGIN

- restriction: y = 0 when x = 0
- then we estimate ỹ = β̃1x
- we know that

  β̂1 = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²    (7)

- if we drop the demeaning (set x̄ = 0, ȳ = 0), we get a possibly biased estimator:

  β̃1 = Σ xi·yi / Σ xi²    (8)
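
A small simulation shows the bias of (8) when the true intercept is not zero; the data-generating values below are made up:

```python
import numpy as np

# Simulated data where the true intercept is NOT zero (illustrative)
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=1000)
y = 3.0 + 2.0 * x + rng.normal(0, 1, size=1000)   # true beta0 = 3, beta1 = 2

# Usual OLS slope, equation (7)
b1_ols = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Through-origin slope, equation (8): biased when beta0 != 0
b1_origin = np.sum(x * y) / np.sum(x ** 2)

print(b1_ols, b1_origin)   # b1_ols near 2; b1_origin pulled away from 2
```

Here x > 0 and β0 > 0, so the through-origin estimator is pushed above the true slope; it only coincides with OLS in expectation when the population intercept is zero.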

EXAMPLE 2.8

Consider the standard simple regression model with the standard assumptions met. Let β̃1 be the estimator of β1 obtained by assuming the intercept is zero.

1. Find E(β̃1) in terms of the xi, β0, and β1. Verify that β̃1 is unbiased for β1 when the population intercept β0 is zero. Are there other cases where β̃1 is unbiased?

2. Find the variance of β̃1. [Hint: it does not depend on β0.]

3. Show that Var(β̃1) ≤ Var(β̂1). [Hint: Σ xi² ≥ Σ (xi − x̄)², with strict inequality unless x̄ = 0.]

4. Comment on the trade-off between bias and variance when choosing between β̂1 and β̃1.

EXAMPLE 2.9

1. Let β̂0 and β̂1 be the intercept and slope from the regression of yi on xi, using n observations. Let c1 and c2, with c2 ≠ 0, be constants. Let β̃0 and β̃1 be the intercept and slope from the regression of c1yi on c2xi. Show that β̃1 = (c1/c2)·β̂1 and β̃0 = c1·β̂0, thereby verifying the claims on units of measurement in Section 2.4. [Hint: to obtain β̃1 and β̃0, plug the scaled versions of x and y into the OLS formulas.]

2. Now let β̃0 and β̃1 be from the regression of (c1 + yi) on (c2 + xi) (with no restriction on c1 or c2). Show that β̃1 = β̂1 and β̃0 = β̂0 + c1 − c2·β̂1.

3. Now let β̂0 and β̂1 be the OLS estimates from the regression of log(yi) on xi, with yi > 0. For c1 > 0, let β̃0 and β̃1 be the intercept and slope from the regression of log(c1yi) on xi. Show that β̃1 = β̂1 and β̃0 = β̂0 + log(c1).

4. Now, assuming that xi > 0, let β̃0 and β̃1 be the intercept and slope from the regression of yi on log(c2xi). How do β̃0 and β̃1 compare with the intercept and slope from the regression of yi on log(xi)?
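
The unit-of-measurement claims in part 1 can be checked numerically; the data and the constants c1, c2 below are arbitrary:

```python
import numpy as np

def ols(x, y):
    """Return (intercept, slope) from a simple OLS regression of y on x."""
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    return y.mean() - b1 * x.mean(), b1

rng = np.random.default_rng(1)
x = rng.normal(5, 2, size=200)
y = 1.0 + 0.5 * x + rng.normal(0, 1, size=200)

c1, c2 = 100.0, 12.0            # arbitrary scaling constants
b0, b1 = ols(x, y)              # original regression
b0s, b1s = ols(c2 * x, c1 * y)  # regression of c1*y on c2*x

# Part 1 claims: slope scales by c1/c2, intercept by c1
print(np.isclose(b1s, (c1 / c2) * b1), np.isclose(b0s, c1 * b0))
```

These identities are exact algebra, not sampling results, so they hold up to floating-point error on any dataset.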

Multiple regression

MULTIPLE REGRESSION MODEL

- parallels with the simple regression model:

  y = β0 + β1x1 + β2x2 + ... + u    (9)

- y: dependent / explained variable
- x: independent / explanatory / control variables
- u: error term (similar assumptions about the error term)
- β0: intercept
- β1, β2, ...: slopes
- linear means linear in parameters
- interpretation: Δy = β1Δx1, ceteris paribus (holding other factors constant)

EXAMPLE I

- effect of education on wage:

  wage = β0 + β1educ + β2exper + u    (10)

- we take exper out of the error term
- ceteris paribus: holding experience constant, we can measure the effect of education on wage
- this is a partial effect

EXAMPLE II

- effect of income on consumption:

  cons = β0 + β1inc + β2inc² + u    (11)

- the parameters are obtained the same way
- the interpretation differs: β1 alone does not measure the effect of inc

  Δcons/Δinc ≈ β1 + 2β2·inc

- the marginal effect of income on consumption depends on both terms
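
The marginal effect β1 + 2β2·inc can be evaluated at different income levels; the coefficient values below are hypothetical, made up for illustration:

```python
import numpy as np

# Hypothetical estimates, made up for illustration
b1, b2 = 0.30, -0.002

inc = np.array([10.0, 50.0, 100.0])
marginal = b1 + 2 * b2 * inc   # d(cons)/d(inc) = beta1 + 2*beta2*inc

print(marginal)   # [0.26, 0.10, -0.10]: the effect falls as income rises
```

With β2 < 0 the marginal propensity to consume declines with income, which is why reading β1 on its own is meaningless in the quadratic model.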

ASSUMPTIONS

- Expected value of u is zero: E(u) = 0
- Zero conditional mean (ASS3):
  - knowing something about the xs gives no information about u
  - E(u|x1, x2, ..., xk) = 0
  - all unobserved factors in the error term are uncorrelated with all xj
- No perfect collinearity (ASS4):
  - no independent variable is constant
  - no exact linear relationship among them holds
  - the xs can be correlated, but not perfectly
- we need more observations than parameters: n ≥ k + 1

ESTIMATION FROM A SAMPLE

- we have a random sample of observations (ASS2)
- for each observation the following holds (ASS1):

  yi = β0 + β1xi1 + β2xi2 + ... + βkxik + ui    (12)

- we want the best estimates of the parameters β0, β1, ...
- 3 ways to find them: MoM, OLS, ML
- we will get the estimates of the sample regression function:

  ŷi = β̂0 + β̂1xi1 + β̂2xi2 + ... + β̂kxik    (13)

PROPERTIES

- if ASS1 through ASS4 hold, then our OLS estimator is unbiased
  - i.e. the procedure for getting the estimate is unbiased
- if we add an irrelevant variable:
  - no effect on the parameters of the relevant variables; still unbiased
  - slight increase in R²
- if we omit an important variable:
  - OLS will usually be biased
  - omitted variable bias

OMITTED VARIABLE BIAS

True model:

  y = β0 + β1x1 + β2x2 + u

Estimated model, omitting x2:

  ỹ = β̃0 + β̃1x1

Then

  E(β̃1) = β1 + β2 · Σ (xi1 − x̄1)xi2 / Σ (xi1 − x̄1)² = β1 + β2·δ̃1

where δ̃1 is the slope from the regression of xi2 on xi1.
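
The bias formula can be reproduced in a simulation; the true coefficients below are made up, and a large n is used so that sampling noise is negligible:

```python
import numpy as np

# Simulate the omitted-variable formula (true betas chosen arbitrarily)
rng = np.random.default_rng(3)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)    # delta1, slope of x2 on x1, is about 0.8
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

def slope(x, y):
    """Simple-regression slope of y on x."""
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

b1_short = slope(x1, y)   # regression that omits x2
delta1 = slope(x1, x2)

# Bias formula: E(b1_short) = beta1 + beta2 * delta1 = 2 + 3 * 0.8 = 4.4
print(b1_short, 2.0 + 3.0 * delta1)
```

The short regression's slope sits near 4.4, not the true 2.0: the omitted x2 is positively correlated with x1 and has a positive coefficient, so the bias is upward.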

VARIANCE

- Assume homoskedasticity: Var(u|x1, x2, ..., xk) = σ²
  - the variance of u is the same for all combinations of outcomes of the xs

  Var(β̂j) = σ² / [Σ (xij − x̄j)² · (1 − Rj²)] = σ² / [SSTj(1 − Rj²)]    (14)

- Rj² is the R² from the regression of xj on all the other xs
- its size matters: the higher Rj², the larger the variance
- we also have to estimate σ²:
  - σ̂² = Σ ûi² / (n − k − 1) = SSR/df
  - the estimated variance is σ̂² / [SSTj(1 − Rj²)]
  - se(β̂j) = σ̂ / √[SSTj(1 − Rj²)]

EXAMPLE 3.9

The following equation describes the median housing price in a community in terms of the amount of pollution (nox, for nitrous oxide) and the average number of rooms in houses in the community (rooms):

  log(price) = β0 + β1·log(nox) + β2·rooms + u

1. What are the probable signs of the regression slopes? Interpret β1 and explain.

2. Why might nox [more precisely, log(nox)] and rooms be negatively correlated? If this is the case, does the simple regression of log(price) on log(nox) produce an upward or downward biased estimator of β1?

3. Using the data in HPRICE2.RAW, the following equations were estimated:

  log(price) = 11.71 − 1.043·log(nox),  n = 506,  R² = 0.264

  log(price) = 9.23 − 0.718·log(nox) + 0.306·rooms,  n = 506,  R² = 0.514

Is the relationship between the simple and multiple regression estimates of the elasticity of price with respect to nox what you would have expected, given your answer to part 2?
