Basic Econometrics
Chapter 7. MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation
May 2004, Prof. Himayatullah


Page 1

Page 2

7-1. The Three-Variable Model: Notation and Assumptions

Yi = β1 + β2X2i + β3X3i + ui      (7.1.1)

β2 and β3 are partial regression coefficients. The model rests on the following assumptions:

+ Zero mean value of ui: E(ui | X2i, X3i) = 0 for each i      (7.1.2)
+ No serial correlation: Cov(ui, uj) = 0, i ≠ j      (7.1.3)
+ Homoscedasticity: Var(ui) = σ²      (7.1.4)
+ Zero covariance between ui and each X: Cov(ui, X2i) = Cov(ui, X3i) = 0      (7.1.5)
+ No specification bias, i.e., the model is correctly specified      (7.1.6)
+ No exact collinearity between the X variables      (7.1.7)
  (no perfect multicollinearity in the case of more explanatory variables; if an exact linear relationship exists, the X variables are said to be linearly dependent)
+ The model is linear in parameters

Page 3

7-2. Interpretation of Multiple Regression

E(Yi | X2i, X3i) = β1 + β2X2i + β3X3i      (7.2.1)

Equation (7.2.1) gives the conditional mean or expected value of Y, conditional upon the given or fixed values of X2 and X3.

Page 4

7-3. The Meaning of Partial Regression Coefficients

Yi = β1 + β2X2i + β3X3i + … + βkXki + ui

βk measures the change in the mean value of Y per unit change in Xk, holding the remaining explanatory variables constant. It gives the "direct" effect of a unit change in Xk on E(Yi), net of the effects of the Xj (j ≠ k).

How can we control for the "true" effect of a unit change in Xk on Y? (Read pages 195-197.)
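The "net of the other regressors" idea can be checked numerically. The sketch below uses synthetic data (the variable names and numbers are illustrative, not the book's example): the multiple-regression coefficient on X2 equals the slope obtained after purging both Y and X2 of X3 first, the Frisch-Waugh result.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X3 = rng.normal(size=n)
X2 = 0.5 * X3 + rng.normal(size=n)       # X2 is correlated with X3
Y = 1.0 + 2.0 * X2 + 3.0 * X3 + rng.normal(size=n)

# Full multiple regression: Y on intercept, X2, X3
X = np.column_stack([np.ones(n), X2, X3])
beta = np.linalg.lstsq(X, Y, rcond=None)[0]

# Purge X3 from Y and from X2, then regress residual on residual
Z = np.column_stack([np.ones(n), X3])
rY = Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]
rX2 = X2 - Z @ np.linalg.lstsq(Z, X2, rcond=None)[0]
slope = (rX2 @ rY) / (rX2 @ rX2)

print(beta[1], slope)   # numerically identical: the direct effect of X2
```

Holding X3 constant thus literally means removing the part of X2 (and of Y) explained by X3 before measuring the slope.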

Page 5

7-4. OLS and ML Estimation of the Partial Regression Coefficients

This section (pages 197-201) provides:

1. The OLS estimators in the case of the three-variable regression Yi = β1 + β2X2i + β3X3i + ui
2. Variances and standard errors of the OLS estimators
3. Eight properties of the OLS estimators (pp. 199-201)
4. An understanding of the ML estimators
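As a sketch (on synthetic data), the closed-form OLS estimators for the three-variable model can be computed directly from deviation-from-mean sums; σ² is estimated with n − 3 degrees of freedom because three parameters are fitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.0 + 2.0 * X2 - 1.5 * X3 + rng.normal(size=n)

# Deviation-from-mean sums that appear in the textbook's formulas
y, x2, x3 = Y - Y.mean(), X2 - X2.mean(), X3 - X3.mean()
S22, S33, S23 = x2 @ x2, x3 @ x3, x2 @ x3
Sy2, Sy3 = y @ x2, y @ x3

D = S22 * S33 - S23**2          # nonzero by the no-exact-collinearity assumption
b2 = (Sy2 * S33 - Sy3 * S23) / D
b3 = (Sy3 * S22 - Sy2 * S23) / D
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

# Estimated error variance and standard errors of the slope estimators
resid = Y - (b1 + b2 * X2 + b3 * X3)
sigma2 = (resid @ resid) / (n - 3)
se_b2 = np.sqrt(sigma2 * S33 / D)
se_b3 = np.sqrt(sigma2 * S22 / D)
```

The same numbers come out of any matrix OLS routine; the closed forms simply make the role of each sum of squares visible.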

Page 6

7-5. The Multiple Coefficient of Determination R² and the Multiple Coefficient of Correlation R

This section provides:

1. The definition of R² in the context of multiple regression, analogous to r² in the two-variable case
2. R = √R², the coefficient of multiple correlation; it measures the degree of association between Y and all the explanatory variables jointly
3. The variance of a partial regression coefficient:

   Var(β̂k) = (σ² / Σx²k) · (1 / (1 − R²k))      (7.5.6)

where β̂k is the partial regression coefficient of regressor Xk and R²k is the R² of the regression of Xk on the remaining regressors.
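Formula (7.5.6) can be verified against the usual matrix expression for a coefficient's variance. The sketch below uses synthetic, deliberately collinear data (names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
X3 = rng.normal(size=n)
X2 = 0.7 * X3 + rng.normal(size=n)    # correlation with X3 inflates Var(b2)
Y = 1 + 2 * X2 + 3 * X3 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X2, X3])
beta = np.linalg.lstsq(X, Y, rcond=None)[0]
resid = Y - X @ beta
sigma2 = (resid @ resid) / (n - 3)

# Var(b2) from the matrix formula sigma^2 * (X'X)^{-1}
var_matrix = sigma2 * np.linalg.inv(X.T @ X)[1, 1]

# Var(b2) from (7.5.6): R2_2 comes from regressing X2 on the other regressors
Z = np.column_stack([np.ones(n), X3])
fitted = Z @ np.linalg.lstsq(Z, X2, rcond=None)[0]
r2_aux = 1 - ((X2 - fitted) ** 2).sum() / ((X2 - X2.mean()) ** 2).sum()
x2 = X2 - X2.mean()
var_756 = sigma2 / (x2 @ x2) * 1 / (1 - r2_aux)
```

The factor 1/(1 − R²k) is the variance inflation caused by collinearity among the regressors: the closer Xk comes to being a linear combination of the others, the larger Var(β̂k).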

Page 7

7-6. Example 7.1: The Expectations-Augmented Phillips Curve for the US (1970-1982)

This section provides an illustration of the ideas introduced in the chapter. Regression model (7.6.1); the data set is in Table 7.1.

Page 8

7-7. Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias

This section develops an understanding of "simple regression in the context of multiple regression": omitting relevant variables causes specification bias, which is discussed further in Chapter 13.

Page 9

7-8. R² and the Adjusted R²

R² is a non-decreasing function of the number of explanatory variables: an additional X variable will not decrease R².

R² = ESS/TSS = 1 − RSS/TSS = 1 − Σû²i / Σy²i      (7.8.1)

This can point in the wrong direction, rewarding the addition of irrelevant variables; it motivates an adjusted R² (written R̄²) that takes the degrees of freedom into account:

R̄² = 1 − [Σû²i / (n − k)] / [Σy²i / (n − 1)]      (7.8.2)

or, equivalently, R̄² = 1 − σ̂² / S²Y, where S²Y is the sample variance of Y and k is the number of parameters including the intercept term.

Substituting (7.8.1) into (7.8.2) gives

R̄² = 1 − (1 − R²)(n − 1)/(n − k)      (7.8.4)

For k > 1, R̄² < R²; thus as the number of X variables increases, R̄² rises by less than R², and R̄² can even become negative.
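A short numerical sketch of (7.8.1), (7.8.2), and (7.8.4) on synthetic data; the two expressions for R̄² are algebraically identical, so they must agree to machine precision.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 30, 3                      # k = number of parameters incl. intercept
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Y = X @ np.array([1.0, 2.0, 0.5]) + rng.normal(size=n)

beta = np.linalg.lstsq(X, Y, rcond=None)[0]
RSS = ((Y - X @ beta) ** 2).sum()          # sum of squared residuals
TSS = ((Y - Y.mean()) ** 2).sum()          # total sum of squares

R2 = 1 - RSS / TSS                               # (7.8.1)
R2_bar = 1 - (RSS / (n - k)) / (TSS / (n - 1))   # (7.8.2)
R2_bar_alt = 1 - (1 - R2) * (n - 1) / (n - k)    # (7.8.4), the same number
```

With k > 1, R2_bar always comes out below R2, as the text states.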


Page 11

7-8. R² and the Adjusted R²

Comparing two R² values: to compare, the sample size n and the dependent variable must be the same.

Example 7-2: Coffee Demand Function Revisited (page 210)

The "game" of maximizing R̄²: choosing the model that gives the highest R̄² may be dangerous, for in regression analysis our objective is not a high R̄² per se but dependable estimates of the true population regression coefficients, and statistical inference about them.

We should be more concerned with the logical or theoretical relevance of the explanatory variables to the dependent variable and with their statistical significance.

Page 12

7-9. Partial Correlation Coefficients

This section provides:

1. Explanation of simple and partial correlation coefficients

2. Interpretation of simple and partial correlation coefficients

(pages 211-214)
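As a sketch of a first-order partial correlation, using the standard formula r12.3 = (r12 − r13·r23) / √((1 − r13²)(1 − r23²)) on synthetic data in which two variables are related only through a third:

```python
import numpy as np

def partial_corr(a, b, c):
    """Correlation between a and b, holding c constant (first order)."""
    r_ab = np.corrcoef(a, b)[0, 1]
    r_ac = np.corrcoef(a, c)[0, 1]
    r_bc = np.corrcoef(b, c)[0, 1]
    return (r_ab - r_ac * r_bc) / np.sqrt((1 - r_ac**2) * (1 - r_bc**2))

rng = np.random.default_rng(4)
n = 500
c = rng.normal(size=n)
a = c + rng.normal(size=n)
b = c + rng.normal(size=n)       # a and b are related only through c

print(np.corrcoef(a, b)[0, 1])   # sizable simple correlation
print(partial_corr(a, b, c))     # near zero once c is held constant
```

The contrast between the two printed numbers is the point of the section: a simple correlation can be entirely an artifact of a common third variable.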

Page 13

7-10. Example 7.3: The Cobb-Douglas Production Function; More on Functional Form

Yi = β1 X2i^β2 X3i^β3 e^ui      (7.10.1)

Taking logs turns this into a model that is linear in the parameters:

ln Yi = ln β1 + β2 ln X2i + β3 ln X3i + ui
      = β0 + β2 ln X2i + β3 ln X3i + ui      (7.10.2)

The data set is in Table 7.3; the results are reported on page 216.
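A sketch of estimating (7.10.2) on synthetic inputs (the true elasticities 0.6 and 0.4 and the input names are chosen for illustration; this is not the Table 7.3 data):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
X2 = np.exp(rng.normal(size=n))      # e.g. labor input (synthetic)
X3 = np.exp(rng.normal(size=n))      # e.g. capital input (synthetic)
Y = 1.5 * X2**0.6 * X3**0.4 * np.exp(0.05 * rng.normal(size=n))

# ln Y = ln(beta1) + beta2 ln X2 + beta3 ln X3 + u: linear in parameters
X = np.column_stack([np.ones(n), np.log(X2), np.log(X3)])
b = np.linalg.lstsq(X, np.log(Y), rcond=None)[0]

beta1 = np.exp(b[0])              # recover the multiplicative constant
beta2, beta3 = b[1], b[2]         # output elasticities
returns_to_scale = beta2 + beta3  # ~1 means constant returns to scale
```

The slope estimates are direct estimates of the output elasticities, which is why the log-linear form is so convenient for Cobb-Douglas models.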

Page 14

7-11. Polynomial Regression Models

Yi = β0 + β1Xi + β2Xi^2 + … + βkXi^k + ui      (7.11.3)

Example 7.4: Estimating the Total Cost Function. The data set is in Table 7.4; the empirical results are on page 221.
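Polynomial models like (7.11.3) remain linear in the parameters, so ordinary OLS applies once the power terms are added as extra regressors. A sketch with a synthetic cubic total-cost curve (the coefficients are illustrative, not those of Table 7.4):

```python
import numpy as np

rng = np.random.default_rng(6)
output = np.linspace(1, 10, 40)
# synthetic cubic total-cost curve plus noise
cost = (150 + 60 * output - 10 * output**2 + output**3
        + rng.normal(scale=5, size=output.size))

# Regressors: 1, X, X^2, X^3 -- still a linear regression in the betas
X = np.column_stack([output**p for p in range(4)])
b = np.linalg.lstsq(X, cost, rcond=None)[0]
fitted = X @ b
```

Nothing new is needed beyond multiple regression; the "nonlinearity" lives entirely in the regressors, not in the parameters.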

--------------------------------------------------------------
7-12. Summary and Conclusions (page 221)