Chapter 5 Heteroskedasticity


Page 1:

Chapter 5

Heteroskedasticity

Page 2:

A regression line

Page 3:

What is in this Chapter?

• How do we detect this problem?

• What are the consequences of this problem?

• What are the solutions?

Page 4:

Page 5:

What is in this Chapter?

• First, we discuss tests based on OLS residuals: the likelihood ratio test, the Goldfeld-Quandt (G-Q) test, and the Breusch-Pagan (B-P) test. The last one is an LM test.

• Regarding consequences, we show that the OLS estimators are unbiased but inefficient, and that the standard errors are also biased, thus invalidating tests of significance.

Page 6:

What is in this Chapter?

• Regarding solutions, we discuss solutions that depend on particular assumptions about the error variance, as well as general solutions.

• We also discuss transformation of variables to logs and the problems associated with deflators, both of which are commonly used as solutions to the heteroskedasticity problem.

Page 7:

5.1 Introduction

• Homoskedasticity: the variance of the error terms is constant.

• Heteroskedasticity: the variance of the error terms is non-constant.

• Illustrative Example

• Table 5.1 presents consumption expenditures (y) and income (x) for 20 families.

• Suppose that we estimate the equation by ordinary least squares. We get (figures in parentheses are standard errors):

Page 8:

5.1 Introduction

• We get (figures in parentheses are standard errors)

y = 0.847 + 0.899x    R² = 0.986    RSS = 31.074
    (0.703)  (0.0253)
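To make the slide concrete, here is a minimal sketch of the same kind of OLS fit in Python with numpy/statsmodels. The data below are simulated stand-ins for Table 5.1 (the actual figures are in the table), so the numbers are illustrative only.

    import numpy as np
    import statsmodels.api as sm

    # Simulated stand-in for the Table 5.1 data (20 families); replace with
    # the actual consumption (y) and income (x) figures from the table.
    rng = np.random.default_rng(0)
    x = np.linspace(5, 40, 20)                       # income
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)     # consumption; error spread grows with x

    X = sm.add_constant(x)      # regressors: intercept and income
    ols = sm.OLS(y, X).fit()    # ordinary least squares

    print(ols.params)           # intercept and slope estimates
    print(ols.bse)              # standard errors
    print(ols.rsquared)         # R-squared
    print(ols.ssr)              # residual sum of squares (RSS)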

Pages 9-12: 5.1 Introduction (figures and tables for the illustrative example; not reproduced in the transcript)

Page 13:

5.1 Introduction

• The residuals from this equation are presented in Table 5.3.

• In this situation there is no perceptible increase in the magnitudes of the residuals as the value of x increases.

• Thus there does not appear to be a heteroskedasticity problem.

Page 14:

5.2 Detection of Heteroskedasticity

• In the illustrative example in Section 5.1 we plotted the estimated residuals û_t against x_t to see whether we notice any systematic pattern in the residuals that suggests heteroskedasticity in the errors.

• Note, however, that by virtue of the normal equations, û_t and x_t are uncorrelated, though û_t² could be correlated with x_t.
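The point about the normal equations can be checked numerically. A small sketch (Python/numpy, simulated data): the OLS residuals are uncorrelated with x by construction, while their squares can still be strongly correlated with x.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = np.linspace(5, 40, 200)
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)    # error spread grows with x

    u = sm.OLS(y, sm.add_constant(x)).fit().resid
    print(np.corrcoef(u, x)[0, 1])      # ~0 up to machine precision (normal equations)
    print(np.corrcoef(u**2, x)[0, 1])   # clearly positive under heteroskedasticity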

Page 15:

5.2 Detection of Heteroskedasticity

• Thus if we are using a regression procedure to test for heteroskedasticity, we should use a regression of û_t² on x_t, x_t², x_t³, ..., or a regression of |û_t| on x_t, x_t², x_t³, ....

• In the case of multiple regression, we should use powers of ŷ_t, the predicted value of y_t, or powers of all the explanatory variables.

Page 16:

5.2 Detection of Heteroskedasticity

1. The test suggested by Anscombe and a test called RESET suggested by Ramsey both involve regressing û_t on ŷ_t², ŷ_t³, ... and testing whether or not the coefficients are significant.

2. The test suggested by White involves regressing û_t² on all the explanatory variables and their squares and cross products. For instance, with explanatory variables x1, x2, x3, it involves regressing û_t² on x1, x2, x3, x1², x2², x3², x1x2, x2x3, and x1x3.
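As a rough sketch of how White's test can be run in practice (Python/statsmodels, simulated data in place of the real regressors; het_white builds the squares and cross products internally):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_white

    # Simulated illustration: three explanatory variables, error variance growing with x1.
    rng = np.random.default_rng(2)
    n = 200
    x1, x2, x3 = rng.uniform(1, 10, (3, n))
    y = 1.0 + 0.5 * x1 + 0.3 * x2 - 0.2 * x3 + rng.normal(0, 0.3 * x1)

    X = sm.add_constant(np.column_stack([x1, x2, x3]))
    resid = sm.OLS(y, X).fit().resid

    # White's test: auxiliary regression of squared residuals on the regressors,
    # their squares, and their cross products.
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(resid, X)
    print(lm_stat, lm_pvalue)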

Page 17:

5.2 Detection of Heteroskedasticity

3. Glejser suggested estimating regressions of the type

   |û_i| = γ₀ + γ₁x_i + v_i,   |û_i| = γ₀ + γ₁√x_i + v_i,   |û_i| = γ₀ + γ₁(1/x_i) + v_i

   and so on, and testing the hypothesis γ₁ = 0.
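A minimal sketch of Glejser's approach (Python/statsmodels, simulated data, illustrative names): regress the absolute OLS residuals on a candidate function of x and look at the t-ratio of its coefficient.

    import numpy as np
    import statsmodels.api as sm

    # Simulated data with error standard deviation proportional to x.
    rng = np.random.default_rng(3)
    x = np.linspace(5, 40, 20)
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)

    abs_u = np.abs(sm.OLS(y, sm.add_constant(x)).fit().resid)

    # Glejser regressions of |u_hat| on x, sqrt(x), and 1/x.
    for g in (x, np.sqrt(x), 1.0 / x):
        aux = sm.OLS(abs_u, sm.add_constant(g)).fit()
        print(aux.params[1], aux.tvalues[1])   # slope estimate and its t-ratio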

Page 18:

5.2 Detection of Heteroskedasticity

• The implicit assumption behind all these tests is that var(u_i) = σ_i² = σ²f(z_i), where z_i is an unknown variable, and the different tests use different proxies or surrogates for the unknown function f(z).

Page 19:

5.2 Detection of Heteroskedasticity

[Auxiliary regression of û_t² on x_t and x_t² for the illustrative example; the coefficient estimates and R² are garbled in the transcript.]

Page 20:

5.2 Detection of Heteroskedasticity

Page 21:

5.2 Detection of Heteroskedasticity

• Thus there is evidence of heteroskedasticity even in the log-linear form, although, casually looking at the residuals in Table 5.3, we concluded earlier that the errors were homoskedastic.

• The Goldfeld-Quandt test, to be discussed later in this section, also did not reject the hypothesis of homoskedasticity for the log-linear form.

• The Glejser tests, however, show significant heteroskedasticity in the log-linear form.

Page 22:

Assignment

• Redo this illustrative example
  – The figure of the absolute value of the residuals against the x variable
    • Linear form
    • Log-linear form
  – Three types of tests:
    • Linear form and log-linear form
    • The EViews output table
    • Reject/accept the null hypothesis of homogeneous variance

Page 23:

5.2 Detection of Heteroskedasticity

• Some Other Tests (General tests)

– Likelihood Ratio Test

– Goldfeld and Quandt Test

– Breusch-Pagan Test

Page 24:

5.2 Detection of Heteroskedasticity

Likelihood Ratio Test

• If the number of observations is large, one can use a likelihood ratio test.

• Divide the residuals (estimated from the OLS regression) into k groups with n_i observations in the i-th group, Σ n_i = n.

• Estimate the error variances in each group by σ̂_i².

• Let the estimate of the error variance from the entire sample be σ̂². Then if we define λ as

  λ = Π_{i=1}^{k} (σ̂_i)^{n_i} / (σ̂)^{n}

  then -2 log λ has, in large samples, a χ²-distribution with (k - 1) degrees of freedom.
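A sketch of the likelihood ratio calculation under the grouping described above (Python with numpy/scipy; the residuals are simulated purely for illustration):

    import numpy as np
    from scipy import stats

    # Simulated OLS residuals whose variance differs across k = 3 groups.
    rng = np.random.default_rng(4)
    groups = [rng.normal(0, s, 40) for s in (1.0, 2.0, 3.0)]
    k = len(groups)
    n_i = np.array([len(g) for g in groups])
    n = n_i.sum()

    sig2_i = np.array([np.mean(g**2) for g in groups])     # group error variances
    sig2 = np.sum(np.concatenate(groups)**2) / n           # pooled error variance

    # lambda = prod(sigma_i^{n_i}) / sigma^n; -2 log(lambda) ~ chi^2(k - 1) in large samples.
    log_lam = np.sum((n_i / 2) * np.log(sig2_i)) - (n / 2) * np.log(sig2)
    lr = -2 * log_lam
    print(lr, stats.chi2.sf(lr, df=k - 1))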

Page 25:

5.2 Detection of Heteroskedasticity

• Goldfeld and Quandt Test

If we do not have large samples, we can use the Goldfeld and Quandt test.

In this test we split the observations into two groups: one corresponding to large values of x and the other corresponding to small values of x.

Page 26:

5.2 Detection of Heteroskedasticity

Fit separate regressions for each group and then apply an F-test to test the equality of the error variances.

Goldfeld and Quandt suggest omitting some observations in the middle to increase our ability to discriminate between the two error variances.
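A quick sketch of the Goldfeld-Quandt procedure in Python/statsmodels (het_goldfeldquandt sorts the observations by a chosen column, optionally drops some middle observations, fits the two regressions, and returns the F-ratio; the data and names here are illustrative):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_goldfeldquandt

    # Simulated data: error variance increases with x.
    rng = np.random.default_rng(5)
    x = np.sort(rng.uniform(5, 40, 20))
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)
    X = sm.add_constant(x)

    # Sort by x (column 1), drop 20% of the observations in the middle,
    # and compare the two residual variances with an F-test.
    f_stat, p_value, _ = het_goldfeldquandt(y, X, idx=1, drop=0.2, alternative="increasing")
    print(f_stat, p_value)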

Page 27:

5.2 Detection of Heteroskedasticity

Breusch-Pagan Test

• Suppose that V(u_t) = σ_t².

• If there are some variables z_1, z_2, ..., z_r that influence the error variance and if

  σ_t² = f(γ₀ + γ₁z_1t + γ₂z_2t + ... + γ_r z_rt)

  then the Breusch and Pagan test is a test of the hypothesis

  H₀: γ₁ = γ₂ = ... = γ_r = 0

• The function f(·) can be any function.

Page 28:

5.2 Detection of Heteroskedasticity

• For instance, f(x) can be x, x², eˣ, and so on.

• The Breusch and Pagan test does not depend on the functional form.

• Let S₀ = the regression sum of squares from a regression of û_t² on z_1, z_2, ..., z_r. Then S₀ / (2σ̂⁴) has a χ²-distribution with d.f. r, where σ̂² = Σ û_t² / n.

• This test is an asymptotic test. An intuitive justification for the test will be given after an illustrative example.
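Below is a rough sketch of the Breusch-Pagan computation, both "by hand" as S₀/(2σ̂⁴) and via statsmodels' het_breuschpagan (which, if I remember the implementation correctly, reports the studentized n·R² form of the LM statistic by default, so the two numbers need not coincide exactly). Python, simulated data, illustrative names.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan
    from scipy import stats

    # Simulated data where the error variance depends on z (here z = x).
    rng = np.random.default_rng(6)
    x = np.linspace(5, 40, 100)
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)
    X = sm.add_constant(x)

    resid = sm.OLS(y, X).fit().resid
    u2 = resid**2

    # "By hand": S0 = explained sum of squares from regressing u^2 on z,
    # sigma_hat^2 = sum(u^2)/n, and S0 / (2 * sigma_hat^4) ~ chi^2(r).
    aux = sm.OLS(u2, sm.add_constant(x)).fit()
    S0 = aux.ess
    sigma2 = u2.mean()
    bp_stat = S0 / (2 * sigma2**2)
    print(bp_stat, stats.chi2.sf(bp_stat, df=1))

    # The same idea via statsmodels (exog_het must include a constant column).
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(resid, sm.add_constant(x))
    print(lm_stat, lm_pvalue)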

Page 29:

5.2 Detection of Heteroskedasticity

Illustrative Example

• Consider the data in Table 5.1. To apply the Goldfeld-Quandt test we consider two groups of 10 observations each, ordered by the values of the variable x.

• The first group consists of observations 6, 11, 9, 4, 14, 15, 19, 20, 1, and 16.

• The second group consists of the remaining 10.

Page 30:

5.2 Detection of Heteroskedasticity

Illustrative Example

• The estimated equations were

Group 1: y = 1.0533 + 0.876x    R² = 0.985    σ̂² = 0.475
             (0.616)   (0.038)

Group 2: y = 3.279 + 0.835x    R² = 0.904    σ̂² = 3.154
             (3.443)  (0.096)

Page 31:

5.2 Detection of Heteroskedasticity

• The F-ratio for the test is

  F = 3.154 / 0.475 = 6.64

• The 1% point for the F-distribution with d.f. 8 and 8 is 6.03.

• Thus the F-value is significant at the 1% level and we reject the hypothesis of homoskedasticity.

Page 32:

5.2 Detection of Heteroskedasticity

Group 1: log y = 0.128 + 0.934x    R² = 0.992    σ̂² = 0.001596
                 (0.079)  (0.030)

Group 2: log y = 0.276 + 0.902x    R² = 0.912    σ̂² = 0.002789
                 (0.352)  (0.099)

• The F-ratio for the test is

  F = 0.002789 / 0.001596 = 1.75

Page 33:

5.2 Detection of Heteroskedasticity

• For d.f. 8 and 8, the 5% point from the F-tables is 3.44.

• Thus if we use the 5% significance level, we reject the hypothesis of homoskedasticity in the linear form but do not reject it in the log-linear form.

• Note that the White test rejected the hypothesis in both the forms.

Page 34:

5.3 Consequences of Heteroskedasticity

[The algebra on this slide is not reproduced in the transcript. As noted in the chapter overview, under heteroskedasticity the OLS estimators remain unbiased but are inefficient, and the usual standard errors are biased, which invalidates the tests of significance.]

Page 35:

5.4 Solutions to the Heteroskedasticity Problem

• There are two types of solutions that have been suggested in the literature for the problem of heteroskedasticity: 

– Solutions dependent on particular assumptions about σi.

– General solutions.

• We first discuss category 1: weighted least squares (WLS)

Page 36:

5.4 Solutions to the Heteroskedasticity Problem

• WLS: if var(u_i) = σ²z_i² for a known variable z_i, divide the equation y_i = α + βx_i + u_i throughout by z_i and estimate

  y_i / z_i = α(1/z_i) + β(x_i / z_i) + v_i

  by OLS. In particular, if z_i = x_i, this becomes

  y_i / x_i = α(1/x_i) + β + v_i

Page 37:

5.4 Solutions to the Heteroskedasticity Problem

y_i / x_i = β + α(1/x_i) + v_i

Thus the constant term in this equation is the slope coefficient in the original equation.
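A minimal sketch of this WLS step in Python/statsmodels (simulated data; assumes σ_i is proportional to x_i, so the weights are 1/x_i²). Dividing through by x_i and running OLS with no separate intercept gives the same estimates as calling WLS directly:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data with sigma_i proportional to x_i.
    rng = np.random.default_rng(7)
    x = np.linspace(5, 40, 20)
    y = 0.85 + 0.9 * x + rng.normal(0, 0.05 * x)

    # Option 1: regress y/x on a "constant" (whose coefficient is beta) and
    # 1/x (whose coefficient is alpha), with no additional intercept.
    Z = np.column_stack([np.ones_like(x), 1.0 / x])
    by_hand = sm.OLS(y / x, Z).fit()
    print(by_hand.params)       # [beta_hat, alpha_hat]

    # Option 2: statsmodels WLS with weights 1/x^2 (i.e., 1/sigma_i^2).
    wls = sm.WLS(y, sm.add_constant(x), weights=1.0 / x**2).fit()
    print(wls.params)           # [alpha_hat, beta_hat]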

Page 38:

5.4 Solutions to the Heteroskedasticity Problem

• Prais and Houthakker found in their analysis of family budget data that the errors from the equation had variance increasing with household income.

• They considered a model σ_i² = σ²[E(y_i)]², that is, σ_i² = σ²(α + βx_i)².

• In this case we cannot divide the whole equation by a known constant as before.

• For this model we can consider a two-step procedure as follows.

Page 39:

5.4 Solutions to the Heteroskedasticity Problem

• First estimate α and β by OLS.

• Let these estimators be α̂ and β̂.

• Now use the WLS procedure as outlined earlier, that is, regress y_i/(α̂ + β̂x_i) on 1/(α̂ + β̂x_i) and x_i/(α̂ + β̂x_i) with no constant term.

• The limitation of the two-step procedure: the error involved in the first step will affect the second step.
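A compact sketch of this two-step procedure under the Prais-Houthakker assumption σ_i = σ(α + βx_i), in Python/statsmodels with simulated data (names are illustrative):

    import numpy as np
    import statsmodels.api as sm

    # Simulated data with sigma_i proportional to (alpha + beta * x_i).
    rng = np.random.default_rng(8)
    x = np.linspace(5, 40, 50)
    mean = 0.85 + 0.9 * x
    y = mean + rng.normal(0, 0.05 * mean)

    # Step 1: OLS to get preliminary estimates alpha_hat, beta_hat.
    step1 = sm.OLS(y, sm.add_constant(x)).fit()
    a_hat, b_hat = step1.params
    w = a_hat + b_hat * x           # estimated sigma_i up to a scale factor

    # Step 2: regress y/w on 1/w and x/w with no constant term.
    Z = np.column_stack([1.0 / w, x / w])
    step2 = sm.OLS(y / w, Z).fit()
    print(step2.params)             # [alpha_hat_wls, beta_hat_wls]
    # These standard errors are only asymptotically valid because the
    # weights w were themselves estimated in step 1.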

Page 40:

5.4 Solutions to the Heteroskedasticity Problem

• This procedure is called a two-step weighted least squares procedure.

• The standard errors we get for the estimates of α and β from this procedure are valid only asymptotically.

• They are asymptotic standard errors because the weights 1/(α̂ + β̂x_i) have been estimated.

Page 41:

5.4 Solutions to the Heteroskedasticity Problem

• One can iterate this WLS procedure further, that is, use the new estimates of α and β to construct new weights and then use the WLS procedure again, repeating this procedure until convergence.

• This procedure is called the iterated weighted least squares procedure. However, there is no gain in (asymptotic) efficiency by iteration.

Page 42:

5.4 Solutions to the Heteroskedasticity Problem

• Illustrative Example

As an illustration, consider again the data in Table 5.1. We saw earlier that regressing the absolute values of the residuals on x (in Glejser's tests) gave the estimates 0.209 for the intercept and 0.0512 for the slope.

Now we regress y_i/w_i on 1/w_i and x_i/w_i (with no constant term), where w_i = 0.209 + 0.0512x_i.

Page 43:

5.4 Solutions to the Heteroskedasticity Problem

The resulting equation is

  y_i/w_i = 0.4843(1/w_i) + 0.9176(x_i/w_i)    R² = 0.9886
            (0.1643)        (0.0157)

If we assume that σ_i² = γ₀ + γ₁x_i + γ₂x_i², the two-step WLS procedure would be as follows.

Page 44:

5.4 Solutions to the Heteroskedasticity Problem

• Next we compute ŵ_i² = γ̂₀ + γ̂₁x_i + γ̂₂x_i² with γ̂₀ = 0.493, γ̂₁ = 0.071, γ̂₂ = 0.0037, and regress y_i/ŵ_i on 1/ŵ_i and x_i/ŵ_i. The results were

  y_i/ŵ_i = 0.7296(1/ŵ_i) + 0.9052(x_i/ŵ_i)    R² = 0.9982
            (0.3302)        (0.0199)

• The R²'s in these equations are not comparable. But our interest is in estimates of the parameters in the consumption function.

Page 45:

Assignment

• Use the data of Table 5.1 to do the WLS

• Consider the log-linear form

• Run Glejser's tests to check whether the log-linear regression model still has non-constant variance

• Estimate the non-constant variance and run the WLS

• Write a one-step program in GAUSS

Page 46:

5.5 Heteroskedasticity and the Use of Deflators

• There are two remedies often suggested and used for solving the heteroskedasticity problem: 

– Transforming the data to logs

– Deflating the variables by some measure of "size."

Page 47:

5.5 Heteroskedasticity and the Use of Deflators

[Three estimated equations relating C, X, and M (including forms deflated by M); the coefficients and standard errors are garbled in the transcript. The reported R² values are 0.365, 0.614, and 0.945.]

Page 48:

5.5 Heteroskedasticity and the Use of Deflators

[Two further estimated equations (the fourth and fifth of the five referred to below), one undeflated and one deflated by M; coefficients and standard errors are garbled in the transcript. The reported R² values are 0.944 and 0.826.]

Page 49:

5.5 Heteroskedasticity and the Use of Deflators

• One important thing to note is that the purpose of all these deflation procedures is to get more efficient estimates of the parameters.

• But once those estimates have been obtained, one should make all inferences—calculation of the residuals, prediction of future values, etc., from the original equation—not the equation in the deflated variables.

Page 50:

5.5 Heteroskedasticity and the Use of Deflators

• Another point to note is that since the purpose of deflation is to get more efficient estimates, it is tempting to argue about the merits of the different procedures by looking at the standard errors of the coefficients.

• However, this is not correct, because in the presence of heteroskedasticity the standard errors themselves are biased, as we showed earlier.

Page 51:

5.5 Heteroskedasticity and the Use of Deflators

• For instance, in the five equations presented above, the second and third are comparable and so are the fourth and fifth.

• In both cases if we look at the standard errors of the coefficient of X, the coefficient in the undeflated equation has a smaller standard error than the corresponding coefficient in the deflated equation.

• However, if the standard errors are biased, we have to be careful in making too much of these differences.

Page 52:

5.5 Heteroskedasticity and the Use of Deflators

• In the preceding example we have considered

miles M as a deflator and also as an explanatory

variable.

• In this context we should mention some

discussion in the literature on "spurious

correlation" between ratios.

Page 53:

5.5 Heteroskedasticity and the Use of Deflators

• The argument simply is that even if we have two

variables X and Y that are uncorrelated, if we

deflate both the variables by another variable Z,

there could be a strong correlation between X/Z

and Y/Z because of the common denominator

Z .

• It is wrong to infer from this correlation that there

exists a close relationship between X and Y.

Page 54:

5.5 Heteroskedasticity and the Use of Deflators

• Of course, if our interest is in fact the relationship between X/Z and Y/Z, there is no reason why this correlation need be called "spurious."

• As Kuh and Meyer point out, "The question of spurious correlation quite obviously does not arise when the hypothesis to be tested has initially been formulated in terms of ratios, for instance, in problems involving relative prices.

Page 55:

5.5 Heteroskedasticity and the Use of Deflators

• Similarly, when a series such as money value of output is divided by a price index to obtain a 'constant dollar' estimate of output, no question of spurious correlation need arise.

• Thus, spurious correlation can only exist when a hypothesis pertains to undeflated variables and the data have been divided through by another series for reasons extraneous to, but not in conflict with, the hypothesis framed as an exact, i.e., nonstochastic, relation."

Page 56:

5.5 Heteroskedasticity and the Use of Deflators

• In summary, often in econometric work deflated or ratio variables are used to solve the heteroskedasticity problem

• Deflation can sometimes be justified on pure economic grounds, as in the case of the use of "real" quantities and relative prices

• In this case all the inferences from the estimated equation will be based on the equation in the deflated variables.

Page 57:

5.5 Heteroskedasticity and the Use of Deflators

• However, if deflation is used to solve the heteroskedasticity problem, any inferences we make have to be based on the original equation, not the equation in the deflated variables

• In any case, deflation may increase or decrease the resulting correlations, but this is beside the point. Since the correlations are not comparable anyway, one should not draw any inferences from them.

Page 58:

5.5 Heteroskedasticity and the Use of Deflators

• Illustrative Example

In Table 5.5 we present data on

y = population density

x = distance from the central business district

for 39 census tracts in the Baltimore area in 1970. It has been suggested (this is called the "density gradient model") that population density follows the relationship

  y = A e^(γx),   γ < 0

where A is the density of the central business district.

Page 59:

5.5 Heteroskedasticity and the Use of Deflators

• The basic hypothesis is that as you move away from the central business district population density drops off.

• For estimation purposes we take logs and write

log y = log A + γx

Page 60:

5.5 Heteroskedasticity and the Use of Deflators

• We write this as y* = α + γx + u, where y* = log y and α = log A.

• Estimation of this equation by OLS gave the following results (figures in parentheses are t-values, not standard errors):

  ŷ* = 10.093 - 0.2395x    R² = 0.803
       (54.7)   (12.28)

Page 61:

5.5 Heteroskedasticity and the Use of Deflators

• The t-values are very high and the coefficients are significantly different from zero (with a significance level of less than 1%). The sign of γ̂ is negative, as expected.

• With cross-sectional data like these we expect heteroskedasticity, and this could result in an underestimation of the standard errors (and thus an overestimation of the t-ratios).

Page 62:

5.5 Heteroskedasticity and the Use of Deflators

• To check whether there is heteroskedasticity, we have to analyze the estimated residuals û_i.

• A plot of û_i² against x_i showed a positive relationship, and hence Glejser's tests were applied.

Page 63:

5.5 Heteroskedasticity and the Use of Deflators

• Defining z_i by z_i = |û_i|, the following equations were estimated:

  z_i = βx_i + v_i
  z_i = β√x_i + v_i
  z_i = β(1/x_i) + v_i
  z_i = β(1/√x_i) + v_i

Page 64:

5.5 Heteroskedasticity and the Use of Deflators

• We choose the specification that gives the highest R² (or, equivalently, the highest t-value, since R² = t²/(t² + d.f.) in the case of only one regressor).

Page 65:

5.5 Heteroskedasticity and the Use of Deflators

• The estimated regressions with t-values in parentheses were

  ẑ_i = 0.0445 x_i         (5.06)
  ẑ_i = 0.1733 √x_i        (6.42)
  ẑ_i = 1.390 (1/x_i)      (4.50)
  ẑ_i = 1.038 (1/√x_i)     (6.42)

Page 66:

5.5 Heteroskedasticity and the Use of Deflators

• All the t-statistics are significant, indicating the presence of heteroskedasticity.

• Based on the highest t-ratio, we chose the second specification (although the fourth specification is equally valid).

Page 67:

5.5 Heteroskedasticity and the Use of Deflators

• Deflating throughout by √x_i gives the regression equation to be estimated as

  y_i*/√x_i = α(1/√x_i) + γ√x_i + error

• The estimates were (figures in parentheses are t-ratios)

  α̂ = 9.932    and    γ̂ = -0.2258
      (47.87)              (15.10)
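A sketch of this deflation step in Python/statsmodels, using a simulated stand-in for the Table 5.5 data (names are illustrative). Dividing the log-density equation through by √x and estimating with no constant is equivalent to WLS with weights 1/x:

    import numpy as np
    import statsmodels.api as sm

    # Simulated stand-in for Table 5.5: log density y* falls with distance x,
    # with error variance growing roughly in proportion to x.
    rng = np.random.default_rng(9)
    x = rng.uniform(1, 20, 39)
    ystar = 10.0 - 0.24 * x + rng.normal(0, 0.15 * np.sqrt(x))

    # Deflate by sqrt(x): regress y*/sqrt(x) on 1/sqrt(x) and sqrt(x), no constant.
    s = np.sqrt(x)
    Z = np.column_stack([1.0 / s, s])
    deflated = sm.OLS(ystar / s, Z).fit()
    print(deflated.params)      # [alpha_hat, gamma_hat]
    print(deflated.tvalues)     # t-ratios in the deflated equation

    # Equivalent WLS: weights 1/x (i.e., 1/sigma_i^2 with sigma_i prop. to sqrt(x)).
    wls = sm.WLS(ystar, sm.add_constant(x), weights=1.0 / x).fit()
    print(wls.params)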