Page 1: Chapter 11

Multiple Regression ©

Page 2: Multiple Regression Model

Multiple regression enables us to determine the simultaneous effect of several independent variables on a dependent variable using the least squares principle.

Y = f(X1, X2, . . ., XK)

Page 3: Multiple Regression Objectives

Multiple regression provides two important results:

1. A linear equation that predicts the dependent variable, Y, as a function of K independent variables, xji, j = 1, . . ., K.

2. The marginal change in the dependent variable, Y, associated with a change in an independent variable, measured by the partial coefficients, the bj's. In multiple regression these partial coefficients depend on what other variables are included in the model. The coefficient bj indicates the change in Y given a unit change in xj while controlling for the simultaneous effect of the other independent variables. (In some problems both results are equally important; usually, however, one will predominate.)

ŷi = b0 + b1x1i + b2x2i + . . . + bKxKi

Page 4: Multiple Regression Model (Example 11.1)

Year  Revenue  Number of Offices  Profit Margin
 1    3.92     7298               0.75
 2    3.61     6855               0.71
 3    3.32     6636               0.66
 4    3.07     6506               0.61
 5    3.06     6450               0.70
 6    3.11     6402               0.72
 7    3.21     6368               0.77
 8    3.26     6340               0.74
 9    3.42     6349               0.90
10    3.42     6352               0.82
11    3.45     6361               0.75
12    3.58     6369               0.77
13    3.66     6546               0.78
14    3.78     6672               0.84
15    3.82     6890               0.79
16    3.97     7115               0.70
17    4.07     7327               0.68
18    4.25     7546               0.72
19    4.41     7931               0.55
20    4.49     8097               0.63
21    4.70     8468               0.56
22    4.58     8717               0.41
23    4.69     8991               0.51
24    4.71     9179               0.47
25    4.78     9318               0.32
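The regression reported on the later slides can be reproduced directly from this table. A minimal sketch using NumPy's least squares solver (variable names are ours, not the text's):

```python
import numpy as np

# Example 11.1 data: 25 yearly observations of revenue, number of offices,
# and profit margin, entered column by column from the table above.
revenue = np.array([3.92, 3.61, 3.32, 3.07, 3.06, 3.11, 3.21, 3.26, 3.42, 3.42,
                    3.45, 3.58, 3.66, 3.78, 3.82, 3.97, 4.07, 4.25, 4.41, 4.49,
                    4.70, 4.58, 4.69, 4.71, 4.78])
offices = np.array([7298, 6855, 6636, 6506, 6450, 6402, 6368, 6340, 6349, 6352,
                    6361, 6369, 6546, 6672, 6890, 7115, 7327, 7546, 7931, 8097,
                    8468, 8717, 8991, 9179, 9318], dtype=float)
margin = np.array([0.75, 0.71, 0.66, 0.61, 0.70, 0.72, 0.77, 0.74, 0.90, 0.82,
                   0.75, 0.77, 0.78, 0.84, 0.79, 0.70, 0.68, 0.72, 0.55, 0.63,
                   0.56, 0.41, 0.51, 0.47, 0.32])

# Design matrix with a leading column of ones for the intercept b0
X = np.column_stack([np.ones(len(margin)), revenue, offices])
b, *_ = np.linalg.lstsq(X, margin, rcond=None)   # (b0, b1, b2)

y_hat = X @ b
sse = np.sum((margin - y_hat) ** 2)
sst = np.sum((margin - margin.mean()) ** 2)
r2 = 1 - sse / sst
```

The coefficients and R² recovered here match the regression output shown on the later slides.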

Page 5: Multiple Regression Model

POPULATION MULTIPLE REGRESSION MODEL

The population multiple regression model defines the relationship between a dependent or endogenous variable, Y, and a set of independent or exogenous variables, xj, j = 1, . . ., K. The xji's are assumed to be fixed numbers and Y is a random variable, defined for each observation, i, where i = 1, . . ., n and n is the number of observations. The model is defined as

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

where the βj's are constant coefficients and the εi's are random variables with mean 0 and variance σ².

Page 6: Standard Multiple Regression Assumptions

The population multiple regression model is

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

and we assume that n sets of observations are available. The following standard assumptions are made for the model.

1. The x's are fixed numbers, or they are realizations of random variables, Xji, that are independent of the error terms, εi. In the latter case, inference is carried out conditionally on the observed values of the xji's.

2. The error terms are random variables with mean 0 and the same variance, σ². The latter property is called homoscedasticity, or uniform variance:

E[εi] = 0 and E[εi²] = σ² for i = 1, . . ., n

Page 7: Standard Multiple Regression Assumptions (continued)

3. The random error terms, εi, are not correlated with one another, so that

E[εiεj] = 0 for all i ≠ j

4. It is not possible to find a set of numbers, c0, c1, . . ., cK, not all zero, such that

c0 + c1x1i + c2x2i + . . . + cKxKi = 0

This is the property of no linear relation among the Xj's.
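Assumption 4 can be checked numerically: the design matrix [1, x1, . . ., xK] must have full column rank, otherwise the least squares coefficients are not uniquely determined. A small illustration on made-up data (not from the text):

```python
import numpy as np

# Simulated regressors; x3 is an exact linear combination of x1 and x2,
# which violates assumption 4 (no linear relation among the x's).
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)
x3 = 2.0 * x1 - x2

X_ok = np.column_stack([np.ones(50), x1, x2])
X_bad = np.column_stack([np.ones(50), x1, x2, x3])

rank_ok = np.linalg.matrix_rank(X_ok)    # full rank: unique least squares fit
rank_bad = np.linalg.matrix_rank(X_bad)  # rank-deficient: coefficients not identifiable
```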

Page 8: Least Squares Estimation and the Sample Multiple Regression

We begin with a sample of n observations denoted as (x1i, x2i, . . ., xKi, yi), i = 1, . . ., n, measured for a process whose population multiple regression model is

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

The least squares procedure obtains the estimates b0, b1, . . ., bK of the coefficients β0, β1, . . ., βK as the values for which the sum of the squared deviations

SSE = Σi=1…n (yi − b0 − b1x1i − b2x2i − . . . − bKxKi)²

is a minimum. The resulting equation

ŷ = b0 + b1x1i + b2x2i + . . . + bKxKi

is the sample multiple regression of Y on X1, X2, . . ., XK.
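The minimizing coefficients solve the normal equations (XᵀX)b = Xᵀy, where X carries a leading column of ones. A minimal sketch on simulated data (the data and names are illustrative, not from the text):

```python
import numpy as np

# Simulate a process with known coefficients (2.0, 1.5, -0.5) plus small noise
rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.5 * x2 + rng.normal(scale=0.1, size=n)

# Solve the normal equations (X'X) b = X'y for the least squares estimates
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)    # (b0, b1, b2)

sse = np.sum((y - X @ b) ** 2)           # minimized sum of squared deviations
```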

Page 9: Multiple Regression Analysis for Profit Margin (Using Example 11.1)

The regression equation is:

Y Profit Margin = 1.56 + 0.237 X1 Revenue − 0.00025 X2 Number of Offices

Regression Statistics
Multiple R          0.930212915
R Square            0.865296068
Adjusted R Square   0.853050256
Standard Error      0.053302217
Observations        25

ANOVA
            df    SS          MS           F            Significance F
Regression   2    0.40151122  0.20075561   70.66057082  2.64962E-10
Residual    22    0.06250478  0.002841126
Total       24    0.464016

                         Coefficients   Standard Error  t Stat        P-value
Intercept (b0)           1.564496771    0.079395981     19.70498685   1.81733E-15
Revenue (b1)             0.237197475    0.055559366     4.269261695   0.000312567
Number of Offices (b2)  -0.000249079    3.20485E-05    -7.771949195   9.50879E-08

Page 10: Sum of Squares Decomposition and the Coefficient of Determination

Given the multiple regression model fitted by least squares

yi = b0 + b1x1i + b2x2i + . . . + bKxKi + ei = ŷi + ei

where the bj's are the least squares estimates of the coefficients of the population regression model and the ei's are the residuals from the estimated regression model, the model variability can be partitioned into the components

SST = SSR + SSE

where the Total Sum of Squares is

SST = Σi=1…n (yi − ȳ)² = Σi=1…n (ŷi − ȳ)² + Σi=1…n (yi − ŷi)²
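The decomposition above can be verified numerically; for a least squares fit that includes an intercept, SST = SSR + SSE holds exactly. A sketch on simulated data (the data are illustrative):

```python
import numpy as np

# Simulated regression data
rng = np.random.default_rng(2)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 + 0.3 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

sst = np.sum((y - y.mean()) ** 2)        # total variability
ssr = np.sum((y_hat - y.mean()) ** 2)    # explained variability
sse = np.sum((y - y_hat) ** 2)           # unexplained variability
r2 = ssr / sst
```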

Page 11: Sum of Squares Decomposition and the Coefficient of Determination (continued)

Error Sum of Squares:

SSE = Σi=1…n (yi − ŷi)² = Σi=1…n ei²

Regression Sum of Squares:

SSR = Σi=1…n (ŷi − ȳ)²

This decomposition can be interpreted as

Total sample variability = Explained variability + Unexplained variability

Page 12: Sum of Squares Decomposition and the Coefficient of Determination (continued)

The coefficient of determination, R², of the fitted regression is defined as the proportion of the total sample variability explained by the regression,

R² = SSR / SST = 1 − SSE / SST

and it follows that

0 ≤ R² ≤ 1

Page 13: Estimation of Error Variance

Given the population regression model

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

and the standard regression assumptions, let σ² denote the common variance of the error terms, εi. Then an unbiased estimate of that variance is

se² = SSE / (n − K − 1) = Σi=1…n ei² / (n − K − 1)

The square root of the variance, se, is also called the standard error of the estimate.

Page 14: Multiple Regression Analysis for Profit Margin (Using Example 11.1)

This slide repeats the Example 11.1 regression output from page 9, highlighting the quantities just defined: the standard error of the estimate, se = 0.0533 ("Standard Error"); SSR = 0.40151 and SSE = 0.06250 in the ANOVA table; and R² = 0.8653 ("R Square").

Page 15: Adjusted Coefficient of Determination

The adjusted coefficient of determination, R̄², is defined as

R̄² = 1 − [SSE / (n − K − 1)] / [SST / (n − 1)]

We use this measure to correct for the fact that even non-relevant independent variables will produce some small reduction in the error sum of squares. The adjusted R² therefore provides a better comparison between multiple regression models with different numbers of independent variables.
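The correction can be seen numerically: adding a pure-noise regressor never lowers R² but can lower the adjusted R², which penalizes the lost degree of freedom. A sketch on simulated data (illustrative, not from the text):

```python
import numpy as np

def fit_r2(X, y):
    """Return (R^2, adjusted R^2) for a least squares fit of y on X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((y - X @ b) ** 2)
    sst = np.sum((y - y.mean()) ** 2)
    n, k_plus_1 = X.shape                      # k_plus_1 = K + 1 columns
    r2 = 1 - sse / sst
    adj_r2 = 1 - (sse / (n - k_plus_1)) / (sst / (n - 1))
    return r2, adj_r2

rng = np.random.default_rng(4)
n = 50
x1 = rng.normal(size=n)
noise_var = rng.normal(size=n)                 # unrelated to y
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, noise_var])
r2_small, adj_small = fit_r2(X_small, y)
r2_big, adj_big = fit_r2(X_big, y)             # r2 can only rise; adj r2 may fall
```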

Page 16: Coefficient of Multiple Correlation

The coefficient of multiple correlation, R, is the correlation between the predicted value and the observed value of the dependent variable,

R = Corr(ŷ, Y) = √R²

and is equal to the square root of the coefficient of determination. We use R as another measure of the strength of the linear relationship between the dependent variable and the independent variables. Thus it is comparable to the correlation between Y and X in simple regression.

Page 17: Basis for Inference About the Population Regression Parameters

Let the population regression model be

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

Let b0, b1, . . ., bK be the least squares estimates of the population parameters and sb0, sb1, . . ., sbK be the estimated standard deviations of the least squares estimators. Then, if the standard regression assumptions hold and the error terms εi are normally distributed, the random variables corresponding to

tbj = (bj − βj) / sbj   (j = 1, 2, . . ., K)

are distributed as Student's t with (n − K − 1) degrees of freedom.
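The estimated standard deviations sbj are the square roots of the diagonal of se²(XᵀX)⁻¹, and tbj = bj / sbj tests H0: βj = 0. A sketch on simulated data (illustrative; x2 is deliberately irrelevant):

```python
import numpy as np

rng = np.random.default_rng(5)
n, K = 80, 2
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.9 * x1 + 0.0 * x2 + rng.normal(scale=0.8, size=n)

X = np.column_stack([np.ones(n), x1, x2])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
sse = np.sum((y - X @ b) ** 2)
s2_e = sse / (n - K - 1)                 # unbiased error-variance estimate

s_b = np.sqrt(s2_e * np.diag(XtX_inv))   # estimated std. deviations of b0, b1, b2
t_stats = b / s_b                        # t ratios for H0: beta_j = 0
```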

Page 18: Confidence Intervals for Partial Regression Coefficients

If the regression errors εi are normally distributed and the standard regression assumptions hold, the 100(1 − α)% confidence intervals for the partial regression coefficients βj are given by

bj − t(n−K−1, α/2) sbj < βj < bj + t(n−K−1, α/2) sbj

where t(n−K−1, α/2) is the number for which

P(t(n−K−1) > t(n−K−1, α/2)) = α/2

and the random variable t(n−K−1) follows a Student's t distribution with (n − K − 1) degrees of freedom.
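The interval above can be sketched with SciPy's t distribution, using numbers in the spirit of the Example 11.1 revenue slope (b1 = 0.2372, sb1 = 0.0556, n = 25, K = 2):

```python
from scipy import stats

# 95% confidence interval: b_j +/- t(n-K-1, alpha/2) * s_bj
n, K, alpha = 25, 2, 0.05
b_j, s_bj = 0.2372, 0.0556

t_crit = stats.t.ppf(1 - alpha / 2, df=n - K - 1)   # upper alpha/2 point, 22 df

lower = b_j - t_crit * s_bj
upper = b_j + t_crit * s_bj   # interval excludes 0: revenue slope is significant
```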

Page 19: Multiple Regression Analysis for Profit Margin (Using Example 11.1)

This slide repeats the Example 11.1 regression output from page 9, highlighting the coefficient estimates b1 = 0.2372 and b2 = −0.000249 and their t statistics, tb1 = 4.269 and tb2 = −7.772, which feed the confidence intervals and hypothesis tests that follow.

Page 20: Tests of Hypotheses for the Partial Regression Coefficients

If the regression errors εi are normally distributed and the standard least squares assumptions hold, the following tests have significance level α:

1. To test either null hypothesis

H0: βj = βj*  or  H0: βj ≤ βj*

against the alternative

H1: βj > βj*

the decision rule is

Reject H0 if (bj − βj*) / sbj > t(n−K−1, α)

Page 21: Tests of Hypotheses for the Partial Regression Coefficients (continued)

2. To test either null hypothesis

H0: βj = βj*  or  H0: βj ≥ βj*

against the alternative

H1: βj < βj*

the decision rule is

Reject H0 if (bj − βj*) / sbj < −t(n−K−1, α)

Page 22: Tests of Hypotheses for the Partial Regression Coefficients (continued)

3. To test the null hypothesis

H0: βj = βj*

against the two-sided alternative

H1: βj ≠ βj*

the decision rule is

Reject H0 if (bj − βj*) / sbj < −t(n−K−1, α/2) or (bj − βj*) / sbj > t(n−K−1, α/2)
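Decision rule 3 can be sketched with the Number of Offices slope from Example 11.1 (b2 = −0.000249079, sb2 = 3.20485E-05, n = 25, K = 2), testing H0: β2 = 0:

```python
from scipy import stats

n, K, alpha = 25, 2, 0.05
b2, s_b2 = -0.000249079, 3.20485e-05

t_stat = b2 / s_b2                               # about -7.77
t_crit = stats.t.ppf(1 - alpha / 2, df=n - K - 1)

reject_h0 = abs(t_stat) > t_crit                 # |t| far exceeds the critical value
p_value = 2 * stats.t.sf(abs(t_stat), df=n - K - 1)
```

The resulting p-value matches the 9.5E-08 shown in the regression output.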

Page 23: Test on All the Parameters of a Regression Model

Consider the multiple regression model

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi

To test the null hypothesis

H0: β1 = β2 = . . . = βK = 0

against the alternative hypothesis

H1: at least one of the βj ≠ 0

at a significance level α, we can use the decision rule

Reject H0 if F = (SSR / K) / se² > F(K, n−K−1, α)

where F(K, n−K−1, α) is the critical value of F from Table 7 in the appendix, for which

P(F(K, n−K−1) > F(K, n−K−1, α)) = α

The computed F follows an F distribution with numerator degrees of freedom K and denominator degrees of freedom (n − K − 1).
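The overall F test can be sketched with the ANOVA numbers from Example 11.1 (SSR = 0.40151122, SSE = 0.06250478, K = 2, n = 25):

```python
from scipy import stats

n, K, alpha = 25, 2, 0.05
ssr, sse = 0.40151122, 0.06250478

s2_e = sse / (n - K - 1)                 # MSE, about 0.002841
f_stat = (ssr / K) / s2_e                # about 70.66, as in the ANOVA table

f_crit = stats.f.ppf(1 - alpha, dfn=K, dfd=n - K - 1)
reject_h0 = f_stat > f_crit              # regression is jointly significant
```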

Page 24: Test on a Subset of the Regression Parameters

Consider the multiple regression model

Yi = β0 + β1x1i + . . . + βKxKi + α1z1i + . . . + αrzri + εi

To test the null hypothesis

H0: α1 = α2 = . . . = αr = 0

that a subset of r regression parameters are simultaneously equal to 0, against the alternative hypothesis

H1: at least one of the αj ≠ 0 (j = 1, . . ., r)

Page 25: Test on a Subset of the Regression Parameters (continued)

We compare the error sum of squares for the complete model with the error sum of squares for the restricted model. First run a regression for the complete model, which includes all the independent variables, and obtain SSE. Next run a restricted regression that excludes the Z variables, whose coefficients are the α's; the number of variables excluded is r. From this regression obtain the restricted error sum of squares SSE(r). Then compute the F statistic and apply the decision rule for significance level α:

Reject H0 if F = [(SSE(r) − SSE) / r] / se² > F(r, n−K−r−1, α)
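The complete-versus-restricted comparison can be sketched on simulated data (illustrative; here one of the two Z variables genuinely matters, so H0 should be rejected):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, K, r = 60, 1, 2
x1 = rng.normal(size=n)
z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
y = 1.0 + 1.2 * x1 + 0.9 * z1 + rng.normal(scale=0.5, size=n)

def sse_of(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ b) ** 2)

ones = np.ones(n)
sse_full = sse_of(np.column_stack([ones, x1, z1, z2]), y)        # complete model
sse_restricted = sse_of(np.column_stack([ones, x1]), y)          # Z's excluded

s2_e = sse_full / (n - K - r - 1)
f_stat = ((sse_restricted - sse_full) / r) / s2_e
f_crit = stats.f.ppf(0.95, dfn=r, dfd=n - K - r - 1)
reject_h0 = f_stat > f_crit
```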

Page 26: Predictions from the Multiple Regression Model

Given that the population regression model

Yi = β0 + β1x1i + β2x2i + . . . + βKxKi + εi   (i = 1, 2, . . ., n)

holds and that the standard regression assumptions are valid, let b0, b1, . . ., bK be the least squares estimates of the model coefficients βj, j = 1, 2, . . ., K, based on the data points (x1i, x2i, . . ., xKi, yi), i = 1, 2, . . ., n. Then, given a new observation x1,n+1, x2,n+1, . . ., xK,n+1, the best linear unbiased forecast of Yn+1 is

ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + . . . + bKxK,n+1

It is very risky to obtain forecasts based on x values outside the range of the data used to estimate the model coefficients, because we have no data evidence to support the linear model at those points.
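A forecast sketch using the Example 11.1 fitted coefficients; the new (revenue, offices) point is hypothetical but kept inside the observed data ranges, as the caution above requires:

```python
# Fitted coefficients from the Example 11.1 regression output
b0, b1, b2 = 1.564496771, 0.237197475, -0.000249079

def predict_margin(revenue, offices):
    """Best linear unbiased forecast y-hat = b0 + b1*x1 + b2*x2."""
    return b0 + b1 * revenue + b2 * offices

# Hypothetical new point, within observed ranges (3.06-4.78, 6340-9318)
y_hat = predict_margin(4.00, 7000)
```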

Page 27: Quadratic Model Transformations

The quadratic function

Y = β0 + β1X1 + β2X1² + ε

can be transformed into a linear multiple regression model by defining new variables

z1 = x1
z2 = x1²

and then specifying the model as

Yi = β0 + β1z1i + β2z2i + εi

which is linear in the transformed variables. Transformed quadratic variables can be combined with other variables in a multiple regression model. Thus we could fit a multiple quadratic regression using transformed variables.
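The transformation can be sketched on simulated data: regressing y on z1 = x and z2 = x² recovers the quadratic's coefficients by ordinary least squares (data and seed are illustrative):

```python
import numpy as np

# Simulate a quadratic relationship y = 1 + 0.5 x + 2 x^2 plus noise
rng = np.random.default_rng(7)
x = rng.uniform(-2, 2, size=80)
y = 1.0 + 0.5 * x + 2.0 * x**2 + rng.normal(scale=0.2, size=80)

# Transformed design matrix: intercept, z1 = x, z2 = x^2
Z = np.column_stack([np.ones_like(x), x, x**2])
b, *_ = np.linalg.lstsq(Z, y, rcond=None)    # (b0, b1, b2)
```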

Page 28: Exponential Model Transformations

Coefficients for exponential models of the form

Y = β0 X1^β1 X2^β2 ε

can be estimated by first taking the logarithm of both sides, which yields an equation that is linear in the logarithms of the variables:

log(Y) = log(β0) + β1 log(X1) + β2 log(X2) + log(ε)

Using this form we can regress the logarithm of Y on the logarithms of the two X variables and obtain estimates of the coefficients β1 and β2 directly from the regression analysis. Note that this estimation procedure requires that the random errors be multiplicative in the original exponential model. Thus the error term, ε, is expressed as a percentage increase or decrease instead of the addition or subtraction of a random error, as in linear regression models.
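A log-log estimation sketch on simulated data with multiplicative errors (the true exponents 1.5 and −0.5 and the seed are illustrative):

```python
import numpy as np

# Simulate Y = 3 * X1^1.5 * X2^-0.5 * eps with multiplicative error eps
rng = np.random.default_rng(8)
n = 100
x1 = rng.uniform(1, 10, size=n)
x2 = rng.uniform(1, 10, size=n)
eps = np.exp(rng.normal(scale=0.05, size=n))   # becomes additive in logs
y = 3.0 * x1**1.5 * x2**-0.5 * eps

# Regress log(y) on log(x1), log(x2)
L = np.column_stack([np.ones(n), np.log(x1), np.log(x2)])
b, *_ = np.linalg.lstsq(L, np.log(y), rcond=None)
beta1, beta2 = b[1], b[2]      # exponent estimates, read directly from the fit
beta0 = np.exp(b[0])           # scale constant, back-transformed from log(beta0)
```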

Page 29: Dummy Variable Regression Analysis

The relationship between Y and X1,

Y = β0 + β1X1 + ε

can shift in response to a changed condition. The shift effect can be estimated by using a dummy variable, which takes the value 0 (condition not present) or 1 (condition present). All of the observations from one set of data have dummy variable x2 = 1, and the observations from the other set have x2 = 0. In these cases the relationship between Y and X1 is specified by the regression model

ŷ = b0 + b2x2 + b1x1

Page 30: Dummy Variable Regression Analysis (continued)

The fitted functions for the two sets of points are

Ŷ = b0 + b1x1   when x2 = 0

and

Ŷ = (b0 + b2) + b1x1   when x2 = 1

In the first function the constant is b0, while in the second the constant is b0 + b2. Dummy variables are also called indicator variables.

Page 31: Dummy Variable Regression for Differences in Slope

To determine whether there are significant differences in slope between two discrete conditions, we expand the regression model to the more complex form

ŷ = b0 + b2x2 + (b1 + b3x2)x1

Now the slope coefficient of x1 contains two components, b1 and b3x2. When x2 equals 0, the slope estimate is the usual b1; when x2 equals 1, the slope is the algebraic sum b1 + b3. To estimate the model we multiply the variables to create a new transformed variable so that the model is linear. The model actually used for the estimation is therefore

ŷ = b0 + b2x2 + b1x1 + b3x1x2
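The dummy-plus-interaction model can be sketched on simulated two-group data (the group effects 0.5 and 1.0 are illustrative): the interaction column x1·x2 lets the two groups carry different slopes.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100
x1 = rng.uniform(0, 5, size=n)
x2 = (rng.random(n) < 0.5).astype(float)     # dummy: group indicator, 0 or 1

# Group 0: y = 1 + 2*x1; group 1: intercept shifted by 0.5, slope by 1.0
y = 1.0 + 0.5 * x2 + (2.0 + 1.0 * x2) * x1 + rng.normal(scale=0.3, size=n)

# Columns: intercept, dummy x2, x1, interaction x1*x2 -> b = (b0, b2, b1, b3)
X = np.column_stack([np.ones(n), x2, x1, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

slope_group0 = b[2]          # slope when x2 = 0
slope_group1 = b[2] + b[3]   # slope when x2 = 1; b[3] estimates the difference
```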

Page 32: Dummy Variable Regression for Differences in Slope (continued)

The resulting regression model is now linear with three variables. The new variable x1x2 is often called an interaction variable. Note that when the dummy variable x2 = 0 this variable has the value 0, but when x2 = 1 it has the value x1. The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1 compared with x2 = 0. Thus the t statistic for b3 can be used to test the hypotheses

H0: β3 = 0 | β1 ≠ 0, β2 ≠ 0
H1: β3 ≠ 0 | β1 ≠ 0, β2 ≠ 0

If we reject the null hypothesis, we conclude that there is a difference in the slope coefficient for the two subgroups. In many cases we will be interested in both the difference in the constant and the difference in the slope, and will test both of the hypotheses presented in this section.

Page 33: Key Words

Adjusted Coefficient of Determination

Basis for Inference About the Population Regression Parameters

Coefficient of Multiple Determination

Confidence Intervals for Partial Regression Coefficients

Dummy Variable Regression Analysis

Dummy Variable Regression for Differences in Slope

Estimation of Error Variance

Least Squares Estimation and the Sample Multiple Regression

Prediction from Multiple Regression Models

Quadratic Model Transformations

Page 34: Key Words (continued)

Regression Objectives

Standard Error of the Estimate

Standard Multiple Regression Assumptions

Sum of Squares Decomposition and the Coefficient of Determination

Test on a Subset of the Regression Parameters

Test on All the Parameters of a Regression Model

Tests of Hypotheses for the Partial Regression Coefficients

The Population Multiple Regression Model