Logistic Regression Analysis Gerrit Rooks 30-03-10

TRANSCRIPT

Page 1: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistic Regression Analysis

Gerrit Rooks30-03-10

Page 2: Logistic Regression Analysis Gerrit Rooks 30-03-10

This lecture
1. Why do we have to know and sometimes use logistic regression?
2. What is the model? What is maximum likelihood estimation?
3. Logistics of logistic regression analysis
   1. Estimate coefficients
   2. Assess model fit
   3. Interpret coefficients
   4. Check residuals
4. An SPSS example

Page 3: Logistic Regression Analysis Gerrit Rooks 30-03-10

Suppose we have 100 observations with information about an individual's age and whether or not this individual had some kind of heart disease (CHD).

ID   age   CHD
1    20    0
2    23    0
3    24    0
4    25    1
…
98   64    0
99   65    1
100  69    1

Page 4: Logistic Regression Analysis Gerrit Rooks 30-03-10

A graphic representation of the data

Page 5: Logistic Regression Analysis Gerrit Rooks 30-03-10

Suppose that, as a researcher, I am interested in the relation between age and the probability of CHD

Page 6: Logistic Regression Analysis Gerrit Rooks 30-03-10

To try to predict the probability of CHD, I can regress CHD on Age

pr(CHD|age) = -.54 +.0218107*Age

Page 7: Logistic Regression Analysis Gerrit Rooks 30-03-10

However, linear regression is not a suitable model for probabilities.

pr(CHD|age) = -.54 +.0218107*Age

Page 8: Logistic Regression Analysis Gerrit Rooks 30-03-10

In this graph for 8 age groups, I plotted the probability of having a heart disease (proportion)

Page 9: Logistic Regression Analysis Gerrit Rooks 30-03-10

Instead of a linear probability model, I need a non-linear one

Page 10: Logistic Regression Analysis Gerrit Rooks 30-03-10

Something like this

Page 11: Logistic Regression Analysis Gerrit Rooks 30-03-10

This is the logistic regression model

$$\Pr(Y=1 \mid X_1) = \frac{1}{1 + e^{-(b_0 + b_1 X_1)}}$$

Page 12: Logistic Regression Analysis Gerrit Rooks 30-03-10

Predicted probabilities are always between 0 and 1

$$\Pr(Y=1 \mid X_1) = \frac{1}{1 + e^{-(b_0 + b_1 X_1)}}$$

The linear part b_0 + b_1 X_1 is similar to classical regression analysis.
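A minimal Python sketch of this model (the function name and the coefficient values are mine, purely for illustration):

import math

def logistic_probability(x, b0, b1):
    """Pr(Y = 1 | X = x) under the logistic regression model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# For any finite b0 and b1 the result is strictly between 0 and 1.
print(logistic_probability(40, b0=-5.0, b1=0.1))  # illustrative values, not estimates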

Page 13: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

1. How do we estimate the coefficients?
2. How do we assess model fit?
3. How do we interpret coefficients?
4. How do we check regression assumptions?

Page 14: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

1. How do we estimate the coefficients?
2. How do we assess model fit?
3. How do we interpret coefficients?
4. How do we check regression assumptions?

Page 15: Logistic Regression Analysis Gerrit Rooks 30-03-10

Maximum likelihood estimation

• The method of maximum likelihood yields values for the unknown parameters that maximize the probability of obtaining the observed set of data.

$$\Pr(Y=1 \mid X_1) = \frac{1}{1 + e^{-(b_0 + b_1 X_1)}}$$

b_0 and b_1 are the unknown parameters.

Page 16: Logistic Regression Analysis Gerrit Rooks 30-03-10

Maximum likelihood estimation

• First we have to construct the likelihood function (probability of obtaining the observed set of data).

Likelihood = pr(obs1)*pr(obs2)*pr(obs3)…*pr(obsn)

Assuming that observations are independent
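A sketch of this construction in Python (helper names are hypothetical; each observation's probability comes from the logistic model):

import math

def bernoulli_probability(y, x, b0, b1):
    """Probability of the observed outcome y (0 or 1) for a single observation with predictor x."""
    p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
    return p if y == 1 else 1.0 - p

def likelihood(ys, xs, b0, b1):
    """Product of the individual probabilities, assuming independent observations."""
    result = 1.0
    for y, x in zip(ys, xs):
        result *= bernoulli_probability(y, x, b0, b1)
    return result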

Page 17: Logistic Regression Analysis Gerrit Rooks 30-03-10

ID   age   CHD
1    20    0
2    23    0
3    24    0
4    25    1
…
98   64    0
99   65    1
100  69    1

$$\Pr(Y=0) = 1 - \frac{1}{1 + e^{-(b_0 + b_1 \cdot \mathrm{Age})}}$$

$$\Pr(Y=1) = \frac{1}{1 + e^{-(b_0 + b_1 \cdot \mathrm{Age})}}$$

Page 18: Logistic Regression Analysis Gerrit Rooks 30-03-10

The likelihood function (for the CHD data)

$$\text{likelihood} = \left(1 - \frac{1}{1 + e^{-(b_0 + b_1 \mathrm{Age}_1)}}\right)\left(1 - \frac{1}{1 + e^{-(b_0 + b_1 \mathrm{Age}_2)}}\right)\left(1 - \frac{1}{1 + e^{-(b_0 + b_1 \mathrm{Age}_3)}}\right)\left(\frac{1}{1 + e^{-(b_0 + b_1 \mathrm{Age}_4)}}\right) \times \ldots \times \left(\frac{1}{1 + e^{-(b_0 + b_1 \mathrm{Age}_{100})}}\right)$$

Given that we have 100 observations, the dots abbreviate the remaining terms of the product.

Page 19: Logistic Regression Analysis Gerrit Rooks 30-03-10

Log-likelihood

• For technical reasons the likelihood is transformed into the log-likelihood

LL= ln[pr(obs1)]+ln[pr(obs2)]+ln[pr(obs3)]…+ln[pr(obsn)]
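The same idea on the log scale, as a small sketch (names are mine):

import math

def log_likelihood(ys, xs, b0, b1):
    """Sum of the log-probabilities of the observed outcomes; maximizing this maximizes the likelihood."""
    total = 0.0
    for y, x in zip(ys, xs):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        total += math.log(p if y == 1 else 1.0 - p)
    return total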

Page 20: Logistic Regression Analysis Gerrit Rooks 30-03-10

The likelihood function (for the CHD data)

)1

1(*...

*)1

1(*)1

11(*

)1

11(*)1

11(

)(

)()(

)()(

110

110110

110110

Agebb

AgebbAgebb

AgebbAgebb

e

ee

eelikelihood

A clever algorithm gives us values for the parameters b0 and b1 that maximize the likelihood of this data
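One way such an algorithm can be sketched in Python, using scipy's general-purpose optimizer (my choice of tool; SPSS uses its own iterative routine):

import numpy as np
from scipy.optimize import minimize

def negative_log_likelihood(params, ages, chd):
    """Negative log-likelihood of the CHD data; minimizing it maximizes the likelihood."""
    b0, b1 = params
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * ages)))
    return -np.sum(chd * np.log(p) + (1 - chd) * np.log(1 - p))

# With ages and chd as NumPy arrays holding the 100 observations:
# result = minimize(negative_log_likelihood, x0=[0.0, 0.0], args=(ages, chd))
# b0_hat, b1_hat = result.x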

Page 21: Logistic Regression Analysis Gerrit Rooks 30-03-10

Estimation of coefficients: SPSS Results

Variables in the Equation

                      B        S.E.     Wald    df   Sig.   Exp(B)
Step 1(a)  age        ,111     ,024    21,254    1   ,000    1,117
           Constant  -5,309   1,134    21,935    1   ,000     ,005
a. Variable(s) entered on step 1: age.

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.11 X)}}$$

Page 22: Logistic Regression Analysis Gerrit Rooks 30-03-10

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.11 X)}}$$

Page 23: Logistic Regression Analysis Gerrit Rooks 30-03-10

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.11 X)}}$$

This function fits very well; other values of b0 and b1 give worse results.

Page 24: Logistic Regression Analysis Gerrit Rooks 30-03-10

Illustration 1: suppose we chose .05X instead of .11X

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.05 X)}}$$

Page 25: Logistic Regression Analysis Gerrit Rooks 30-03-10

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.40 X)}}$$

Illustration 2: suppose we chose .40X instead of .11X

Page 26: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

• Estimate the coefficients
• Assess model fit
• Interpret coefficients
• Check regression assumptions

Page 27: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

• Estimate the coefficients
• Assess model fit
  – Between model comparisons
  – Pseudo R² (similar to multiple regression)
  – Predictive accuracy
• Interpret coefficients
• Check regression assumptions

Page 28: Logistic Regression Analysis Gerrit Rooks 30-03-10

28

Model fit: Between model comparison

$$\chi^2 = -2\left[LL(\text{baseline}) - LL(\text{New})\right]$$

The log-likelihood ratio test statistic can be used to test the fit of a model. The test statistic has a chi-square distribution. LL(baseline) belongs to the reduced model and LL(New) to the full model.

Page 29: Logistic Regression Analysis Gerrit Rooks 30-03-10

29

Between model comparisons: likelihood ratio test

$$\chi^2 = -2\left[LL(\text{baseline}) - LL(\text{New})\right]$$

Full model:
$$P(Y) = \frac{1}{1 + e^{-(b_0 + b_1 X_1)}}$$

Reduced model:
$$P(Y) = \frac{1}{1 + e^{-b_0}}$$

The model including only an intercept is often called the empty model. SPSS uses this model as a default.

Page 30: Logistic Regression Analysis Gerrit Rooks 30-03-10

30

Between model comparisons: the test can also be used for individual coefficients

$$\chi^2 = -2\left[LL(\text{baseline}) - LL(\text{New})\right]$$

Full model:
$$P(Y) = \frac{1}{1 + e^{-(b_0 + b_1 X_1 + b_2 X_2)}}$$

Reduced model:
$$P(Y) = \frac{1}{1 + e^{-b_0}}$$

Page 31: Logistic Regression Analysis Gerrit Rooks 30-03-10

Between model comparison: SPSS output

$$\chi^2 = -2LL(\text{baseline}) - \left(-2LL(\text{New})\right)$$

With -2LL(baseline) = 136.66 and -2LL(New) = 107.35, the test statistic is 29.31 = 136.66 - 107.35.

Omnibus Tests of Model Coefficients
                Chi-square   df   Sig.
Step 1  Step        29,310    1   ,000
        Block       29,310    1   ,000
        Model       29,310    1   ,000

This is the test statistic, and its associated significance.

Model Summary
Step   -2 Log likelihood   Cox & Snell R Square   Nagelkerke R Square
1           107,353(a)             ,254                  ,341
a. Estimation terminated at iteration number 5 because parameter estimates changed by less than ,001.
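A small check of this comparison in Python (scipy is my assumption; the numbers come from the SPSS output above):

from scipy.stats import chi2

neg2ll_baseline = 136.66
neg2ll_new = 107.35
lr_statistic = neg2ll_baseline - neg2ll_new   # about 29.31
p_value = chi2.sf(lr_statistic, df=1)         # upper-tail probability of a chi-square with 1 df
print(lr_statistic, p_value)                  # p is far below .05, matching Sig. = ,000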

Page 32: Logistic Regression Analysis Gerrit Rooks 30-03-10

32

Overall model fit: pseudo R²

Just like in multiple regression, pseudo R² ranges from 0.0 to 1.0.
– Cox and Snell: cannot theoretically reach 1
– Nagelkerke: adjusted so that it can reach 1

$$R^2_{\mathrm{LOGIT}} = \frac{-2LL(\text{Empty}) - \left(-2LL(\text{Model})\right)}{-2LL(\text{Empty})}$$

Here -2LL(Empty) is based on the log-likelihood of the model before any predictors were entered, and -2LL(Model) on the log-likelihood of the model that you want to test.

NOTE: R² in logistic regression tends to be (even) smaller than in multiple regression.
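A sketch of this pseudo-R² calculation with the CHD numbers reported above (it uses the formula as reconstructed here, so treat it as illustrative):

neg2ll_empty = 136.66
neg2ll_model = 107.35
r2_logit = (neg2ll_empty - neg2ll_model) / neg2ll_empty
print(round(r2_logit, 3))  # roughly 0.21, indeed smaller than the Nagelkerke value of ,341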

Page 33: Logistic Regression Analysis Gerrit Rooks 30-03-10

33

Overall model fit: Classification table

We correctly predict 74% of our observations.

Classification Table(a)
                                        Predicted
Observed                       chd = 0   chd = 1   Percentage Correct
Step 1   chd = 0                    45        12                 78,9
         chd = 1                    14        29                 67,4
         Overall Percentage                                      74,0
a. The cut value is ,500
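A sketch of how such a table can be built from predicted probabilities (hypothetical function name; the cut value of .5 matches the SPSS footnote):

def classification_counts(observed, predicted_probs, cut=0.5):
    """Cross-tabulate observed 0/1 outcomes against predictions made at the cut value."""
    counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    for y, p in zip(observed, predicted_probs):
        counts[(y, 1 if p >= cut else 0)] += 1
    return counts

# Overall accuracy = (counts[(0, 0)] + counts[(1, 1)]) / total number of observations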

Page 34: Logistic Regression Analysis Gerrit Rooks 30-03-10

34

Overall model fit: Classification table

14 cases had CHD while, according to our model, this shouldn't have happened.

Classification Table(a)
                                        Predicted
Observed                       chd = 0   chd = 1   Percentage Correct
Step 1   chd = 0                    45        12                 78,9
         chd = 1                    14        29                 67,4
         Overall Percentage                                      74,0
a. The cut value is ,500

Page 35: Logistic Regression Analysis Gerrit Rooks 30-03-10

35

Overall model fit: Classification table

12 cases didn't have CHD while, according to our model, they should have.

Classification Table(a)
                                        Predicted
Observed                       chd = 0   chd = 1   Percentage Correct
Step 1   chd = 0                    45        12                 78,9
         chd = 1                    14        29                 67,4
         Overall Percentage                                      74,0
a. The cut value is ,500

Page 36: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

• Estimate the coefficients
• Assess model fit
• Interpret coefficients
• Check regression assumptions

Page 37: Logistic Regression Analysis Gerrit Rooks 30-03-10

Logistics of logistic regression

• Estimate the coefficients
• Assess model fit
• Interpret coefficients
  – Direction
  – Significance
  – Magnitude
• Check regression assumptions

Page 38: Logistic Regression Analysis Gerrit Rooks 30-03-10

38

Interpreting coefficients: direction

We can rewrite our LRM as follows:

$$P(Y) = \frac{1}{1 + e^{-(b_0 + b_1 X_1)}} = \frac{e^{(b_0 + b_1 X_1)}}{1 + e^{(b_0 + b_1 X_1)}}$$

into:

$$\text{Odds} = \frac{p(y)}{1 - p(y)} = e^{b_0}\, e^{b_1 x_1}\, e^{b_2 x_2} \ldots e^{b_n x_n}$$
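A tiny numerical check of this equivalence (the age value and the rounded coefficients are only an example):

import math

eta = -5.3 + 0.11 * 50                      # linear predictor for age 50
p = 1.0 / (1.0 + math.exp(-eta))            # predicted probability
odds = p / (1.0 - p)
print(abs(odds - math.exp(eta)) < 1e-12)    # True: the odds equal e raised to the linear predictor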

Page 39: Logistic Regression Analysis Gerrit Rooks 30-03-10

39

Interpreting coefficients: direction

• the original b reflects changes in the logit: b > 0 -> positive relationship
• the exponentiated b reflects changes in the odds: exp(b) > 1 -> positive relationship

$$\text{logit} = \ln\!\left(\frac{p(y)}{1 - p(y)}\right) = b_0 + b_1 x_1 + b_2 x_2 + \ldots + b_n x_n$$

$$\text{Odds} = \frac{p(y)}{1 - p(y)} = e^{b_0}\, e^{b_1 x_1}\, e^{b_2 x_2} \ldots e^{b_n x_n}$$

Page 40: Logistic Regression Analysis Gerrit Rooks 30-03-10

40

Interpreting coefficients: direction

We can rewrite our LRM as follows:

$$\text{logit} = \ln\!\left(\frac{p(y)}{1 - p(y)}\right) = b_0 + b_1 x_1 + b_2 x_2 + \ldots + b_n x_n$$

into:

$$\text{Odds} = \frac{p(y)}{1 - p(y)} = e^{b_0 + b_1 x_1 + b_2 x_2 + \ldots + b_n x_n}$$

Page 41: Logistic Regression Analysis Gerrit Rooks 30-03-10

41

Interpreting coefficients: direction

• the original b reflects changes in the logit: b > 0 -> positive relationship
• the exponentiated b reflects changes in the odds: exp(b) > 1 -> positive relationship

$$\text{logit} = \ln\!\left(\frac{p(y)}{1 - p(y)}\right) = b_0 + b_1 x_1 + b_2 x_2 + \ldots + b_n x_n$$

$$\text{Odds} = \frac{p(y)}{1 - p(y)} = e^{b_0}\, e^{b_1 x_1}\, e^{b_2 x_2} \ldots e^{b_n x_n}$$

Page 42: Logistic Regression Analysis Gerrit Rooks 30-03-10

42

Testing significance of coefficients

• In linear regression analysis this statistic is used to test significance
• In logistic regression something similar exists
• However, when b is large, the standard error tends to become inflated, so the test statistic is underestimated (Type II errors become more likely)

$$\text{Wald} = \frac{b}{SE_b}$$

(b is the estimate and SE_b the standard error of the estimate; in linear regression the corresponding ratio follows a t-distribution.)

Note: This is not the Wald statistic SPSS presents!!!

Page 43: Logistic Regression Analysis Gerrit Rooks 30-03-10

Interpreting coefficients: significance

• SPSS presents:

$$\text{Wald} = \left(\frac{b}{SE_b}\right)^2$$

• While Andy Field thinks SPSS presents this:

$$\text{Wald} = \frac{b}{SE_b}$$
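A quick check against the age row of the SPSS output shown earlier (small differences are expected because B and S.E. are rounded):

b_age = 0.111
se_age = 0.024
wald_spss_style = (b_age / se_age) ** 2   # roughly 21.4, close to the reported 21,254
wald_field_style = b_age / se_age         # roughly 4.6, the square root of the value above
print(wald_spss_style, wald_field_style)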

Page 44: Logistic Regression Analysis Gerrit Rooks 30-03-10

44

3. Interpreting coefficients: magnitude

• The slope coefficient (b) is interpreted as the rate of change in the "log odds" as X changes … not very useful.

• exp(b) is the effect of the independent variable on the odds, more useful for calculating the size of an effect

$$\text{logit} = \ln\!\left(\frac{p(y)}{1 - p(y)}\right) = b_0 + b_1 x_1 + b_2 x_2 + \ldots + b_n x_n$$

$$\text{Odds} = \frac{p(y)}{1 - p(y)} = e^{b_0}\, e^{b_1 x_1}\, e^{b_2 x_2} \ldots e^{b_n x_n}$$

Page 45: Logistic Regression Analysis Gerrit Rooks 30-03-10

Magnitude of association: Percentage change in odds

• (Exponentiated coefficient - 1.0) * 100

$$\text{Odds} = \frac{\text{prob}_{\text{event}}}{1 - \text{prob}_{\text{event}}}$$

Probability   Odds
25%           0.33
50%           1
75%           3

Page 46: Logistic Regression Analysis Gerrit Rooks 30-03-10

Variables in the Equation

                      B        S.E.     Wald    df   Sig.   Exp(B)
Step 1(a)  age        ,111     ,024    21,254    1   ,000    1,117
           Constant  -5,309   1,134    21,935    1   ,000     ,005
a. Variable(s) entered on step 1: age.

46

• For our age variable:
  – Percentage change in odds = (exponentiated coefficient - 1) * 100 = 12%
  – A one-unit increase in age results in a 12% increase in the odds that the person will have CHD
  – So if a person is one year older, the odds that (s)he will have CHD are 12% higher

Magnitude of association
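The 12% figure can be reproduced directly from the coefficient (a sketch; Exp(B) = 1,117 in the SPSS table above):

import math

b_age = 0.111
percentage_change = (math.exp(b_age) - 1.0) * 100.0
print(round(percentage_change, 1))  # about 11.7, i.e. roughly a 12% increase in the odds per year of age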

Page 47: Logistic Regression Analysis Gerrit Rooks 30-03-10

Another way: Calculating predicted probabilities

$$\Pr(Y=1 \mid X) = \frac{1}{1 + e^{-(-5.3 + 0.11 X)}}$$

So, for somebody 20 years old, the predicted probability is .04

For somebody 70 years old, the predicted probability is .91
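Plugging those two ages into the fitted equation, as a sketch with the rounded coefficients from the slide:

import math

def predicted_probability(age, b0=-5.3, b1=0.11):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * age)))

print(round(predicted_probability(20), 2))  # about 0.04
print(round(predicted_probability(70), 2))  # about 0.92, close to the .91 on the slide (rounding)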

Page 48: Logistic Regression Analysis Gerrit Rooks 30-03-10

Checking assumptions

• Influential data points & residuals
  – Follow Samantha's tips
• Hosmer & Lemeshow
  – Divides the sample into subgroups
  – Checks whether there are differences between observed and predicted outcomes across subgroups
  – The test should not be significant; if it is, that is an indication of lack of fit

Page 49: Logistic Regression Analysis Gerrit Rooks 30-03-10

Hosmer & Lemeshow

The test divides the sample into subgroups and checks whether the difference between observed and predicted is about equal across these groups.

The test should not be significant (indicating no difference).
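A rough sketch of the idea behind the test (my simplification; real implementations follow specific conventions for grouping and degrees of freedom):

def hosmer_lemeshow_statistic(observed, predicted_probs, groups=10):
    """Group cases by predicted probability and compare observed with expected event counts."""
    pairs = sorted(zip(predicted_probs, observed))
    size = max(1, len(pairs) // groups)
    statistic = 0.0
    for start in range(0, len(pairs), size):
        chunk = pairs[start:start + size]
        n = len(chunk)
        expected = sum(p for p, _ in chunk)         # expected number of events in this subgroup
        observed_events = sum(y for _, y in chunk)  # observed number of events
        if 0 < expected < n:                        # skip degenerate subgroups
            statistic += (observed_events - expected) ** 2 / (expected * (1 - expected / n))
    return statistic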

Page 50: Logistic Regression Analysis Gerrit Rooks 30-03-10

Examining residuals in LR

1. Isolate points for which the model fits poorly
2. Isolate influential data points

Page 51: Logistic Regression Analysis Gerrit Rooks 30-03-10

Residual statistics

Page 52: Logistic Regression Analysis Gerrit Rooks 30-03-10

Cook's distance

$$D_i = \frac{\sum_j \left(\hat{Y}_j - \hat{Y}_{j(i)}\right)^2}{p \cdot MSE}$$

where MSE is the mean square error, p is the number of parameters, \(\hat{Y}_j\) is the prediction for case j from all observations, and \(\hat{Y}_{j(i)}\) is the prediction for case j from the observations excluding observation i.

Page 53: Logistic Regression Analysis Gerrit Rooks 30-03-10

53

Illustration with SPSS

• Penalty kicks data, variables:
  – Scored: outcome variable, 0 = penalty missed, 1 = penalty scored
  – Pswq: degree to which a player worries
  – Previous: percentage of penalties scored by a particular player in their career

Page 54: Logistic Regression Analysis Gerrit Rooks 30-03-10

54

SPSS OUTPUT: Logistic Regression

Case Processing Summary
Unweighted Cases(a)                       N    Percent
Selected Cases   Included in Analysis    75      100,0
                 Missing Cases            0         ,0
                 Total                   75      100,0
Unselected Cases                          0         ,0
Total                                    75      100,0
a. If weight is in effect, see classification table for the total number of cases.

Dependent Variable Encoding
Original Value     Internal Value
Missed Penalty     0
Scored Penalty     1

This tells you something about the number of observations and missing cases.

Page 55: Logistic Regression Analysis Gerrit Rooks 30-03-10

55

Block 0: Beginning Block. This table is based on the empty model, i.e. only the constant in the model:

$$P(Y) = \frac{1}{1 + e^{-b_0}}$$

Classification Table(a,b)
                                               Predicted
                                     Missed Penalty  Scored Penalty  Percentage Correct
Step 0  Result of Penalty Kick
          Missed Penalty                   0               35                  ,0
          Scored Penalty                   0               40               100,0
        Overall Percentage                                                   53,3
a. Constant is included in the model.
b. The cut value is ,500

Variables in the Equation
                   B      S.E.   Wald   df   Sig.   Exp(B)
Step 0  Constant   ,134   ,231   ,333    1   ,564   1,143

Variables not in the Equation (these variables will be entered in the model later on)
                               Score   df   Sig.
Step 0  Variables  previous   34,109    1   ,000
                   pswq       34,193    1   ,000
        Overall Statistics    41,558    2   ,000

Page 56: Logistic Regression Analysis Gerrit Rooks 30-03-10

56

Block 1: Method = Enter

Omnibus Tests of Model Coefficients
                Chi-square   df   Sig.
Step 1  Step        54,977    2   ,000
        Block       54,977    2   ,000
        Model       54,977    2   ,000

This is the test statistic:
$$\chi^2 = -2\left[LL(\text{baseline}) - LL(\text{New})\right]$$
Block is useful to check the significance of individual coefficients; see Field.

Model Summary
Step   -2 Log likelihood   Cox & Snell R Square   Nagelkerke R Square
1           48,662(a)             ,520                  ,694
a. Estimation terminated at iteration number 6 because parameter estimates changed by less than ,001.

The -2 Log likelihood value (48,662) is -2 times the log-likelihood of the new model. Note: Nagelkerke is larger than Cox & Snell.

Page 57: Logistic Regression Analysis Gerrit Rooks 30-03-10

57

Block 1: Method = Enter (continued)

Variables in the Equation(a)
                    B       S.E.    Wald    df   Sig.   Exp(B)
Step 1  previous    ,065    ,022    8,609    1   ,003    1,067
        pswq       -,230    ,080    8,309    1   ,004     ,794
        Constant   1,280   1,670     ,588    1   ,443    3,598
a. Variable(s) entered on step 1: previous, pswq.

B gives the estimates, S.E. the standard errors of the estimates, Sig. the significance based on the Wald statistic, and Exp(B) the change in odds.

Classification Table(a)
                                               Predicted
                                     Missed Penalty  Scored Penalty  Percentage Correct
Step 1  Result of Penalty Kick
          Missed Penalty                  30                5                85,7
          Scored Penalty                   7               33                82,5
        Overall Percentage                                                   84,0
a. The cut value is ,500

Predictive accuracy has improved (it was 53%).

Page 58: Logistic Regression Analysis Gerrit Rooks 30-03-10

58

How is the classification table constructed?

$$\text{Pred. } P(Y) = \frac{1}{1 + e^{-(1.28 + 0.065 \cdot \text{previous} - 0.230 \cdot \text{pswq})}}$$

Variables in the Equation(a)
                    B       S.E.    Wald    df   Sig.   Exp(B)
Step 1  previous    ,065    ,022    8,609    1   ,003    1,067
        pswq       -,230    ,080    8,309    1   ,004     ,794
        Constant   1,280   1,670     ,588    1   ,443    3,598
a. Variable(s) entered on step 1: previous, pswq.

Classification Table(a)
                                               Predicted
                                     Missed Penalty  Scored Penalty  Percentage Correct
Step 1  Result of Penalty Kick
          Missed Penalty                  30                5                85,7
          Scored Penalty                   7               33                82,5
        Overall Percentage                                                   84,0
a. The cut value is ,500

The off-diagonal cells (5 and 7) are the cases that are not predicted correctly.

Page 59: Logistic Regression Analysis Gerrit Rooks 30-03-10

59

How is the classification table constructed?

$$\text{Pred. } P(Y) = \frac{1}{1 + e^{-(1.28 + 0.065 \cdot \text{previous} - 0.230 \cdot \text{pswq})}}$$

pswq   previous   scored   Predict. prob.
18     56         1        .68
17     35         1        .41
20     45         0        .40
10     42         0        .85

Page 60: Logistic Regression Analysis Gerrit Rooks 30-03-10

60

How is the classification table constructed?

pswq   previous   scored   Predict. prob.   predicted
18     56         1        .68              1
17     35         1        .41              0
20     45         0        .40              0
10     42         0        .85              1

Classification Table(a)
                                               Predicted
                                     Missed Penalty  Scored Penalty  Percentage Correct
Step 1  Result of Penalty Kick
          Missed Penalty                  30                5                85,7
          Scored Penalty                   7               33                82,5
        Overall Percentage                                                   84,0
a. The cut value is ,500
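A sketch that reproduces the predicted probabilities and predicted classes for the four example cases above (coefficients taken from the Variables in the Equation table; small differences come from rounding):

import math

cases = [  # (pswq, previous, scored)
    (18, 56, 1),
    (17, 35, 1),
    (20, 45, 0),
    (10, 42, 0),
]

for pswq, previous, scored in cases:
    eta = 1.28 + 0.065 * previous - 0.230 * pswq
    prob = 1.0 / (1.0 + math.exp(-eta))
    predicted = 1 if prob >= 0.5 else 0   # the cut value is ,500
    print(pswq, previous, scored, round(prob, 2), predicted)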