Transcript
Page 1: Logistic Regression, Generative and Discriminative Classifiers (guestrin/Class/10701-S05/slides/LogRegress-1-24-0…)

Logistic Regression,Generative and Discriminative Classifiers

Recommended reading:

• Ng and Jordan paper “On Discriminative vs. Generative classifiers: A comparison of logistic regression and naïve Bayes,” A. Ng and M. Jordan, NIPS 2002.

Machine Learning 10-701

Tom M. Mitchell, Carnegie Mellon University

Thanks to Ziv Bar-Joseph, Andrew Moore for some slides

Page 2:

Overview

Last lecture:
• Naïve Bayes classifier
• Number of parameters to estimate
• Conditional independence

This lecture:
• Logistic regression
• Generative and discriminative classifiers
• (if time) Bias and variance in learning

Page 3:

Page 4:

Generative vs. Discriminative Classifiers

Training classifiers involves estimating f: X → Y, or P(Y|X)

Generative classifiers:

• Assume some functional form for P(X|Y), P(X)

• Estimate parameters of P(X|Y), P(X) directly from training data

• Use Bayes rule to calculate P(Y|X= xi)
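As a concrete illustration (not from the slides), the Bayes-rule step for a two-class problem can be sketched in Python; the function name `posterior` and its arguments are hypothetical:

```python
# A minimal sketch: turn a generative model's pieces P(X|Y) and P(Y)
# into the posterior P(Y=1 | X=x) via Bayes rule.
def posterior(p_x_given_y1, p_x_given_y0, p_y1):
    """P(Y=1 | X=x), given the class-conditional likelihoods of the
    observed x and the class prior P(Y=1)."""
    p_y0 = 1.0 - p_y1
    evidence = p_x_given_y1 * p_y1 + p_x_given_y0 * p_y0  # P(X=x)
    return p_x_given_y1 * p_y1 / evidence

# If x is twice as likely under class 1 and the prior is uniform,
# the posterior for class 1 is 2/3.
print(posterior(0.4, 0.2, 0.5))  # -> 0.666...
```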

Discriminative classifiers:

1. Assume some functional form for P(Y|X)

2. Estimate parameters of P(Y|X) directly from training data

Page 5:

• Consider learning f: X → Y, where

• X is a vector of real-valued features, < X1 … Xn >

• Y is boolean

• So we use a Gaussian Naïve Bayes classifier

• assume all Xi are conditionally independent given Y

• model P(Xi | Y = yk) as Gaussian N(µik,σ)

• model P(Y) as binomial (p)

• What does that imply about the form of P(Y|X)?

Page 6:

• Consider learning f: X → Y, where

• X is a vector of real-valued features, < X1 … Xn >

• Y is boolean

• assume all Xi are conditionally independent given Y

• model P(Xi | Y = yk) as Gaussian N(µik,σ)

• model P(Y) as binomial (p)

• What does that imply about the form of P(Y|X)?
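Under these assumptions, a standard derivation (sketched here, with $\pi = P(Y=1)$ and variances $\sigma_i$ that do not depend on the class) shows that $P(Y=1 \mid X)$ takes exactly the logistic form:

```latex
P(Y=1 \mid X) = \frac{1}{1 + \exp\!\left(w_0 + \sum_{i=1}^{n} w_i X_i\right)},
\quad\text{where}\quad
w_i = \frac{\mu_{i0} - \mu_{i1}}{\sigma_i^{2}}, \qquad
w_0 = \ln\frac{1-\pi}{\pi} + \sum_{i=1}^{n} \frac{\mu_{i1}^{2} - \mu_{i0}^{2}}{2\sigma_i^{2}}
```

That is, the Gaussian Naïve Bayes assumptions imply a P(Y|X) that is linear in the features inside a sigmoid, which is what logistic regression models directly.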

Page 7:

Logistic regression
• Logistic regression represents the probability of category i using a linear function of the input variables:

$$P(Y = i \mid X = x) = g_i(w_{i0} + w_{i1} x_1 + \cdots + w_{id} x_d)$$

where for $i < K$

$$g_i(z) = \frac{e^{z_i}}{1 + \sum_{j=1}^{K-1} e^{z_j}}$$

and for the reference class $K$

$$g_K(z) = \frac{1}{1 + \sum_{j=1}^{K-1} e^{z_j}}$$
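A minimal Python sketch of the link functions g_i above (the function name `g` is ours; `z` holds the K−1 linear scores, and class K is the reference class):

```python
import math

def g(z):
    """z: list of K-1 linear scores; returns K class probabilities.
    g_i(z) = exp(z_i) / (1 + sum_j exp(z_j)) for i < K, and the
    reference class K gets the remaining probability mass."""
    denom = 1.0 + sum(math.exp(zj) for zj in z)
    probs = [math.exp(zi) / denom for zi in z]
    probs.append(1.0 / denom)  # class K
    return probs

# Three classes with both scores zero -> uniform probabilities 1/3 each.
p = g([0.0, 0.0])
```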

Page 8:

Logistic regression
• The name comes from the logit transformation:

$$\log \frac{p(Y = i \mid X = x)}{p(Y = K \mid X = x)} = \log \frac{g_i(z)}{g_K(z)} = w_{i0} + w_{i1} x_1 + \cdots + w_{id} x_d$$

Page 9:

Binary logistic regression
• We only need one set of parameters
• This results in a “squashing function” which turns linear predictions into probabilities:

$$p(Y = 1 \mid X = x) = \frac{e^{w_0 + w_1 x_1 + \cdots + w_d x_d}}{1 + e^{w_0 + w_1 x_1 + \cdots + w_d x_d}} = \frac{1}{1 + e^{-(w_0 + w_1 x_1 + \cdots + w_d x_d)}} = \frac{1}{1 + e^{-z}}$$
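A sketch of this squashing function in Python (the names `sigmoid` and `p_y1` are ours; the weight vector is assumed to carry the intercept w_0 in position 0):

```python
import math

def sigmoid(z):
    """The logistic squashing function 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + math.exp(-z))

def p_y1(w, x):
    """p(Y=1 | X=x) with z = w_0 + w_1 x_1 + ... + w_d x_d."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return sigmoid(z)
```

A linear score of zero maps to probability 0.5, and large positive or negative scores are squashed toward 1 or 0 respectively.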

Page 10:

Logistic regression vs. Linear regression

$$P(Y = 1 \mid X = x) = \frac{1}{1 + e^{-z}}$$

Page 11:

Example

Page 12:

Log likelihood

$$l(w) = \sum_{i=1}^{N} \Big[ y_i \log p(x_i; w) + (1 - y_i) \log\big(1 - p(x_i; w)\big) \Big]$$

$$= \sum_{i=1}^{N} \Big[ y_i \log \frac{p(x_i; w)}{1 - p(x_i; w)} + \log\big(1 - p(x_i; w)\big) \Big]$$

$$= \sum_{i=1}^{N} \Big[ y_i \, (x_i \cdot w) - \log\big(1 + e^{x_i \cdot w}\big) \Big]$$
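The simplified final form lends itself to a direct implementation; a sketch (assuming each x_i carries a leading 1 so that x_i · w includes the intercept):

```python
import math

def log_likelihood(w, X, y):
    """l(w) = sum_i [ y_i (x_i . w) - log(1 + e^{x_i . w}) ],
    the simplified form of the binary logistic log likelihood."""
    total = 0.0
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi))
        total += yi * z - math.log(1.0 + math.exp(z))
    return total
```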

Page 13:

Log likelihood

$$l(w) = \sum_{i=1}^{N} \Big[ y_i \log p(x_i; w) + (1 - y_i) \log\big(1 - p(x_i; w)\big) \Big]$$

$$= \sum_{i=1}^{N} \Big[ y_i \log \frac{p(x_i; w)}{1 - p(x_i; w)} + \log\big(1 - p(x_i; w)\big) \Big]$$

$$= \sum_{i=1}^{N} \Big[ y_i \, (x_i \cdot w) - \log\big(1 + e^{x_i \cdot w}\big) \Big]$$

• Note: this likelihood is concave in w

Page 14:

Maximum likelihood estimation

$$\frac{\partial}{\partial w_j} l(w) = \frac{\partial}{\partial w_j} \sum_{i=1}^{N} \Big[ y_i \, (x_i \cdot w) - \log\big(1 + e^{x_i \cdot w}\big) \Big] = \sum_{i=1}^{N} x_{ij} \big( y_i - p(x_i; w) \big)$$

where $\big( y_i - p(x_i; w) \big)$ is the prediction error.

No closed-form solution!

Common (but not only) approaches — numerical solutions:
• Line Search
• Simulated Annealing
• Gradient Descent
• Newton’s Method
• Matlab glmfit function

Page 15:

Gradient descent

Page 16:

Gradient ascent

$$w_j^{t+1} \leftarrow w_j^{t} + \varepsilon \sum_i x_{ij} \big( y_i - p(x_i; w) \big)$$

• Iteratively updating the weights in this fashion increases likelihood each round.

• We eventually reach the maximum

• We are near the maximum when changes in the weights are small.

• Thus, we can stop when the sum of the absolute values of the weight differences is less than some small number.
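The update rule and stopping criterion above can be sketched in Python (function names and the toy hyperparameters `eps`, `tol` are ours):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, eps=0.1, tol=1e-6, max_iter=10000):
    """Gradient ascent for binary logistic regression:
    w_j <- w_j + eps * sum_i x_ij * (y_i - p(x_i; w)),
    stopping when the sum of absolute weight changes is below tol.
    Assumes each row of X carries a leading 1 for the intercept."""
    d = len(X[0])
    w = [0.0] * d
    for _ in range(max_iter):
        preds = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) for xi in X]
        new_w = [w[j] + eps * sum(X[i][j] * (y[i] - preds[i])
                                  for i in range(len(X)))
                 for j in range(d)]
        if sum(abs(a - b) for a, b in zip(new_w, w)) < tol:
            return new_w
        w = new_w
    return w
```

On non-separable data the likelihood has a finite maximizer, so the weight changes shrink and the loop terminates; on perfectly separable data the weights grow without bound, which is why a `max_iter` cap is included.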

Page 17:

Example
• We get a monotonically increasing log likelihood of the training labels as a function of the iterations

Page 18:

Convergence
• The gradient ascent learning method converges when there is no incentive to move the parameters in any particular direction:

$$\sum_i x_{ij} \big( y_i - p(x_i; w) \big) = 0 \quad \forall j$$

• This condition means that the prediction error is uncorrelated with the components of the input vector

Page 19:

Naïve Bayes vs. Logistic Regression
• Generative and Discriminative classifiers

• Asymptotic comparison (# training examples → ∞)

• when model correct

• when model incorrect

• Non-asymptotic analysis

• convergence rate of parameter estimates

• convergence rate of expected error

• Experimental results

[Ng & Jordan, 2002]

Page 20:

Generative-Discriminative Pairs
Example: assume Y boolean, X = <X1, X2, …, Xn>, where the Xi are boolean, perhaps dependent on Y, conditionally independent given Y

Generative model: naïve Bayes:

Classify new example x based on ratio

Equivalently, based on sign of log of this ratio

s indicates size of set.

l is smoothing parameter

Page 21:

Generative-Discriminative Pairs
Example: assume Y boolean, X = <x1, x2, …, xn>, where the xi are boolean, perhaps dependent on Y, conditionally independent given Y

Generative model: naïve Bayes:

Classify new example x based on ratio

Discriminative model: logistic regression

Note both learn linear decision surface over X in this case

Page 22:

What is the difference asymptotically?
Notation: let $\varepsilon(h_{A,m})$ denote the error of the hypothesis $h_{A,m}$ learned via algorithm A from m examples

• If assumed model correct (e.g., naïve Bayes model), and finite number of parameters, then

• If assumed model incorrect

Note assumed discriminative model can be correct even when generative model incorrect, but not vice versa

Page 23:

Rate of convergence: logistic regression

Let h_{Dis,m} be logistic regression trained on m examples in n dimensions. Then with high probability:

Implication: if we want

for some constant , it suffices to pick

It converges to its asymptotic classifier in order n examples

(result follows from Vapnik’s structural risk bound, plus the fact that the VC dimension of n-dimensional linear separators is n)

Page 24:

Rate of convergence: naïve Bayes
Consider first how quickly parameter estimates converge toward their asymptotic values.

Then we’ll ask how this influences rate of convergence toward asymptotic classification error.

Page 25:

Rate of convergence: naïve Bayes parameters

Page 26:

Some experiments from UCI data sets

Page 27:

What you should know:

• Logistic regression
  – What it is
  – How to solve it
  – Log linear models

• Generative and Discriminative classifiers
  – Relation between Naïve Bayes and logistic regression
  – Which do we prefer, when?

• Bias and variance in learning algorithms

Page 28:

Acknowledgment Some of these slides are based in part on slides from previous machine learning classes taught by Ziv Bar-Joseph, Andrew Moore at CMU, and by Tommi Jaakkola at MIT.

I thank them for providing use of their slides.

