8/18/2019 Lec 7. Regularization.pdf
Regularization: The problem of overfitting
Machine Learning
Example: Logistic regression (hypothesis built on the sigmoid function)

[Figure: three decision boundaries plotted in the (x1, x2) plane, from underfit to overfit]
Addressing overfitting:
Options:
1. Reduce the number of features.
― Manually select which features to keep.
― Model selection algorithm (later in course).
2. Regularization.
― Keep all the features, but reduce the magnitude/values of the parameters.
― Works well when we have a lot of features, each of which contributes a bit to predicting the output.
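Option 2 can be sketched in a few lines of numpy (an illustration, not from the lecture; the data, polynomial degree, and penalty value below are made up): fit the same high-order polynomial with plain least squares and with an L2 penalty, then compare the size of the learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 10)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Degree-8 polynomial features: many features for only 10 data points.
X = np.vander(x, 9, increasing=True)

# Plain least squares: parameters are free to grow to chase the noise.
theta_plain, *_ = np.linalg.lstsq(X, y, rcond=None)

# L2-regularized fit: keep all the features, shrink the parameters.
lam = 1e-3
theta_reg = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# The penalty reliably reduces the overall size of the parameter vector.
print(np.linalg.norm(theta_plain), np.linalg.norm(theta_reg))
```

The regularized fit keeps every polynomial feature but trades a little training error for a much smaller parameter vector, which is exactly the trade described in option 2.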
Regularization: Cost function
Intuition
Suppose we penalize two of the high-order parameters and make them really small.

[Figure: two Price vs. Size-of-house fits, with and without the extra terms penalized]
Small values for parameters
― "Simpler" hypothesis
― Less prone to overfitting

Regularization.
Housing example: many features, each with its own parameter.
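The regularized cost function this section builds up to, reconstructed here in the standard notation used throughout the course (the equation image did not survive in the text):

$$
J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right]
$$

The first term makes the hypothesis fit the data; the second keeps the parameters small. By convention the sum in the penalty starts at $j = 1$, so $\theta_0$ is not regularized, and $\lambda$ controls the trade-off between the two goals.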
Regularization.

[Figure: Price vs. size of house, regularized fit]
In regularized linear regression, we choose θ to minimize the regularized cost.
What if λ is set to an extremely large value (perhaps too large for our problem)?

[Figure: Price vs. size of house]
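The answer the slide's question is driving at: when $\lambda$ is extremely large, the penalty term dominates the cost, so minimizing it forces

$$
\theta_1, \dots, \theta_n \approx 0 \quad\Rightarrow\quad h_\theta(x) \approx \theta_0,
$$

a horizontal line that underfits the data. Regularization only helps for an intermediate choice of $\lambda$.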
Regularization: Regularized linear regression
Regularized linear regression
Gradient descent
Repeat
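The body of this Repeat loop, reconstructed in the course's standard notation for regularized linear regression ($\theta_0$ is not penalized):

$$
\begin{aligned}
\theta_0 &:= \theta_0 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_0^{(i)} \\
\theta_j &:= \theta_j - \alpha\left[\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_j^{(i)} + \frac{\lambda}{m}\theta_j\right] \qquad (j = 1, \dots, n)
\end{aligned}
$$

Grouping the $\theta_j$ terms gives $\theta_j := \theta_j\bigl(1 - \alpha\frac{\lambda}{m}\bigr) - \alpha\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_j^{(i)}$: each iteration first shrinks $\theta_j$ by a factor slightly less than 1, then takes the usual gradient step.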
Normal equation
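The closed-form solution for regularized linear regression, in the form this course presents it ($\theta_0$ left unregularized):

$$
\theta = \left(X^{\top}X + \lambda
\begin{bmatrix}
0 & & & \\
 & 1 & & \\
 & & \ddots & \\
 & & & 1
\end{bmatrix}\right)^{-1} X^{\top}y
$$

The matrix next to $\lambda$ is the $(n{+}1)\times(n{+}1)$ identity with its top-left entry zeroed out. A side benefit noted in the course: for $\lambda > 0$ the bracketed matrix is invertible even when $X^{\top}X$ itself is not (for example when $m \le n$).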
Regularized logistic regression.
Cost function:

[Figure: decision boundary in the (x1, x2) plane]
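The regularized logistic-regression cost function, reconstructed in the course's standard notation (cross-entropy loss plus the same L2 penalty, which again skips $\theta_0$):

$$
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[y^{(i)}\log h_\theta(x^{(i)}) + \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h_\theta(x^{(i)})\bigr)\Bigr] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2
$$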
Gradient descent
Repeat
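The update rule here looks identical to the regularized linear-regression one, except that $h_\theta$ is now the sigmoid hypothesis. A minimal numpy sketch of one pass of this Repeat loop (an illustration, not from the lecture; the data, α, and λ below are made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(theta, X, y, alpha, lam):
    """One pass of the Repeat loop; X carries a leading column of ones (x0)."""
    m = X.shape[0]
    error = sigmoid(X @ theta) - y      # h_theta(x^(i)) - y^(i)
    grad = X.T @ error / m              # unregularized gradient
    reg = (lam / m) * theta
    reg[0] = 0.0                        # theta_0 is not penalized
    return theta - alpha * (grad + reg)

# Tiny made-up 1-D example: label is 0 below x = 1.5 and 1 above it.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

theta = np.zeros(2)
for _ in range(5000):
    theta = gradient_step(theta, X, y, alpha=0.1, lam=0.01)

preds = (sigmoid(X @ theta) >= 0.5).astype(float)
print(preds)
```

After enough iterations the learned boundary sits between the two classes and every training point is classified correctly, while the small λ keeps θ from growing without bound on this separable data.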