Basics of Neural Network Programming - Deep Learning (cs230.stanford.edu/files/c1m2_old.pdf) · Andrew Ng
TRANSCRIPT
Basics of Neural Network Programming: Binary Classification (deeplearning.ai)
Andrew Ng
1 (cat) vs. 0 (non-cat)
[Figure: an input image represented as three matrices of pixel intensities, one each for the Red, Green, and Blue channels, whose values are unrolled into a single input feature vector x.]
Binary Classification
Notation
Basics of Neural Network Programming: Logistic Regression (deeplearning.ai)
Logistic Regression
Basics of Neural Network Programming: Logistic Regression Cost Function (deeplearning.ai)
Logistic Regression cost function
Loss (error) function: $\mathcal{L}(\hat{y}, y) = -\big(y \log \hat{y} + (1 - y) \log(1 - \hat{y})\big)$
Basics of Neural Network Programming: Gradient Descent (deeplearning.ai)
Gradient Descent
Recap: $\hat{y} = \sigma(w^T x + b)$, where $\sigma(z) = \frac{1}{1 + e^{-z}}$

$J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}\big(\hat{y}^{(i)}, y^{(i)}\big) = -\frac{1}{m} \sum_{i=1}^{m} \big[ y^{(i)} \log \hat{y}^{(i)} + (1 - y^{(i)}) \log(1 - \hat{y}^{(i)}) \big]$

Want to find $w, b$ that minimize $J(w, b)$.
[Figure: $J(w, b)$ plotted as a convex, bowl-shaped surface over $w$ and $b$.]
Gradient Descent
[Figure: one-dimensional illustration of gradient descent steps on $J(w)$.]
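The update rule sketched on this slide can be rendered as a few lines of Python. The objective J(w) = (w - 3)**2 below is an illustrative stand-in (not from the slide) chosen so the minimizer is known:

```python
# A minimal sketch of the gradient-descent update w := w - alpha * dJ(w)/dw,
# using the illustrative convex objective J(w) = (w - 3)**2.
def grad_J(w):
    return 2.0 * (w - 3.0)  # dJ/dw for J(w) = (w - 3)**2

w = 0.0       # initial guess
alpha = 0.1   # learning rate
for _ in range(100):
    w = w - alpha * grad_J(w)  # repeat until (approximate) convergence
```

After 100 iterations, w is within floating-point noise of the minimizer 3.0; too large a learning rate would instead make the iterates diverge.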
Basics of Neural Network Programming: Logistic Regression as a Neural Network (deeplearning.ai)
Logistic Regression as a Neural Network
$\hat{y} = \sigma(w^T x + b)$, where $\sigma(z) = \frac{1}{1 + e^{-z}}$
Basics of Neural Network Programming: Derivatives (deeplearning.ai)
Intuition about derivatives
Basics of Neural Network Programming: More Derivative Examples (deeplearning.ai)
Intuition about derivatives
More derivative examples
Basics of Neural Network Programming: Computation Graph (deeplearning.ai)
Computation Graph
Basics of Neural Network Programming: Derivatives with a Computation Graph (deeplearning.ai)
Computing derivatives
[Figure: computation graph with intermediate node values 6 and 11 and output 33; derivatives are computed by a right-to-left (backward) pass.]
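The node values 6, 11, and 33 on the slide come from the lecture's worked example J = 3(a + bc); the inputs a = 5, b = 3, c = 2 are assumed from the lecture rather than stated in this transcript. A sketch of both passes:

```python
# Forward and backward pass on the computation graph J = 3*v, v = a + u, u = b*c.
# Inputs a = 5, b = 3, c = 2 are assumed from the lecture's worked example;
# they reproduce the intermediate values 6 and 11 and the output 33.
a, b, c = 5.0, 3.0, 2.0

# Forward pass (left to right)
u = b * c   # u = 6
v = a + u   # v = 11
J = 3 * v   # J = 33

# Backward pass (right to left, chain rule)
dJ_dv = 3.0            # J = 3v
dJ_du = dJ_dv * 1.0    # v = a + u  =>  dv/du = 1
dJ_da = dJ_dv * 1.0    # dv/da = 1
dJ_db = dJ_du * c      # u = b*c   =>  du/db = c
dJ_dc = dJ_du * b      # du/dc = b
```

One forward pass computes J; one backward pass yields the derivative of J with respect to every node, which is exactly the pattern backpropagation uses.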
Basics of Neural Network Programming: Logistic Regression Gradient Descent (deeplearning.ai)
Logistic regression recap
Logistic regression derivatives
[Figure: computation graph $z = w_1 x_1 + w_2 x_2 + b \rightarrow a = \sigma(z) \rightarrow \mathcal{L}(a, y)$.]
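The single-example derivatives derived on this slide can be sketched in Python. The feature, label, and parameter values below are illustrative, not from the slide:

```python
import math

# One training example with two features; all numeric values are illustrative.
x1, x2, y = 1.0, 2.0, 1.0
w1, w2, b = 0.1, -0.2, 0.0

# Forward pass
z = w1 * x1 + w2 * x2 + b        # linear part
a = 1.0 / (1.0 + math.exp(-z))   # a = sigma(z), the prediction

# Backward pass for L(a, y) = -(y*log(a) + (1-y)*log(1-a))
dz = a - y      # dL/dz simplifies to a - y for this loss
dw1 = x1 * dz   # dL/dw1
dw2 = x2 * dz   # dL/dw2
db = dz         # dL/db
```

The key simplification is that dL/dz collapses to a - y, so each weight's gradient is just its input times dz.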
Basics of Neural Network Programming: Gradient Descent on m Examples (deeplearning.ai)
Logistic regression on m examples
Basics of Neural Network Programming: Vectorization (deeplearning.ai)
What is vectorization?
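This section's classic demo can be reconstructed as follows: the vectorized np.dot computes the same dot product as an explicit for-loop, but far faster, because NumPy dispatches to optimized compiled code:

```python
import time
import numpy as np

# Vectorized vs. explicit-loop dot product on a million-element vector.
n = 1_000_000
rng = np.random.default_rng(0)
a = rng.random(n)
b = rng.random(n)

t0 = time.time()
c_vec = np.dot(a, b)       # vectorized: one call, no Python loop
t1 = time.time()

c_loop = 0.0               # explicit for-loop over the elements
for i in range(n):
    c_loop += a[i] * b[i]
t2 = time.time()

print(f"vectorized: {(t1 - t0) * 1000:.1f} ms, loop: {(t2 - t1) * 1000:.1f} ms")
```

Both variants produce the same result up to floating-point accumulation order; only the running time differs, typically by orders of magnitude.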
Basics of Neural Network Programming: More Vectorization Examples (deeplearning.ai)
Neural network programming guideline
Whenever possible, avoid explicit for-loops.
Vectors and matrix-valued functions
Say you need to apply the exponential operation on every element of a matrix/vector.

import math
import numpy as np
u = np.zeros((n, 1))
for i in range(n):
    u[i] = math.exp(v[i])
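The element-wise loop above can be replaced by a single vectorized call; a runnable sketch with illustrative values for v:

```python
import numpy as np

# np.exp applies the exponential to every element at once; other NumPy ufuncs
# (np.log, np.abs, np.maximum, ...) work the same way.
v = np.array([[0.0], [1.0], [2.0]])   # illustrative (n, 1) column vector
u = np.exp(v)                         # no explicit for-loop needed
```

The result has the same shape as the input, and the loop version above computes exactly the same values.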
Logistic regression derivatives
J = 0; dw1 = 0; dw2 = 0; db = 0
for i = 1 to m:
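The slide's initialization and per-example loop can be rendered as runnable Python; the training data and initial parameters below are illustrative, not from the slide:

```python
import math

# Non-vectorized gradient computation over m examples with two features,
# following the slide's pseudocode. Data and parameters are illustrative.
X = [(1.0, 2.0), (2.0, 0.5), (0.5, 1.5)]   # m = 3 examples, features (x1, x2)
Y = [1.0, 0.0, 1.0]
w1, w2, b = 0.0, 0.0, 0.0
m = len(X)

J = 0.0; dw1 = 0.0; dw2 = 0.0; db = 0.0
for i in range(m):
    z = w1 * X[i][0] + w2 * X[i][1] + b
    a = 1.0 / (1.0 + math.exp(-z))                              # sigma(z)
    J += -(Y[i] * math.log(a) + (1 - Y[i]) * math.log(1 - a))   # loss
    dz = a - Y[i]
    dw1 += X[i][0] * dz
    dw2 += X[i][1] * dz
    db += dz

# Average the cost and gradients over the m examples
J /= m; dw1 /= m; dw2 /= m; db /= m
```

Note the two nested loops this implies in general (over examples and over features); the following sections remove both with vectorization.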
Basics of Neural Network Programming: Vectorizing Logistic Regression (deeplearning.ai)
Vectorizing Logistic Regression
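The forward propagation for all m examples can be computed in two lines; X stacks the examples as columns, and the shapes and values below are illustrative:

```python
import numpy as np

# Vectorized forward propagation over all m examples at once.
# X has shape (n_x, m) with one example per column; w has shape (n_x, 1).
n_x, m = 2, 4
rng = np.random.default_rng(1)
X = rng.random((n_x, m))
w = rng.random((n_x, 1))
b = 0.5

Z = np.dot(w.T, X) + b         # shape (1, m); scalar b broadcasts to every column
A = 1.0 / (1.0 + np.exp(-Z))   # sigma applied element-wise: all m predictions
```

A single matrix product replaces the loop over examples, and broadcasting handles adding b to every column.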
Basics of Neural Network Programming: Vectorizing Logistic Regression's Gradient Computation (deeplearning.ai)
Vectorizing Logistic Regression
Implementing Logistic Regression
for i = 1 to m:
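A full vectorized iteration replacing the loop over m examples can be sketched as follows; the data, initialization, and learning rate are illustrative:

```python
import numpy as np

# One iteration of vectorized gradient descent for logistic regression.
n_x, m, alpha = 2, 5, 0.01
rng = np.random.default_rng(2)
X = rng.random((n_x, m))                      # (n_x, m): one example per column
Y = (rng.random((1, m)) > 0.5).astype(float)  # (1, m): binary labels
w = np.zeros((n_x, 1))
b = 0.0

Z = np.dot(w.T, X) + b             # (1, m) linear outputs
A = 1.0 / (1.0 + np.exp(-Z))       # (1, m) predictions, sigma element-wise
dZ = A - Y                         # (1, m)
dw = np.dot(X, dZ.T) / m           # (n_x, 1): averaged weight gradients
db = np.sum(dZ) / m                # scalar: averaged bias gradient
w = w - alpha * dw                 # gradient-descent update
b = b - alpha * db
```

In practice this whole block sits inside one remaining for-loop over gradient-descent iterations, which cannot itself be vectorized away.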
Basics of Neural Network Programming: Broadcasting in Python (deeplearning.ai)
Broadcasting example
Calories from Carbs, Proteins, Fats in 100g of different foods:

          Apples   Beef   Eggs   Potatoes
Carb        56.0    0.0    4.4       68.0
Protein      1.2  104.0   52.0        8.0
Fat          1.8  135.0   99.0        0.9

cal = A.sum(axis=0)
percentage = 100 * A / cal.reshape(1, 4)
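The slide's computation can be run end to end with the calorie table loaded as a NumPy array:

```python
import numpy as np

# Rows: calories from Carb, Protein, Fat; columns: Apples, Beef, Eggs, Potatoes.
A = np.array([[56.0,   0.0,  4.4, 68.0],
              [ 1.2, 104.0, 52.0,  8.0],
              [ 1.8, 135.0, 99.0,  0.9]])

cal = A.sum(axis=0)                        # total calories per food, shape (4,)
percentage = 100 * A / cal.reshape(1, 4)   # (3,4) / (1,4): row broadcast down A
print(percentage.round(1))
```

The explicit reshape to (1, 4) is technically redundant here, but (as the lecture advises) stating the intended shape makes the broadcast unambiguous.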
Broadcasting example

[[1, 2, 3], [4, 5, 6]] + [[100], [200]] = [[101, 102, 103], [204, 205, 206]]

[[1, 2, 3], [4, 5, 6]] + [100, 200, 300] = [[101, 202, 303], [104, 205, 306]]

[[1], [2], [3], [4]] + 100 = [[101], [102], [103], [104]]
General Principle
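The general principle, as NumPy defines it: combining an (m, n) matrix with a (1, n) row or an (m, 1) column via +, -, *, or / implicitly copies the smaller operand m (or n) times before the element-wise operation. A sketch:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6]])           # shape (2, 3)

row = np.array([[100, 200, 300]])   # (1, 3): copied down to each row of M
col = np.array([[100], [200]])      # (2, 1): copied across each column of M

print(M + row)
print(M + col)
```

A bare scalar broadcasts against any shape the same way, which is how `Z + b` worked in the vectorized logistic regression code.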
Basics of Neural Network Programming: Explanation of Logistic Regression Cost Function (Optional) (deeplearning.ai)
Logistic regression cost function
If $y = 1$: $p(y \mid x) = \hat{y}$. If $y = 0$: $p(y \mid x) = 1 - \hat{y}$.
Logistic regression cost function
Cost on m examples
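The maximum-likelihood argument this section sketches can be written out as a short derivation (assuming, as the lecture does, that the m training examples are i.i.d.):

```latex
\log p(\text{labels in training set})
  = \log \prod_{i=1}^{m} p\big(y^{(i)} \mid x^{(i)}\big)
  = \sum_{i=1}^{m} \log p\big(y^{(i)} \mid x^{(i)}\big)
  = -\sum_{i=1}^{m} \mathcal{L}\big(\hat{y}^{(i)}, y^{(i)}\big)
```

Maximizing the log-likelihood is therefore the same as minimizing the sum of the per-example losses, and the $\frac{1}{m}$ factor in $J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}(\hat{y}^{(i)}, y^{(i)})$ only rescales the objective without changing its minimizer.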