Introduction to Deep Learning (I2DL)
Exercise 5: Neural Networks and CIFAR10 Classification
I2DL: Prof. Niessner 1
Today’s Outline
• Neural Networks
– Mathematical Motivation
– Modularization
• Exercise 5
– Implementation Loop
– CIFAR10 Classification
Our Goal
(Figure: a function 𝑓 mapping input to output)
Universal Approximation Theorem
Theorem (1989, colloquial). For any continuous function 𝑓 on a compact set 𝐾, there exists a neural network with a single hidden layer and sigmoid activations that uniformly approximates 𝑓 to within an arbitrary 𝜀 > 0 on 𝐾.
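The theorem can be seen in action with a few lines of numpy. The sketch below (not from the slides; all names and hyperparameters are illustrative) fits a single hidden layer of sigmoid units to a continuous target, sin, on the compact set [0, 2𝜋] by plain gradient descent:

```python
import numpy as np

# Illustration of the Universal Approximation Theorem: one hidden
# sigmoid layer fit to a continuous function on a compact set.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)  # compact set K
y = np.sin(x)                                          # continuous f

H = 32                                   # hidden units
W1 = rng.normal(0, 1.0, (1, H))
b1 = rng.normal(0, 2.0, H)               # spread the sigmoid transitions over K
W2 = rng.normal(0, 0.1, (H, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.05
for step in range(5000):
    a = sigmoid(x @ W1 + b1)             # hidden activations
    pred = a @ W2 + b2                   # network output
    err = pred - y                       # dL/dpred for L = 0.5 * mean(err^2)
    # Backpropagate through the two layers
    gW2 = a.T @ err / len(x); gb2 = err.mean(0)
    da = err @ W2.T * a * (1 - a)        # sigmoid' = a * (1 - a)
    gW1 = x.T @ da / len(x); gb1 = da.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((sigmoid(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)
```

Predicting the mean of sin would give an MSE of about 0.5; the trained one-hidden-layer network does markedly better, and the theorem says the residual error can be driven below any 𝜀 by widening the hidden layer.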
Universal Approximation Theorem (Optional)
Readable proof: https://mcneela.github.io/machine_learning/2017/03/21/Universal-Approximation-Theorem.html (background: functional analysis, math major, 3rd semester)
Visual proof: http://neuralnetworksanddeeplearning.com/chap4.html
A word of warning…
Source: http://blog.datumbox.com/wp-content/uploads/2013/10/gradient-descent.png
Shallow vs. Deep
• Shallow (1 hidden layer)
• Deep (>1 hidden layer)
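The distinction in code, as a hypothetical sketch (the slides do not fix an activation; ReLU is used here, and all shapes are illustrative): a shallow network has one hidden layer between input and output, a deep one stacks several.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # batch of 4 inputs

# Shallow: input -> 1 hidden layer -> output
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 2))
shallow_out = relu(x @ W1) @ W2

# Deep: input -> 3 hidden layers -> output
Ws = [rng.normal(size=(8, 16)), rng.normal(size=(16, 16)),
      rng.normal(size=(16, 16)), rng.normal(size=(16, 2))]
h = x
for W in Ws[:-1]:
    h = relu(h @ W)                              # one hidden layer per iteration
deep_out = h @ Ws[-1]

print(shallow_out.shape, deep_out.shape)
```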
Obvious Questions
• Q: Do we even need deep networks?
  A: Yes. Multiple layers provide more representational power than a single layer for a fixed computational budget.
• Q: So we just build 100-layer deep networks?
  A: Not trivially ;-)
  Constraints: memory, vanishing gradients, …
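The vanishing-gradient constraint can be made concrete with a toy numpy experiment (an illustration, not from the slides): the sigmoid derivative is at most 0.25, so a crude proxy for the gradient signal reaching the first layer, the product of the mean local derivatives, shrinks multiplicatively with depth.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.normal(size=(1, 16))
grad_scale = 1.0
for layer in range(20):                      # 20 stacked sigmoid layers
    W = rng.normal(0, 0.25, (16, 16))        # illustrative weight scale
    z = x @ W
    x = sigmoid(z)
    # Chain the mean local sigmoid derivative (<= 0.25) into the running product
    grad_scale *= float(np.mean(sigmoid(z) * (1 - sigmoid(z))))

print(grad_scale)  # tiny: early layers barely receive a learning signal
```

Since each factor is at most 0.25, twenty layers cap the product at roughly 10⁻¹², which is why simply stacking 100 sigmoid layers does not work without further tricks.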
Exercise 4: Simple Classification Net
Modularization
Exercise 3: Dataset
• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
Exercise 4: Binary Classification
• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
Exercise 5: Neural Networks and CIFAR10 Classification
• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
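How the Data/Model/Solver blocks plug together can be sketched in miniature. The class names below mirror the diagram; the actual exercise interfaces may differ, and the "network" is reduced to a single linear layer so the whole pipeline fits in a few lines.

```python
import numpy as np

class Dataset:                            # Data: holds samples
    def __init__(self, X, y):
        self.X, self.y = X, y
    def __len__(self):
        return len(self.X)
    def __getitem__(self, i):
        return self.X[i], self.y[i]

class Dataloader:                         # Data: shuffles and batches
    def __init__(self, dataset, batch_size):
        self.dataset, self.batch_size = dataset, batch_size
    def __iter__(self):
        idx = np.random.permutation(len(self.dataset))
        for s in range(0, len(idx), self.batch_size):
            pairs = [self.dataset[i] for i in idx[s:s + self.batch_size]]
            Xb, yb = zip(*pairs)
            yield np.stack(Xb), np.array(yb)

class LinearNetwork:                      # Model: network (toy stand-in)
    def __init__(self, d_in):
        self.W = np.zeros(d_in)
    def forward(self, X):
        return X @ self.W

class MSELoss:                            # Model: loss/objective
    def forward(self, pred, y):
        return np.mean((pred - y) ** 2)
    def backward(self, pred, y):
        return 2.0 * (pred - y) / len(y)  # dLoss/dpred

class SGD:                                # Solver: optimizer
    def __init__(self, model, lr):
        self.model, self.lr = model, lr
    def step(self, grad_W):
        self.model.W -= self.lr * grad_W

class Solver:                             # Solver: training loop
    def __init__(self, model, dataloader, loss, optimizer):
        self.model, self.dataloader = model, dataloader
        self.loss, self.optimizer = loss, optimizer
    def train(self, epochs):
        for _ in range(epochs):
            for Xb, yb in self.dataloader:
                pred = self.model.forward(Xb)
                dpred = self.loss.backward(pred, yb)
                grad_W = Xb.T @ dpred     # chain rule through the linear layer
                self.optimizer.step(grad_W)

# Tiny noiseless regression: the pipeline should recover w_true.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 3))
w_true = np.array([1.0, -2.0, 3.0])
y = X @ w_true

model = LinearNetwork(3)
solver = Solver(model, Dataloader(Dataset(X, y), batch_size=16),
                MSELoss(), SGD(model, lr=0.1))
solver.train(epochs=100)
print(model.W)
```

The payoff of the modularization is that each block can be swapped independently: replacing `LinearNetwork` with a multi-layer network, or `SGD` with another optimizer, leaves the rest of the pipeline untouched.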
Exercise 6: Neural Networks and Hyperparameter Tuning
• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
Summary
• Monday 17.05: Watch Lecture 6 – Training Neural Networks
• Wednesday 19.05, 15:59: Submit Exercise 5
• Thursday 20.05: Tutorial 6 – Hyperparameter Tuning
See you next week