
1 SPARC Lab

Neuromorphic Computing in CMOS: Digital, Analog or Mixed-Signal?

Shreyas Sen, Ayan Biswas, Priyadarshini Panda, Kaushik Roy
ECE, Purdue University
Sep 27, 2016

2 SPARC Lab

Neuromorphic Computing: The Fundamental Question

Trading error resiliency for energy efficiency:
• Approximate neurons?
• Noisy neurons?
• Digital, analog, or mixed-signal?

The key design choice is the neuron architecture.

3 SPARC Lab

Von Neumann Computing: Digital vs. Analog

Digital vs. analog computation trade-offs [1].

[1] Sarpeshkar, Rahul. "Analog versus digital: extrapolating from electronics to neurobiology." Neural Computation 10.7 (1998): 1601-1638.

4 SPARC Lab

Neuron Architecture: Digital

• Multiplier + adder: a digital MAC (shown for n = 2 inputs), followed by thresholding.
• Transistor count grows with precision.
• At 8b precision: high static leakage.
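A minimal fixed-point sketch of the digital neuron above (multiplier + adder MAC, then thresholding). The bit-width, rounding scheme, and threshold value here are illustrative assumptions, not the slide's actual hardware:

```python
# Fixed-point sketch of a digital MAC neuron: b-bit multiplies,
# accumulation, then a hard threshold.  Bit-width, rounding, and
# threshold are illustrative assumptions.

def digital_neuron(inputs, weights, bits=8, threshold=0.0):
    """b-bit multiply-accumulate over n inputs, then a hard threshold."""
    scale = 1 << (bits - 1)                      # signed b-bit codes

    def quantize(x):
        # clamp to the representable range and round to the b-bit grid
        code = max(-scale, min(scale - 1, round(x * scale)))
        return code / scale                      # back to [-1, 1)

    acc = sum(quantize(x) * quantize(w) for x, w in zip(inputs, weights))
    return 1 if acc >= threshold else 0
```

With n = 2 as on the slide, `digital_neuron([0.5, -0.25], [0.5, 0.75])` fires, since 0.5·0.5 + (-0.25)·0.75 = 0.0625 ≥ 0.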

5 SPARC Lab

Neuron Architecture: Mixed-Signal

• Resistive load: no PMOS load, to ensure R_load ≠ f(I_total).
• Large-signal multiplication.

[Circuit: n transconductor branches (g_m, i_bias), each input V_1…V_n weighted by an N-bit binary-weighted (1X, 2X, …, 2^(N-1)X) DAC w_1…w_n, summing current into a shared R_load to produce V_out.]

V_out = σ( Σ_{k=1}^{n} w_k · g_m · V_k )
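The mixed-signal transfer function V_out = σ(Σ_k w_k g_m V_k) can be sketched numerically. Here tanh stands in for the saturating σ (an assumption; in the circuit the saturation comes from the load), and the g_m and R_load values are illustrative:

```python
import math

# Numeric sketch of V_out = sigma( sum_k w_k * g_m * V_k ).
# tanh models the saturating nonlinearity sigma (an assumption);
# gm and rload values are illustrative, not from the slide.

def ms_neuron(voltages, weights, gm=1e-3, rload=10e3):
    # each branch steers w_k * gm * V_k of current into the shared load
    i_sum = sum(w * gm * v for w, v in zip(weights, voltages))
    return math.tanh(i_sum * rload)              # soft saturation
```

For example, `ms_neuron([0.1], [1])` gives tanh(1) ≈ 0.76: a single branch at 0.1 V with unit weight steers 100 µA, developing 1 V across the 10 kΩ load before saturation.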

6 SPARC Lab

Comparison: Dig-N vs. MS-N

7 SPARC Lab

Error Resiliency vs. MS-N Noise

• Quantization noise
• Thermal noise
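The two noise sources can be contrasted in a toy model: quantization noise is deterministic rounding to a b-bit grid, while thermal noise is additive and random. All magnitudes below are illustrative assumptions:

```python
import random

# Toy comparison of the two noise sources: b-bit rounding
# (quantization noise) vs. additive Gaussian voltage noise
# (thermal noise).  All magnitudes are illustrative.

def quantize(x, bits):
    step = 2.0 / (1 << bits)          # b-bit codes spanning [-1, 1)
    return round(x / step) * step

def add_thermal(x, sigma_mv, rng=random):
    return x + rng.gauss(0.0, sigma_mv * 1e-3)

x = 0.377
err_3b = abs(quantize(x, 3) - x)      # coarse grid: large rounding error
err_8b = abs(quantize(x, 8) - x)      # fine grid: small rounding error
```

At 3b the grid step is 0.25, so 0.377 rounds to 0.5 (error 0.123); at 8b the step is ~0.0078 and the error drops to ~0.002.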

8 SPARC Lab

Conclusion

[Plot: Precision (y-axis, 3b to 8b) vs. Frequency (x-axis, around 1 MHz), partitioning the design space into regions where the Mixed-Signal Neuron (LS) and the Digital Neuron are each preferred.]

9 SPARC Lab

THANK YOU

10 SPARC Lab

Dig-N: Noise and BW vs. Power

11 SPARC Lab

Dig-N: Noise and BW vs. Power

12 SPARC Lab

CNN topology (MNIST): 28×28 input → 6@24×24 → 6@12×12 → 12@8×8 → 12@4×4 → FCN → 10 outputs
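The feature-map sizes in that pipeline are consistent with 5×5 valid convolutions and 2×2 non-overlapping pooling; the kernel and pool sizes are inferred, not stated on the slide. A quick check:

```python
# Check that 5x5 valid convolutions and 2x2 pooling reproduce the
# feature-map sizes on the slide (kernel/pool sizes are inferred).

def conv_out(n, k=5):
    return n - k + 1        # valid convolution, stride 1

def pool_out(n, p=2):
    return n // p           # non-overlapping p x p pooling

sizes = [28]
for layer in (conv_out, pool_out, conv_out, pool_out):
    sizes.append(layer(sizes[-1]))
# sizes == [28, 24, 12, 8, 4]
```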

13 SPARC Lab

Prec vs error FCN (I784, 1H 50, 2H 100, O10)

Trained with high precision (16)

Scaled down (161)

CE as QN increases

5b it can hold accuracy

MNIST CNN

MNIST FCN

CIFAR CNN

500nA 100nA 10nA 1nA

25.6 28.6 35.4

44.5

500nA 100nA 10nA 1nA

500nA 100nA 10nA 1nA


15 SPARC Lab

MNIST: FCN (input 784, hidden-1 50, hidden-2 100, output 10) and CNN, under thermal noise

• Baseline error is low (good training).
• Resilient to the introduction of thermal noise; at 50 mV (low SNR) the error diverges more.
• The CNN is better trained than the FCN (shared weights).
• More neurons and a bigger network give more error averaging.
• Includes retraining for 3b.

[Plot annotation: 12%]
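The "more neurons, bigger network, more error averaging" point follows from the ~1/√n drop in the spread of averaged independent noise. A Monte-Carlo sketch (the noise sigma, trial count, and seed are illustrative):

```python
import random
import statistics

# Sketch of the error-averaging argument: the spread of the mean of n
# independent noisy contributions falls roughly as 1/sqrt(n), so a
# bigger network averages out more thermal noise.  Sigma, trial count,
# and seed are illustrative.

def spread_of_mean(n, sigma=0.05, trials=2000, seed=0):
    rng = random.Random(seed)
    means = [statistics.fmean(rng.gauss(0.0, sigma) for _ in range(n))
             for _ in range(trials)]
    return statistics.pstdev(means)
```

Averaging 100 noisy contributions gives roughly one fifth the spread of averaging 4 (expected ratio √(4/100) = 0.2).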