networks

Uploaded by chione on 22-Jan-2016


DESCRIPTION

Networks (PowerPoint PPT presentation)

TRANSCRIPT

Page 1: Networks
Page 2: Networks

Case 1: The time constant is long ==> the cortical cell integrates "nicely". To prevent it from firing at too high a frequency, the threshold must be high (about 100-200 times the EPSP amplitude).

Case 2: The time constant is short ==> the cortical cell integrates badly. Instead, it now reacts sensitively to the coincidence between different inputs. As soon as many inputs arrive simultaneously, the membrane potential rises steeply and the cell fires. If the input is not synchronized, the cell remains silent.
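The two cases can be illustrated with a toy leaky-integrator simulation (a minimal sketch: the time constants, threshold, and number of inputs are illustrative, far smaller than the slide's 100-200x EPSP figure):

```python
import numpy as np

def lif_spikes(input_times, tau, threshold, dt=0.1, t_max=50.0, epsp=1.0):
    """Leaky integrator: V decays with time constant tau (ms); each input
    event adds one EPSP-sized jump. Returns the number of spikes."""
    steps = int(t_max / dt)
    inputs = np.zeros(steps)
    for t in input_times:                       # one EPSP per input event
        inputs[int(round(t / dt))] += epsp
    v, spikes = 0.0, 0
    for i in range(steps):
        v += -v / tau * dt + inputs[i]          # leak plus synaptic input
        if v >= threshold:                      # threshold crossing -> spike
            spikes += 1
            v = 0.0                             # reset
    return spikes

spread = [float(t) for t in range(0, 40, 4)]    # 10 inputs spread over 36 ms
synchronous = [20.0] * 10                       # the same 10 inputs at once

# Case 1: long time constant -> the cell sums inputs regardless of timing.
print(lif_spikes(spread, tau=100.0, threshold=5.0),
      lif_spikes(synchronous, tau=100.0, threshold=5.0))   # 1 1

# Case 2: short time constant -> only synchronous input drives a spike.
print(lif_spikes(spread, tau=1.0, threshold=5.0),
      lif_spikes(synchronous, tau=1.0, threshold=5.0))     # 0 1
```

With the long time constant the cell fires for both input patterns; with the short one it fires only when the inputs coincide, which is the coincidence-detection reading of Case 2.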

Pages 3-22: Networks (no transcribed text)

Assume we have a smart learning algorithm which produces weights A such that the associations (x1, y1) and (x2, y2) are learnt. Then, however, the association for x3 is frozen at y3: any attempt to dissociate x3 from y3 would inevitably also distort the associations (x1, y1) and (x2, y2).

Conclusion: These networks allow independent associations ONLY for linearly independent input vectors. In an n-dimensional network, at most n input vectors can be linearly independent! This limits the usefulness of such networks tremendously.
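The frozen-association argument can be reproduced numerically (a minimal sketch with made-up 2-dimensional vectors; the least-squares pseudoinverse stands in for the "smart learning algorithm"):

```python
import numpy as np

# Columns are the input vectors x1, x2 and desired outputs y1, y2.
X = np.array([[1.0, 0.0],
              [0.0, 1.0]])
Y = np.array([[1.0, 2.0],
              [3.0, 4.0]])

W = Y @ np.linalg.pinv(X)            # least-squares weights: W @ xi == yi

x3 = X[:, 0] + X[:, 1]               # x3 is linearly DEPENDENT on x1, x2
y3 = W @ x3
print(y3)                            # [3. 7.] == y1 + y2: frozen, not free

# Demanding a different association for x3 (here: map it to zero)
# distorts the recall of x1 and x2:
X2 = np.column_stack([X, x3])
Y2 = np.column_stack([Y, [0.0, 0.0]])
W2 = Y2 @ np.linalg.pinv(X2)         # best compromise over all 3 pairs
print(W2 @ X[:, 0])                  # no longer y1 = [1. 3.]
```

Because x3 lies in the span of x1 and x2, any linear map is forced to send it to the corresponding combination of y1 and y2; asking for anything else can only be traded off against the first two associations.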

How to solve this problem? Introduce non-linearities and more layers!

The Perceptron Problem (Minsky and Papert, 1969)
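The classic instance is XOR. A sketch of both halves of the result: a spot check over a coarse weight grid finds no single threshold unit that reproduces XOR (the impossibility for any weights is Minsky and Papert's result), while a hand-wired two-layer network with illustrative weights succeeds:

```python
import numpy as np

def step(z):                         # linear threshold unit
    return (z > 0).astype(float)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
xor = np.array([0.0, 1.0, 1.0, 0.0])

# Spot check: no single perceptron w.x + b on this grid reproduces XOR
# (and, by the 1969 result, none exists for ANY real weights).
grid = np.linspace(-2.0, 2.0, 9)
separable = any(
    np.array_equal(step(X @ np.array([w1, w2]) + b), xor)
    for w1 in grid for w2 in grid for b in grid)
print(separable)                     # False

# Two layers fix it: hidden units compute OR and AND of the inputs,
# and the output fires for OR-but-not-AND (hand-set weights).
hidden = step(X @ np.array([[1.0, 1.0],
                            [1.0, 1.0]]) + np.array([-0.5, -1.5]))
out = step(hidden @ np.array([1.0, -2.0]) - 0.5)
print(out)                           # [0. 1. 1. 0.] == XOR
```

The hidden layer re-codes the four input patterns so that the classes become linearly separable, which is exactly what adding non-linearities and layers buys.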

Page 24: Networks

Have you seen this problem before???

Pages 25-33: Networks (no transcribed text)
Page 34: Networks

Not as good! Memory too full!

If an input pattern that has previously been learned is incomplete or faulty, an auto-associator can complete or restore it.
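The completion property can be sketched with a small Hopfield-style auto-associator (a minimal sketch; the pattern size, the orthogonal pattern choice, and the synchronous update scheme are illustrative):

```python
import numpy as np

# Two orthogonal +/-1 patterns of length 64 (chosen so their dot product
# is exactly 0, which makes the recall below clean and deterministic).
p0 = np.tile([1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0], 8)
p1 = np.tile([1.0, -1.0], 32)

# Hebbian outer-product storage, no self-connections.
N = len(p0)
W = (np.outer(p0, p0) + np.outer(p1, p1)) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=5):
    """Synchronous sign updates; each stored pattern is a fixed point."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1.0, -1.0)
    return state

noisy = p0.copy()
noisy[:10] *= -1                       # corrupt 10 of the 64 bits
restored = recall(noisy)
print(np.array_equal(restored, p0))    # True: the pattern is restored
```

Capacity is limited, though: storing too many (or too correlated) patterns degrades recall, which is the point of the "memory too full" remark above.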