III. Recurrent Neural Networks
A. The Hopfield Network
Typical Artificial Neuron
[Figure: a neuron receiving inputs through connection weights, with a threshold and an output]
Typical Artificial Neuron
[Figure: the weighted inputs are combined linearly into the net input (local field), which passes through the activation function]
Equations

Net input:
$$h_i = \left(\sum_{j=1}^{n} w_{ij} s_j\right) - \theta$$
$$\mathbf{h} = \mathbf{W}\mathbf{s} - \boldsymbol{\theta}$$

New neural state:
$$s_i' = \sigma(h_i)$$
$$\mathbf{s}' = \sigma(\mathbf{h})$$
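These two equations translate directly into code. A minimal sketch in Python with numpy (function and variable names are illustrative, not from the lecture):

```python
import numpy as np

def net_input(W, s, theta):
    """Local field of every neuron: h = W s - theta."""
    return W @ s - theta

def new_state(W, s, theta, sigma):
    """New neural state s' = sigma(h), applied elementwise."""
    return sigma(net_input(W, s, theta))
```

Exactly what sigma should return at h = 0 is the question taken up below.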
Hopfield Network
• Symmetric weights: $w_{ij} = w_{ji}$
• No self-action: $w_{ii} = 0$
• Zero threshold: $\theta = 0$
• Bipolar states: $s_i \in \{-1, +1\}$
• Discontinuous bipolar activation function:

$$\sigma(h) = \operatorname{sgn}(h) = \begin{cases} -1, & h < 0 \\ +1, & h > 0 \end{cases}$$
What to do about h = 0?
• There are several options:
  – σ(0) = +1
  – σ(0) = –1
  – σ(0) = –1 or +1 with equal probability
  – hᵢ = 0 ⇒ no state change (sᵢ′ = sᵢ)
• It makes little difference, but be consistent
• The last option is slightly preferable, since it is symmetric
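A sketch of that preferred tie rule in numpy (a hypothetical helper, not from the lecture):

```python
import numpy as np

def sgn_keep(h, s):
    """Bipolar sign of h, but where h == 0 keep the old state
    (the 'no state change' option for ties)."""
    return np.where(h > 0, 1, np.where(h < 0, -1, s))
```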
Positive Coupling
• Positive sense (sign)
• Large strength
Negative Coupling
• Negative sense (sign)
• Large strength
Weak Coupling
• Either sense (sign)
• Little strength
State = –1 & Local Field < 0
h < 0
State = –1 & Local Field > 0
h > 0
State Reverses
h > 0
State = +1 & Local Field > 0
h > 0
State = +1 & Local Field < 0
h < 0
State Reverses
h < 0
NetLogo Demonstration of Hopfield State Updating
Run Hopfield-update.nlogo
Hopfield Net as Soft Constraint Satisfaction System
• States of neurons as yes/no decisions
• Weights represent soft constraints between decisions
  – hard constraints must be respected
  – soft constraints have degrees of importance
• Decisions change to better respect constraints
• Is there an optimal set of decisions that best respects all constraints?
Demonstration of Hopfield Net Dynamics I
Run Hopfield-dynamics.nlogo
Convergence
• Does such a system converge to a stable state?
• Under what conditions does it converge?
• There is a sense in which each step relaxes the “tension” in the system
• But could a relaxation of one neuron lead to greater tension in other places?
Quantifying “Tension”
• If $w_{ij} > 0$, then $s_i$ and $s_j$ want to have the same sign ($s_i s_j = +1$)
• If $w_{ij} < 0$, then $s_i$ and $s_j$ want to have opposite signs ($s_i s_j = -1$)
• If $w_{ij} = 0$, their signs are independent
• Strength of interaction varies with $|w_{ij}|$
• Define the disharmony (“tension”) $D_{ij}$ between neurons i and j:
$$D_{ij} = -s_i w_{ij} s_j$$
  – $D_{ij} < 0$ ⇒ they are happy
  – $D_{ij} > 0$ ⇒ they are unhappy
Total Energy of System
The “energy” of the system is the total “tension” (disharmony) in it:

$$E\{\mathbf{s}\} = \sum_{ij} D_{ij} = -\sum_{ij} s_i w_{ij} s_j$$
$$= -\tfrac{1}{2} \sum_i \sum_{j \ne i} s_i w_{ij} s_j$$
$$= -\tfrac{1}{2} \sum_i \sum_j s_i w_{ij} s_j$$
$$= -\tfrac{1}{2}\, \mathbf{s}^{\mathsf T} \mathbf{W} \mathbf{s}$$

(the $j \ne i$ restriction can be dropped because $w_{ii} = 0$)
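A minimal numpy sketch of the energy, in both the pairwise-disharmony and the matrix form (names are illustrative):

```python
import numpy as np

def energy(W, s):
    """E{s} = -1/2 s^T W s (zero thresholds)."""
    return -0.5 * s @ W @ s

def energy_from_disharmony(W, s):
    """Sum of D_ij = -s_i w_ij s_j over distinct pairs i < j."""
    n = len(s)
    return sum(-s[i] * W[i, j] * s[j]
               for i in range(n) for j in range(i + 1, n))
```

With symmetric weights and zero diagonal, the two give the same number.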
Review of Some Vector Notation

$$\mathbf{x} = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} = (x_1 \;\cdots\; x_n)^{\mathsf T} \quad \text{(column vectors)}$$

$$\mathbf{x}^{\mathsf T}\mathbf{y} = \sum_{i=1}^{n} x_i y_i = \mathbf{x} \cdot \mathbf{y} \quad \text{(inner product)}$$

$$\mathbf{x}\mathbf{y}^{\mathsf T} = \begin{pmatrix} x_1 y_1 & \cdots & x_1 y_n \\ \vdots & \ddots & \vdots \\ x_m y_1 & \cdots & x_m y_n \end{pmatrix} \quad \text{(outer product)}$$

$$\mathbf{x}^{\mathsf T}\mathbf{M}\mathbf{y} = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i M_{ij} y_j \quad \text{(quadratic form)}$$
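In numpy these constructs are one-liners (a quick illustration, not part of the slides):

```python
import numpy as np

x = np.array([1.0, -1.0, 1.0])
y = np.array([2.0, 0.0, 1.0])
M = np.diag([1.0, 2.0, 3.0])

inner = x @ y            # x^T y: a scalar
outer = np.outer(x, y)   # x y^T: a 3x3 matrix
quad = x @ M @ y         # x^T M y: a quadratic form, also a scalar
```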
Another View of Energy
The energy measures the number of neurons whose states are in disharmony with their local fields (i.e., of opposite sign):

$$E\{\mathbf{s}\} = -\tfrac{1}{2} \sum_i \sum_j s_i w_{ij} s_j$$
$$= -\tfrac{1}{2} \sum_i s_i \left(\sum_j w_{ij} s_j\right)$$
$$= -\tfrac{1}{2} \sum_i s_i h_i$$
$$= -\tfrac{1}{2}\, \mathbf{s}^{\mathsf T} \mathbf{h}$$
Do State Changes Decrease Energy?
• Suppose that neuron k changes state
• Change of energy:

$$\Delta E = E\{\mathbf{s}'\} - E\{\mathbf{s}\} = -\sum_{ij} s_i' w_{ij} s_j' + \sum_{ij} s_i w_{ij} s_j$$
$$= -s_k' \sum_{j \ne k} w_{kj} s_j + s_k \sum_{j \ne k} w_{kj} s_j$$
$$= -(s_k' - s_k) \sum_{j \ne k} w_{kj} s_j$$
$$= -\Delta s_k h_k < 0$$

(the sums run over pairs of neurons; $\Delta E < 0$ because the update rule gives $\Delta s_k$ the same sign as $h_k$)
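The identity ΔE = –Δsₖ hₖ is easy to verify numerically; a sketch under the stated assumptions (symmetric weights, zero diagonal):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2           # symmetric weights
np.fill_diagonal(W, 0)      # no self-action

energy = lambda v: -0.5 * v @ W @ v
s = rng.choice([-1, 1], size=n).astype(float)

k = 3                       # flip neuron k and compare both sides
s_new = s.copy()
s_new[k] = -s_new[k]
h_k = W[k] @ s              # local field of neuron k
assert np.isclose(energy(s_new) - energy(s),
                  -(s_new[k] - s[k]) * h_k)
```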
Energy Does Not Increase
• In each step in which a neuron is considered for update:
  E{s(t + 1)} – E{s(t)} ≤ 0
• Energy cannot increase
• Energy decreases if any neuron changes
• Must it stop?
Proof of Convergence in Finite Time
• There is a minimum possible energy:
  – The number of possible states $\mathbf{s} \in \{-1, +1\}^n$ is finite
  – Hence $E_{\min} = \min\,\{E(\mathbf{s}) \mid \mathbf{s} \in \{\pm 1\}^n\}$ exists
• Must show it is reached in a finite number of steps
Steps are of a Certain Minimum Size

$$h_{\min} = \min\left\{\, |h_k| \;\middle|\; h_k = \sum_{j \ne k} w_{kj} s_j \,\wedge\, \mathbf{s} \in \{\pm 1\}^n \right\}$$

$$|\Delta E| = 2\,|h_k| \ge 2\,h_{\min}$$
Conclusion
• If we do asynchronous updating, the Hopfield net must reach a stable, minimum energy state in a finite number of updates
• This does not imply that it is a global minimum
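Putting the pieces together, a minimal asynchronous-update loop (a sketch; the NetLogo demos are the lecture's actual implementations):

```python
import numpy as np

def settle(W, s, rng):
    """Asynchronously update until no neuron wants to change;
    guaranteed to halt in a stable, locally minimal-energy state."""
    s = s.copy()
    while True:
        changed = False
        for k in rng.permutation(len(s)):
            h = W[k] @ s
            if h != 0 and np.sign(h) != s[k]:
                s[k] = 1 if h > 0 else -1
                changed = True
        if not changed:
            return s
```

Usage would be, e.g., `s_stable = settle(W, s0, np.random.default_rng(0))`.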
Lyapunov Functions
• A way of showing the convergence of discrete- or continuous-time dynamical systems
• For a discrete-time system:
  – need a Lyapunov function E (“energy” of the state)
  – E is bounded below (E{s} > Emin)
  – ΔE ≤ (ΔE)max < 0 (energy decreases a certain minimum amount each step)
  – then the system will converge in finite time
• Problem: finding a suitable Lyapunov function
Example Limit Cycle with Synchronous Updating
[Figure: two mutually coupled neurons, both couplings w > 0]
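The cycle is easy to reproduce: two neurons with positive coupling, started in opposite states and updated synchronously, swap forever (a sketch consistent with the slide; exact figure details assumed):

```python
import numpy as np

W = np.array([[0, 1],
              [1, 0]])              # two neurons, w > 0 both ways
s = np.array([1, -1])               # start in opposite states

for t in range(4):
    print(t, s)                     # (1,-1), (-1,1), (1,-1), ... period 2
    s = np.where(W @ s > 0, 1, -1)  # synchronous update (h is never 0 here)
```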
The Hopfield Energy Function is Even
• A function f is odd if f(–x) = –f(x) for all x
• A function f is even if f(–x) = f(x) for all x
• Observe:

$$E\{-\mathbf{s}\} = -\tfrac{1}{2}(-\mathbf{s})^{\mathsf T}\mathbf{W}(-\mathbf{s}) = -\tfrac{1}{2}\,\mathbf{s}^{\mathsf T}\mathbf{W}\mathbf{s} = E\{\mathbf{s}\}$$
Conceptual Picture of Descent on Energy Surface
(fig. from Solé & Goodwin)
Energy Surface
(fig. from Haykin Neur. Netw.)
Energy Surface + Flow Lines
(fig. from Haykin Neur. Netw.)
Flow Lines: Basins of Attraction
(fig. from Haykin Neur. Netw.)
Bipolar State Space
Basins in Bipolar State Space
(energy-decreasing paths)
Demonstration of Hopfield Net Dynamics II
Run initialized Hopfield.nlogo
Storing Memories as Attractors
(fig. from Solé & Goodwin)
Example of Pattern Restoration
(sequence of figures from Arbib 1995)
Example of Pattern Completion
(sequence of figures from Arbib 1995)
Example of Association
(sequence of figures from Arbib 1995)
Applications of Hopfield Memory
• Pattern restoration
• Pattern completion
• Pattern generalization
• Pattern association
Hopfield Net for Optimization and for Associative Memory
• For optimization:
  – we know the weights (couplings)
  – we want to know the minima (solutions)
• For associative memory:
  – we know the minima (retrieval states)
  – we want to know the weights
Hebb’s Rule
“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
—Donald Hebb (The Organization of Behavior, 1949, p. 62)
Example of Hebbian Learning: Pattern Imprinted
Example of Hebbian Learning: Partial Pattern Reconstruction
Mathematical Model of Hebbian Learning for One Pattern

$$\text{Let } W_{ij} = \begin{cases} x_i x_j, & \text{if } i \ne j \\ 0, & \text{if } i = j \end{cases}$$

Since $x_i x_i = x_i^2 = 1$,
$$\mathbf{W} = \mathbf{x}\mathbf{x}^{\mathsf T} - \mathbf{I}$$

For simplicity, we will include self-coupling:
$$\mathbf{W} = \mathbf{x}\mathbf{x}^{\mathsf T}$$
A Single Imprinted Pattern is a Stable State
• Suppose $\mathbf{W} = \mathbf{x}\mathbf{x}^{\mathsf T}$
• Then $\mathbf{h} = \mathbf{W}\mathbf{x} = \mathbf{x}\mathbf{x}^{\mathsf T}\mathbf{x} = n\mathbf{x}$, since
$$\mathbf{x}^{\mathsf T}\mathbf{x} = \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} (\pm 1)^2 = n$$
• Hence, if the initial state is $\mathbf{s} = \mathbf{x}$, then the new state is $\mathbf{s}' = \operatorname{sgn}(n\mathbf{x}) = \mathbf{x}$
• There may be other stable states (e.g., $-\mathbf{x}$)
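A numerical check of this argument (sketch; numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16
x = rng.choice([-1, 1], size=n)

W = np.outer(x, x)                     # Hebbian imprint, self-coupling kept
assert np.array_equal(W @ x, n * x)    # h = W x = n x

s_new = np.where(W @ x > 0, 1, -1)     # sgn(n x) = x
assert np.array_equal(s_new, x)        # the imprint is stable

s_rev = np.where(W @ (-x) > 0, 1, -1)  # ... and so is its reversal -x
assert np.array_equal(s_rev, -x)
```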
Questions
• How big is the basin of attraction of the imprinted pattern?
• How many patterns can be imprinted?
• Are there unneeded spurious stable states?
• These issues will be addressed in the context of multiple imprinted patterns
Imprinting Multiple Patterns
• Let $\mathbf{x}^1, \mathbf{x}^2, \ldots, \mathbf{x}^p$ be patterns to be imprinted
• Define the sum-of-outer-products matrix:

$$W_{ij} = \frac{1}{n} \sum_{k=1}^{p} x_i^k x_j^k$$
$$\mathbf{W} = \frac{1}{n} \sum_{k=1}^{p} \mathbf{x}^k (\mathbf{x}^k)^{\mathsf T}$$
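A sketch of this imprinting rule (self-coupling terms kept, as on the earlier slide; names are illustrative):

```python
import numpy as np

def imprint(X):
    """Sum-of-outer-products weights W = (1/n) sum_k x^k (x^k)^T.
    X is a (p, n) array whose rows are the bipolar patterns."""
    p, n = X.shape
    return (X.T @ X) / n
```

Here `X.T @ X` computes the whole sum of outer products in a single matrix product.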
Definition of Covariance
Consider samples $(x^1, y^1), (x^2, y^2), \ldots, (x^N, y^N)$

Let $\bar{x} = \overline{x^k}$ and $\bar{y} = \overline{y^k}$ (averages over the samples)

Covariance of x and y values:
$$C_{xy} = \overline{\left(x^k - \bar{x}\right)\left(y^k - \bar{y}\right)}$$
$$= \overline{x^k y^k - \bar{x}\, y^k - x^k \bar{y} + \bar{x}\bar{y}}$$
$$= \overline{x^k y^k} - \bar{x}\,\overline{y^k} - \overline{x^k}\,\bar{y} + \bar{x}\bar{y}$$
$$= \overline{x^k y^k} - \bar{x}\bar{y} - \bar{x}\bar{y} + \bar{x}\bar{y}$$
$$C_{xy} = \overline{x^k y^k} - \bar{x}\bar{y}$$
Weights & the Covariance Matrix
Sample pattern vectors: $\mathbf{x}^1, \mathbf{x}^2, \ldots, \mathbf{x}^p$

Covariance of ith and jth components:
$$C_{ij} = \overline{x_i^k x_j^k} - \bar{x}_i \cdot \bar{x}_j$$

If $\forall i: \bar{x}_i = 0$ (±1 equally likely in all positions):
$$C_{ij} = \overline{x_i^k x_j^k} = \frac{1}{p} \sum_{k=1}^{p} x_i^k x_j^k$$
$$\therefore\; \mathbf{W} = \frac{p}{n}\,\mathbf{C}$$
Characteristics of Hopfield Memory
• Distributed (“holographic”)
  – every pattern is stored in every location (weight)
• Robust
  – correct retrieval in spite of noise or error in patterns
  – correct operation in spite of considerable weight damage or noise
Demonstration of Hopfield Net
Run Malasri Hopfield Demo
Stability of Imprinted Memories
• Suppose the state is one of the imprinted patterns $\mathbf{x}^m$
• Then:

$$\mathbf{h} = \mathbf{W}\mathbf{x}^m = \frac{1}{n}\left[\sum_k \mathbf{x}^k (\mathbf{x}^k)^{\mathsf T}\right]\mathbf{x}^m = \frac{1}{n}\sum_k \mathbf{x}^k (\mathbf{x}^k)^{\mathsf T}\mathbf{x}^m$$
$$= \frac{1}{n}\,\mathbf{x}^m(\mathbf{x}^m)^{\mathsf T}\mathbf{x}^m + \frac{1}{n}\sum_{k \ne m}\mathbf{x}^k(\mathbf{x}^k)^{\mathsf T}\mathbf{x}^m$$
$$= \mathbf{x}^m + \frac{1}{n}\sum_{k \ne m}\left(\mathbf{x}^k \cdot \mathbf{x}^m\right)\mathbf{x}^k$$
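The signal-plus-crosstalk decomposition can be checked directly (sketch, numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, m = 100, 5, 0
X = rng.choice([-1, 1], size=(p, n))      # imprinted patterns x^1..x^p
W = (X.T @ X) / n                         # sum-of-outer-products weights

crosstalk = sum((X[k] @ X[m]) / n * X[k]
                for k in range(p) if k != m)
assert np.allclose(W @ X[m], X[m] + crosstalk)   # h = x^m + crosstalk
```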
Interpretation of Inner Products
• $\mathbf{x}^k \cdot \mathbf{x}^m = n$ if they are identical
  – highly correlated
• $\mathbf{x}^k \cdot \mathbf{x}^m = -n$ if they are complementary
  – highly correlated (reversed)
• $\mathbf{x}^k \cdot \mathbf{x}^m = 0$ if they are orthogonal
  – largely uncorrelated
• $\mathbf{x}^k \cdot \mathbf{x}^m$ measures the crosstalk between patterns k and m
Cosines and Inner Products

$$\mathbf{u} \cdot \mathbf{v} = \|\mathbf{u}\|\,\|\mathbf{v}\| \cos\theta_{uv}$$

[Figure: vectors u and v separated by angle $\theta_{uv}$]

If $\mathbf{u}$ is bipolar, then $\|\mathbf{u}\|^2 = \mathbf{u}\cdot\mathbf{u} = n$.
Hence, $\mathbf{u}\cdot\mathbf{v} = \sqrt{n}\,\sqrt{n}\,\cos\theta_{uv} = n\cos\theta_{uv}$

Hence,
$$\mathbf{h} = \mathbf{x}^m + \sum_{k \ne m} \mathbf{x}^k \cos\theta_{km}$$
Conditions for Stability

Stability of the entire pattern:
$$\mathbf{x}^m = \operatorname{sgn}\left(\mathbf{x}^m + \sum_{k \ne m} \mathbf{x}^k \cos\theta_{km}\right)$$

Stability of a single bit:
$$x_i^m = \operatorname{sgn}\left(x_i^m + \sum_{k \ne m} x_i^k \cos\theta_{km}\right)$$
Sufficient Conditions for Instability (Case 1)

Suppose $x_i^m = -1$. Then unstable if:
$$(-1) + \sum_{k \ne m} x_i^k \cos\theta_{km} > 0$$
$$\sum_{k \ne m} x_i^k \cos\theta_{km} > 1$$
Sufficient Conditions for Instability (Case 2)

Suppose $x_i^m = +1$. Then unstable if:
$$(+1) + \sum_{k \ne m} x_i^k \cos\theta_{km} < 0$$
$$\sum_{k \ne m} x_i^k \cos\theta_{km} < -1$$
Sufficient Conditions for Stability

$$\left|\sum_{k \ne m} x_i^k \cos\theta_{km}\right| \le 1$$

The crosstalk with the sought pattern must be sufficiently small
Capacity of Hopfield Memory
• Depends on the patterns imprinted
• If orthogonal, $p_{\max} = n$
  – but every state is stable ⇒ trivial basins
• So $p_{\max} < n$
• Let load parameter $\alpha = p / n$
Single Bit Stability Analysis
• For simplicity, suppose the $\mathbf{x}^k$ are random
• Then the $\mathbf{x}^k \cdot \mathbf{x}^m$ are sums of n random ±1s
  – binomial distribution ≈ Gaussian
  – in range –n, …, +n
  – with mean = 0 and variance σ² = n
• Probability that the sum exceeds t:
$$\frac{1}{2}\left[1 - \operatorname{erf}\left(\frac{t}{\sqrt{2n}}\right)\right]$$

[See “Review of Gaussian (Normal) Distributions” on course website]
Approximation of Probability

Let the crosstalk be
$$C_i^m = \frac{1}{n} \sum_{k \ne m} x_i^k \left(\mathbf{x}^k \cdot \mathbf{x}^m\right)$$

We want
$$\Pr\{C_i^m > 1\} = \Pr\{nC_i^m > n\}$$

Note:
$$nC_i^m = \sum_{\substack{k=1 \\ k \ne m}}^{p} \sum_{j=1}^{n} x_i^k x_j^k x_j^m$$

a sum of $n(p-1) \approx np$ random ±1s, with variance $\sigma^2 = np$
Probability of Bit Instability

$$\Pr\{nC_i^m > n\} = \frac{1}{2}\left[1 - \operatorname{erf}\left(\frac{n}{\sqrt{2np}}\right)\right] = \frac{1}{2}\left[1 - \operatorname{erf}\left(\sqrt{\frac{n}{2p}}\right)\right] = \frac{1}{2}\left[1 - \operatorname{erf}\left(\sqrt{\frac{1}{2\alpha}}\right)\right]$$

(fig. from Hertz & al. Intr. Theory Neur. Comp.)
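The final expression reproduces the table on the next slide; a sketch using Python's math.erf:

```python
from math import erf, sqrt

def p_error(alpha):
    """Single-bit instability: (1/2)[1 - erf(sqrt(1/(2 alpha)))]."""
    return 0.5 * (1 - erf(sqrt(1 / (2 * alpha))))

for alpha in (0.105, 0.138, 0.185, 0.37, 0.61):
    print(f"alpha = {alpha:5.3f}   P_error ~ {p_error(alpha):.2%}")
```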
Tabulated Probability of Single-Bit Instability

| P_error | α     |
|---------|-------|
| 0.1%    | 0.105 |
| 0.36%   | 0.138 |
| 1%      | 0.185 |
| 5%      | 0.37  |
| 10%     | 0.61  |

(table from Hertz & al. Intr. Theory Neur. Comp.)
Spurious Attractors
• Mixture states:
  – sums or differences of odd numbers of retrieval states
  – number increases combinatorially with p
  – shallower, smaller basins
  – basins of mixtures swamp basins of retrieval states ⇒ overload
  – useful as combinatorial generalizations?
  – self-coupling generates spurious attractors
• Spin-glass states:
  – not correlated with any finite number of imprinted patterns
  – occur beyond overload because weights are effectively random
Basins of Mixture States

[Figure: a mixture state $\mathbf{x}^{\mathrm{mix}}$ lying between the retrieval states $\mathbf{x}^{k_1}$, $\mathbf{x}^{k_2}$, $\mathbf{x}^{k_3}$]

$$x_i^{\mathrm{mix}} = \operatorname{sgn}\left(x_i^{k_1} + x_i^{k_2} + x_i^{k_3}\right)$$
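A three-pattern mixture state in code (a sketch; the sum of an odd number of ±1s is never zero, so sgn is well defined):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x1, x2, x3 = rng.choice([-1, 1], size=(3, n))

x_mix = np.sign(x1 + x2 + x3)   # componentwise majority of the three imprints
```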
Fraction of Unstable Imprints (n = 100)
(fig. from Bar-Yam)
Number of Stable Imprints (n = 100)
(fig. from Bar-Yam)
Number of Imprints with Basins of Indicated Size (n = 100)
(fig. from Bar-Yam)
Summary of Capacity Results
• Absolute limit: $p_{\max} < \alpha_c n = 0.138\,n$
• If a small number of errors in each pattern is permitted: $p_{\max} \propto n$
• If all or most patterns must be recalled perfectly: $p_{\max} \propto n / \log n$
• Recall: all this analysis is based on random patterns
• Unrealistic, but sometimes can be arranged
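These results can be probed empirically, in the spirit of the Bar-Yam figures above: imprint p random patterns in n = 100 neurons and count how many imprinted bits would flip on a first update (a sketch, not the lecture's code):

```python
import numpy as np

def fraction_unstable_bits(n, p, trials=20, seed=4):
    """Fraction of imprinted bits whose local field disagrees in sign."""
    rng = np.random.default_rng(seed)
    bad = 0
    for _ in range(trials):
        X = rng.choice([-1, 1], size=(p, n))
        W = (X.T @ X) / n          # sum-of-outer-products imprinting
        H = X @ W                  # row m holds the local fields for x^m
        bad += np.sum(np.sign(H) != X)
    return bad / (trials * p * n)

for p in (5, 10, 14, 20, 30):      # alpha = 0.05 ... 0.30
    print(f"p = {p:2d} (alpha = {p/100:.2f}): "
          f"{fraction_unstable_bits(100, p):.3%} unstable bits")
```

The measured fraction should rise sharply as α approaches the 0.138 regime discussed above.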