Neural Networks and Fuzzy Systems
Hopfield Network

• A feedback neural network has feedback loops from its outputs to its inputs. The presence of such loops has a profound impact on the learning capability of the network.
• After applying a new input, the network output is calculated and fed back to adjust the input. This process is repeated until the output becomes constant.
• John Hopfield (1982)
  – Associative memory via artificial neural networks
  – Optimisation
Hopfield Network

[Figure: a single-layer network of n neurons 1, 2, …, i, …, n; each output y1, y2, …, yi, …, yn is fed back through the weights wij to the inputs x1(0), x2(0), …, xi(0), …, xn(0).]

The update rule is

  yi(t) = sgn( Σj=1..n wij xj(t) ),   xj(t+1) = yj(t)

where

  sgn(x) = 1 if x ≥ 0; −1 if x < 0

It is a dynamic system: x(0) → y(0) → x(1) → y(1) → … → y*
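As a sketch, the feedback dynamics above can be written in a few lines of Python (an illustration only; `W` is assumed to be a Hopfield weight matrix and states are ±1 vectors):

```python
import numpy as np

def sgn(v):
    """Sign function with the slide's convention: sgn(x) = 1 if x >= 0, else -1."""
    return np.where(v >= 0, 1, -1)

def update(W, x):
    """One feedback step: y(t) = sgn(W x(t)); the output is fed back as x(t+1)."""
    return sgn(W @ x)

def run_until_stable(W, x, max_iters=100):
    """Iterate x(0) -> y(0) -> x(1) -> y(1) -> ... until the state stops changing."""
    for _ in range(max_iters):
        y = update(W, x)
        if np.array_equal(y, x):
            return y  # fixed point y*
        x = y
    return x
```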
Attractor

• If a state x(t) is in a region S and, as t → ∞, x(t) → x*, then S is the attractive region.
  – If x* = the desired state, x* is an attractor.
  – If x* ≠ the desired state, x* is a spurious attractor.

[Figure: a trajectory x(t) inside the region S converging to x*.]
Associative memory

• Nature of associative memory:
  – part of the information is given
  – the rest of the pattern is recalled
• Hopfield networks can be used as associative memory.
  – Design the weights W so that x* = the memorised patterns.
  – More than one pattern can be stored; the capacity increases with the number of neurons.

[Figure: different initial states x(0) converging to the stored pattern x*.]
Analogy with Optimisation

• The location of the bottom of the bowl (X0) represents the stored pattern.
• The ball's initial position represents the partial knowledge.
• On a corrugated surface, we can store {X1, X2, …, Xn} as memories, and recall the one closest to the initial state.
• Hopfield networks can also be used for optimisation:
  1) Define an energy E such that an attractor minimises E.
  2) Difference from associative memory: we expect one (or few) attractors but a large attractive region.
Two Types of Associative Memory

• Autoassociative memory
  – Pattern Ii: Ii + Δ → Ii
• Heteroassociative memory
  – Pattern pairs Ii → yi: Ii + Δ → yi
• Hopfield networks are used as autoassociative memory.
Hebbian Rule

• Original rule proposed by Hebb (1949) in The Organization of Behavior:
  "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
  That is, the correlation of activity between two cells is reinforced by increasing the synaptic strength between them.
Hebbian Rule

If a neuron α and a neuron β are "on" at the same time, their synaptic connection is strengthened. The next time one of them is activated, it will tend to activate the other.

[Figure: two mutually connected neurons α and β.]
Hebbian Rule

In other words:
1. If the two neurons on either side of a synapse (connection) are activated simultaneously (i.e. synchronously), then the strength of that synapse is selectively increased. (Activation ↑)
This rule is often supplemented by:
2. If the two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated. (Activation ↓)

• Δwij = yi xj
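As a minimal sketch (the function name is illustrative), the rule Δwij = yi xj is just an outer product of the output and input vectors:

```python
import numpy as np

def hebbian_update(W, x, y, lr=1.0):
    """Hebbian rule: W_ij <- W_ij + lr * y_i * x_j (co-active pairs are strengthened)."""
    return W + lr * np.outer(y, x)
```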
Synaptic weights in Hopfield Networks

• m patterns:

  V1 = [v1^1, v2^1, …, vn^1]^T;  V2 = [v1^2, v2^2, …, vn^2]^T;  …;  Vm = [v1^m, v2^m, …, vn^m]^T

• Add the individual weights together:

  wij = Σk=1..m vi^k vj^k  if i ≠ j;   wij = 0  if i = j  (to avoid self-feedback)

• In matrix form, it is the outer product of the patterns:

  W = Σk=1..m Vk (Vk)^T − mI

where I is the n×n identity matrix, and superscript T denotes a matrix transpose.
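The outer-product rule above can be sketched in Python (the function name is illustrative; patterns are rows of ±1):

```python
import numpy as np

def hopfield_weights(patterns):
    """W = sum_k V^k (V^k)^T - m I : outer-product storage with zero self-feedback."""
    patterns = np.asarray(patterns)
    m, n = patterns.shape
    W = np.zeros((n, n), dtype=int)
    for v in patterns:
        W += np.outer(v, v)        # add each pattern's outer product
    W -= m * np.eye(n, dtype=int)  # subtract mI to zero the diagonal
    return W
```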
Associative Memory of Hopfield Networks

• If V1, …, Vm are orthogonal, i.e. (Vi)^T Vj = 0 for i ≠ j, then Vl → Vl, l = 1..m, if n > m:

  W Vl = Σk=1..m Vk (Vk)^T Vl − m Vl = n Vl − m Vl = (n − m) Vl

  so, if n > m,  Vl(t+1) = sgn(W Vl(t)) = Vl(t).

• If V1, …, Vm are not orthogonal:

  W Vl = Σk=1..m Vk (Vk)^T Vl − m Vl = (n − m) Vl + Σk≠l Vk (Vk)^T Vl

  The second term is the interference from the other patterns; it should be weaker than (n − m).
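A quick numerical check of the orthogonal case (a sketch using two orthogonal ±1 patterns of my own choosing, with n = 4 and m = 2):

```python
import numpy as np

# Two orthogonal +/-1 patterns (V1 . V2 = 0) in n = 4 dimensions, m = 2 patterns.
V1 = np.array([1, 1, 1, 1])
V2 = np.array([1, -1, 1, -1])
n, m = 4, 2

W = np.outer(V1, V1) + np.outer(V2, V2) - m * np.eye(n, dtype=int)

# For orthogonal patterns, W V_l = (n - m) V_l, so sgn(W V_l) = V_l whenever n > m.
```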
Storage Capacity

• As the number of patterns (m) increases, the chance of accurate storage must decrease.
• Hopfield's empirical work in 1982:
  – About half of the memories were stored accurately in a net of N nodes if m = 0.15N.
• McEliece's analysis in 1987:
  – If we require almost all of the required memories to be stored accurately, then the maximum number of patterns m is N/(2 ln N).
  – For N = 100, m = 11.
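The N/(2 ln N) bound is easy to evaluate (the helper name is illustrative):

```python
import math

def max_patterns(N):
    """McEliece et al. (1987) bound: m_max = N / (2 ln N) for near-perfect recall."""
    return N / (2 * math.log(N))
```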
Limitations of Hopfield Net

• The number of patterns that can be stored and accurately recalled is severely limited.
  – If too many patterns are stored, the net may converge to a novel spurious pattern: an output that matches no stored pattern.
• An exemplar pattern will be unstable if it shares many bits in common with another exemplar pattern.
Example: Location Recall

In front of a door: ultrasonic sensors give V1 = [1, 1, 1].
Through the door: ultrasonic sensors give V2 = [−1, −1, −1].
An example of memorization

• Memorize the two states, (1, 1, 1) and (−1, −1, −1):

  Y1 = [1, 1, 1]^T,   Y2 = [−1, −1, −1]^T

• Transposed form of these vectors:

  Y1^T = [1 1 1],   Y2^T = [−1 −1 −1]

• The 3 × 3 identity matrix is:

  I = | 1 0 0 |
      | 0 1 0 |
      | 0 0 1 |
Example Cont'd…

• The weight matrix is determined as follows:

  W = Y1 Y1^T + Y2 Y2^T − 2I

• So,

  W = | 1 1 1 |   | 1 1 1 |     | 1 0 0 |   | 0 2 2 |
      | 1 1 1 | + | 1 1 1 | − 2 | 0 1 0 | = | 2 0 2 |
      | 1 1 1 |   | 1 1 1 |     | 0 0 1 |   | 2 2 0 |

Next, the network is tested by the sequence of input vectors X1 and X2, which are equal to the output (or target) vectors Y1 and Y2, respectively.
Example Cont'd… Network is tested.

• First activate the network by applying input vector X. Then calculate the actual output vector Y, and finally compare the result with the initial input vector X.

Assume all thresholds to be zero for this example. Thus,

  Ym = sign(W Xm),   m = 1, 2, …, M

  Y1 = sign(W [1, 1, 1]^T) = [1, 1, 1]^T   and   Y2 = sign(W [−1, −1, −1]^T) = [−1, −1, −1]^T

Y1 = X1 and Y2 = X2, so both states, (1, 1, 1) and (−1, −1, −1), are said to be stable.
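The stability test can be reproduced numerically (a sketch; sign(0) is taken as +1, consistent with zero thresholds):

```python
import numpy as np

# Weight matrix from the example.
W = np.array([[0, 2, 2],
              [2, 0, 2],
              [2, 2, 0]])

def recall(W, x):
    """Y = sign(W X), with sign(0) taken as +1 (all thresholds are zero)."""
    return np.where(W @ x >= 0, 1, -1)
```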
Example Cont'd… Other Possible States

  Possible state | Iteration | Inputs x1 x2 x3 | Outputs y1 y2 y3 | Fundamental memory
  (1, 1, 1)      |     0     |   1   1   1     |   1   1   1      | (1, 1, 1)
  (−1, 1, 1)     |     0     |  −1   1   1     |   1   1   1      |
                 |     1     |   1   1   1     |   1   1   1      | (1, 1, 1)
  (1, −1, 1)     |     0     |   1  −1   1     |   1   1   1      |
                 |     1     |   1   1   1     |   1   1   1      | (1, 1, 1)
  (1, 1, −1)     |     0     |   1   1  −1     |   1   1   1      |
                 |     1     |   1   1   1     |   1   1   1      | (1, 1, 1)
  (−1, −1, −1)   |     0     |  −1  −1  −1     |  −1  −1  −1      | (−1, −1, −1)
  (−1, −1, 1)    |     0     |  −1  −1   1     |  −1  −1  −1      |
                 |     1     |  −1  −1  −1     |  −1  −1  −1      | (−1, −1, −1)
  (−1, 1, −1)    |     0     |  −1   1  −1     |  −1  −1  −1      |
                 |     1     |  −1  −1  −1     |  −1  −1  −1      | (−1, −1, −1)
  (1, −1, −1)    |     0     |   1  −1  −1     |  −1  −1  −1      |
                 |     1     |  −1  −1  −1     |  −1  −1  −1      | (−1, −1, −1)
Example Cont'd… Error compared to fundamental memory
• The fundamental memory (1,1,1) attracts unstable states (-1,1,1), (1,-1,1) and (1,1,-1).
• The fundamental memory (-1,-1,-1) attracts unstable states (-1,-1,1), (-1,1,-1) and (1,-1,-1).
Hopfield Network Training Algorithm

Step 1: Storage

The n-neuron Hopfield network is required to store a set of M fundamental memories, Y1, Y2, …, YM. The synaptic weight from neuron i to neuron j is calculated as

  wij = Σm=1..M ym,i ym,j  if i ≠ j;   wij = 0  if i = j

where ym,i and ym,j are the ith and jth elements of the fundamental memory Ym, respectively.
Hopfield Network Training Algorithm

In matrix form, the synaptic weights between neurons are represented as

  W = Σm=1..M Ym Ym^T − MI

The Hopfield network can store a set of fundamental memories if the weight matrix is symmetrical, with zeros in its main diagonal:

  W = |  0   w12  …  w1i  …  w1n |
      | w21   0   …  w2i  …  w2n |
      |  ⋮    ⋮       ⋮       ⋮  |
      | wi1  wi2  …   0   …  win |
      |  ⋮    ⋮       ⋮       ⋮  |
      | wn1  wn2  …  wni  …   0  |

where wij = wji. Once the weights are calculated, they remain fixed.
Hopfield Network Training Algorithm

Step 2: Testing

The network must recall any fundamental memory Ym when presented with it as an input:

  xm,i = ym,i,   i = 1, 2, …, n;  m = 1, 2, …, M

  ym,i = sign( Σj=1..n wij xm,j )

where ym,i is the ith element of the actual output vector Ym, and xm,j is the jth element of the input vector Xm. In matrix form,

  Xm = Ym,  m = 1, 2, …, M;   Ym = sign(W Xm)
Hopfield Network Training Algorithm

Step 3: Retrieval (If all fundamental memories are recalled perfectly, proceed to this step.)

Present an unknown n-dimensional vector (probe), X, to the network and retrieve a stable state. That is,

  X ≠ Ym,  m = 1, 2, …, M

a) Initialise the retrieval algorithm of the Hopfield network by setting

  xj(0) = xj,  j = 1, 2, …, n

and calculate the initial state for each neuron:

  yi(0) = sign( Σj=1..n wij xj(0) ),  i = 1, 2, …, n
Hopfield Network Training Algorithm

Step 3: Retrieval (continued)

where xj(0) is the jth element of the probe vector X at iteration p = 0, and yi(0) is the state of neuron i at iteration p = 0.

In matrix form, the state vector at iteration p = 0 is

  Y(0) = sign(W X(0))

b) Update the elements of the state vector, Y(p), according to the following rule:

  xi(p+1) = Σj=1..n wij yj(p)

  yi(p+1) = sign( xi(p+1) )
Hopfield Network Training Algorithm

Step 3: Retrieval (continued)

Neurons for updating are selected asynchronously, that is, randomly and one at a time.

Repeat the iteration until the state vector becomes unchanged; in other words, until a stable state is reached.

It can be proved that the Hopfield network will always converge to a stable state when the retrieval operation is performed asynchronously, if wij = wji and wii = 0.

A stable state (or fixed point) satisfies:

  yi(p+1) = sign( Σj=1..n wij yj(p) ) = yi(p)
Little Model of Hopfield Network

The Little model uses synchronous dynamics for retrieval (Little and Shaw, 1975):

  Y(p+1) = sign(W Y(p))

It can be proved that the Little model will always converge to a stable state or a limit cycle of length at most 2 if wij = wji.

It is very easy to implement using matrix manipulation, such as in Matlab.
Little Model of Hopfield Network

Example:

  W = | 0 3 1 1 |
      | 3 0 1 1 |
      | 1 1 0 1 |
      | 1 1 1 0 |

A stable state:

  Y(t) = [1, 1, 1, 1]^T → Y(t+1) = sign(W Y(t)) = [1, 1, 1, 1]^T = Y(t)

Converge to the stable state:

  Y(t) = [1, 1, −1, 1]^T → Y(t+1) = [1, 1, 1, 1]^T

Limit cycle of length 2:

  Y(t) = [1, −1, 1, 1]^T → Y(t+1) = [−1, 1, 1, 1]^T → Y(t+2) = [1, −1, 1, 1]^T = Y(t)
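The synchronous update is a one-liner in Python. This sketch uses a 4×4 weight matrix that is an assumed reading of the slide's example (symmetric, zero diagonal); with it, one state is stable, one converges, and one pair forms a 2-cycle:

```python
import numpy as np

# Assumed 4x4 example weight matrix (symmetric, zero diagonal).
W = np.array([[0, 3, 1, 1],
              [3, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]])

def little_step(W, y):
    """One synchronous Little-model update: Y(p+1) = sign(W Y(p)), sign(0) = +1."""
    return np.where(W @ y >= 0, 1, -1)
```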
Hopfield network as a model for associative memory

• Associative memory
  – Associates different features with each other:
    • Karen → green
    • George → red
    • Paul → blue
  – Recall with partial cues
Neural Network Model of associative memory
• Neurons are arranged like a grid:
Setting the weights

• Each pattern can be denoted by a vector of −1s or 1s:

  S^p = [s1^p, s2^p, s3^p, …, sN^p],  si^p ∈ {−1, 1}

• If the number of patterns is m, then:

  wij = Σp=1..m si^p sj^p

• Hebbian learning: the neurons that fire together, wire together.
Learning in Hopfield net
Summary

• Associative memory
• Discrete Hopfield neural networks
• Hebbian learning rule

Readings

• Picton's book:
• Haykin's book: pp. 289-308
• Blackboard readings