

Neural Network

Hopfield model

Kim, Il Joong

Contents

1. Neural network: Introduction
   ① Definition & Application
   ② Network architectures
   ③ Learning processes (Training)

2. Hopfield model
   ① Summary of model
   ② Example
   ③ Limitations

3. Hopfield pattern recognition on a scale-free neural network

Definition of Neural Network

A massively parallel system made up of simple processing units and dense interconnections, which has a natural propensity for storing experiential knowledge and making it available for use.

Interconnection strengths, known as synaptic weights, are used to store the acquired knowledge.

=> Learning process.

Application of Neural Network

Pattern-to-pattern mapping, pattern completion, pattern classification

Image Analysis
Speech Analysis & Generation
Financial Analysis
Diagnosis
Automated Control

Network architectures Single-layer feedforward network

Network architectures Multilayer feedforward network

Network architectures Recurrent network

Learning processes (training)

Error-correction learning
Memory-based learning
Hebbian learning
Competitive learning
Boltzmann learning

Hebbian learning process

If two neurons on either side of a synaptic connection are activated simultaneously, then the strength of that synapse is increased.

If two neurons on either side of a synapse are activated asynchronously, then the strength of that synapse is weakened or eliminated.
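As a minimal sketch of the rule above (the function name and the learning rate `eta` are illustrative assumptions, not from the slides), the weight change is proportional to the product of the two activities:

```python
# A toy version of the Hebbian rule above; `eta` (learning rate) and the
# function name are illustrative assumptions, not from the slides.

def hebbian_update(w, x_pre, x_post, eta=0.1):
    """Return the updated synaptic weight for one pre/post activity pair."""
    return w + eta * x_pre * x_post

w = 0.0
w = hebbian_update(w, +1, +1)   # simultaneous activation: weight strengthened
assert w > 0
w = hebbian_update(w, +1, -1)   # opposite (asynchronous) activation: weakened
assert w == 0.0
```

With ±1 activities the product is +1 for simultaneous activation and -1 for asynchronous activation, matching the two cases on the slide.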

Hopfield model

Network architecture

N processing units (binary)
Fully connected: N(N-1) connections
Single layer (no hidden layer)
Recurrent (feedback) network: no self-feedback loop

Hopfield model Learning process

Let \(\{\xi_1, \xi_2, \xi_3, \ldots, \xi_M\}\) denote a known set of N-dimensional memories. The synaptic weights are set by the outer-product (Hebbian) rule

\[ W = \frac{1}{N} \sum_{\mu=1}^{M} \xi_\mu \xi_\mu^{T} - \frac{M}{N} I \]

where the \(-\frac{M}{N} I\) term sets the diagonal to zero (no self-feedback).
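In NumPy this outer-product learning rule can be sketched as follows (the helper name is ours; memories are rows of ±1 values):

```python
import numpy as np

def hopfield_weights(memories):
    """W = (1/N) * sum_mu xi_mu xi_mu^T - (M/N) * I  for +/-1 patterns.

    `memories` is an (M, N) array; the -(M/N) I term removes self-feedback,
    so the diagonal of W is zero.
    """
    memories = np.asarray(memories, dtype=float)
    M, N = memories.shape
    return memories.T @ memories / N - (M / N) * np.eye(N)

W = hopfield_weights([[1, -1, 1], [-1, 1, -1]])
assert np.allclose(W, W.T) and np.allclose(np.diag(W), 0)
```

The matrix product `memories.T @ memories` sums the M outer products in one step, so no explicit loop over memories is needed.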

Hopfield model Inputting and updating

Let \(\xi_{\text{probe}}\) denote an unknown N-dimensional input vector, and initialize the state to \(x(0) = \xi_{\text{probe}}\).

Update the units asynchronously (i.e., randomly and one at a time) according to the rule

\[ x_j(n+1) = \operatorname{sgn}\Big( \sum_i w_{ji} \, x_i(n) \Big). \]
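The asynchronous updating described above can be sketched as follows (a sketch assuming the standard sign rule; on a zero local field the unit keeps its current value, and all names are ours):

```python
import numpy as np

def async_update(W, x, rng):
    """One sweep of asynchronous updates: visit the units one at a time, in
    random order, setting x_j = sgn(sum_i w_ji x_i); on a zero field, keep x_j."""
    x = x.copy()
    for j in rng.permutation(len(x)):
        h = W[j] @ x                      # local field on unit j
        if h != 0:
            x[j] = 1.0 if h > 0 else -1.0
    return x

def recall(W, probe, seed=0, max_sweeps=100):
    """Repeat sweeps until the state vector no longer changes (a fixed point)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(probe, dtype=float)
    for _ in range(max_sweeps):
        x_new = async_update(W, x, rng)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Weights for the memories (1, -1, 1) and (-1, 1, -1), via the learning rule.
xi = np.array([[1, -1, 1], [-1, 1, -1]], dtype=float)
W = xi.T @ xi / 3 - (2 / 3) * np.eye(3)

x_fixed = recall(W, [1, 1, 1])   # converges to the stored memory (1, -1, 1)
```

The probe (1, 1, 1) overlaps more with (1, -1, 1) than with (-1, 1, -1), so the dynamics settles into that stored memory.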

Hopfield model Convergence and Outputting

Repeat updating until the state vector remains unchanged, and let \(x_{\text{fixed}}\) denote the resulting fixed point (stable state). The output is \(y = x_{\text{fixed}}\) (the associated memory).

The memory vectors \(\{\xi_1, \xi_2, \xi_3, \ldots, \xi_M\}\) are states that correspond to minima of the energy

\[ E = -\frac{1}{2} \sum_{j} \sum_{i \neq j} w_{ji} \, x_i x_j \]

and each asynchronous update can only lower it:

\[ \Delta E = E(n+1) - E(n) = -\frac{1}{2} \, \Delta x_j \sum_{i \neq j} w_{ji} \, x_i \leq 0. \]

Any input vector converges to the stored memory vector that is most similar or most accessible to the input.
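The claim that asynchronous updates never raise the energy can be checked numerically (a sketch with illustrative sizes: random ±1 memories and a random probe, weights built from the learning rule above):

```python
import numpy as np

def energy(W, x):
    """Hopfield energy E = -(1/2) * sum over i != j of w_ji x_i x_j (diag(W) = 0)."""
    return -0.5 * x @ W @ x

rng = np.random.default_rng(0)
xi = rng.choice([-1.0, 1.0], size=(3, 20))   # M = 3 random +/-1 memories, N = 20
M, N = xi.shape
W = xi.T @ xi / N - (M / N) * np.eye(N)       # learning rule from the slides

x = rng.choice([-1.0, 1.0], size=N)           # random probe state
energies = [energy(W, x)]
for _ in range(200):                          # single-unit asynchronous updates
    j = rng.integers(N)
    h = W[j] @ x
    if h != 0:
        x[j] = 1.0 if h > 0 else -1.0
    energies.append(energy(W, x))

# E is non-increasing along the update trajectory
assert all(b <= a + 1e-9 for a, b in zip(energies, energies[1:]))
```

Because W is symmetric with a zero diagonal, each flip to the sign of the local field can only decrease E, which is why the dynamics must stop at a fixed point.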

Hopfield model N=3 example

Let (1, -1, 1) and (-1, 1, -1) denote the stored memories (M = 2). The learning rule gives

\[ W = \frac{1}{3} \begin{pmatrix} 0 & -2 & 2 \\ -2 & 0 & -2 \\ 2 & -2 & 0 \end{pmatrix} \]
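This example can be verified directly (a short check; variable names are ours):

```python
import numpy as np

# Reconstruct the N = 3 example: memories (1, -1, 1) and (-1, 1, -1), M = 2.
xi = np.array([[1, -1, 1], [-1, 1, -1]], dtype=float)
M, N = xi.shape
W = xi.T @ xi / N - (M / N) * np.eye(N)

# Matches the matrix on the slide: W = (1/3) * [[0, -2, 2], [-2, 0, -2], [2, -2, 0]]
expected = np.array([[0, -2, 2], [-2, 0, -2], [2, -2, 0]]) / 3
assert np.allclose(W, expected)

# Both stored memories are fixed points of the sign update.
for pattern in xi:
    assert np.array_equal(np.sign(W @ pattern), pattern)
```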

Limitations of Hopfield model

① The stored memories are not always stable.

② There may be stable states that are not stored memories (spurious states).

The signal-to-noise ratio is

\[ \rho = \frac{N}{M} \quad \text{for large } M. \]

The quality of memory recall breaks down at \(M = 0.14N\).

Limitations of Hopfield model

③ The stable state reached may not be the state most similar to the input state.

On a scale-free neural network Network architecture: the BA (Barabási–Albert) scale-free network

A small core of m nodes (fully connected); N (≫ m) nodes are then added.
Total: N + m processing units and ≈ Nm connections (for 1 ≪ m ≪ N).
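The BA construction described above (fully connected core of m nodes, then N nodes attached by preferential attachment) can be illustrated as follows; the function name and the stub-list implementation are our assumptions, not from the slides:

```python
import random

def ba_network(N, m, seed=0):
    """Grow a BA scale-free graph: a fully connected core of m nodes, then N
    new nodes, each attached to m existing nodes by preferential attachment."""
    random.seed(seed)
    edges = {(i, j) for i in range(m) for j in range(i + 1, m)}  # full core
    # Each node appears in `stubs` once per incident edge, so a uniform draw
    # from `stubs` selects a target with probability proportional to degree.
    stubs = [v for edge in edges for v in edge]
    for new in range(m, m + N):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(stubs))
        edges.update((t, new) for t in targets)
        stubs.extend([new] * m)
        stubs.extend(targets)
    return edges

g = ba_network(N=100, m=3)
# m*(m-1)/2 core edges plus m edges per added node: 3 + 100*3 = 303
assert len(g) == 303
```

The edge count ≈ Nm for N ≫ m matches the total on the slide, versus N(N-1)/2 pairs in the fully connected case.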

On a scale-free neural network Hopfield pattern recognition

Stored P different patterns \(\xi^{i}\) \((i = 1, 2, \ldots, P)\), with components \(\xi^{i}_{j} = \pm 1\).
Input pattern: a 10% reversal of \(\xi^{i}\) (overlap \(= 0.8\)).
Output pattern: \(S\).
The quality of recognition is the overlap

\[ \psi_i = \frac{1}{N} \sum_{j} \xi^{i}_{j} \, S_{j}. \]
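The overlap measure can be sketched as follows (illustrative code; a 10% reversal of a ±1 pattern gives overlap 0.9 - 0.1 = 0.8, as on the slide):

```python
import numpy as np

def overlap(xi, S):
    """Quality of recognition: psi = (1/N) * sum_j xi_j * S_j."""
    return float(np.mean(xi * S))

rng = np.random.default_rng(0)
N = 1000
xi = rng.choice([-1.0, 1.0], size=N)       # a stored pattern
S = xi.copy()
S[: N // 10] *= -1                          # reverse 10% of the components
assert abs(overlap(xi, S) - 0.8) < 1e-9     # 0.9 - 0.1 = 0.8
```

An overlap of 1 means perfect recall, 0 means an uncorrelated state, and -1 the inverted pattern.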

On a scale-free neural network Small m: N = 10000, m = 2, 3, 5 (figure)

On a scale-free neural network Large m: N + m = 10000, P = 10, 100, 1000 (figure)

On a scale-free neural network Comparison with a fully connected network (m = N)

For small m: low quality of recognition.
For 1 ≪ m ≪ N: good quality of recognition, with a gain of a factor N/m ≫ 1 in computer memory and time, at the cost of only a gradual decrease in the quality of recognition.

References

D. Stauffer et al., http://xxx.lanl.gov/abs/cond-mat/0212601 (2002)

A. S. Mikhailov, Foundations of Synergetics 1, Springer-Verlag, Berlin Heidelberg (1990)

J. Hertz et al., Introduction to the Theory of Neural Computation, Addison-Wesley (1991)

J. E. Dayhoff, Neural Network Architectures, Van Nostrand Reinhold (1990)

S. Haykin, Neural Networks, Prentice-Hall (1999)
