Professor John Hopfield - poseidon.csd.auth.gr

Professor John Hopfield The Howard A. Prior Professor of Molecular Biology Dept. of Molecular Biology Computational Neurobiology; Biophysics Princeton University


Page 1: Professor John Hopfield - poseidon.csd.auth.gr

Professor John Hopfield The Howard A. Prior Professor of Molecular Biology Dept. of Molecular Biology Computational Neurobiology; Biophysics Princeton University

Page 2: Professor John Hopfield - poseidon.csd.auth.gr

The physicist Hopfield showed that models of physical systems could be used to solve computational problems

Such systems could be implemented in hardware by combining standard components such as capacitors and resistors.

Page 3: Professor John Hopfield - poseidon.csd.auth.gr

The importance of Hopfield nets in practical applications is limited due to theoretical limitations of the structure, but in some cases they may form interesting models.

Usually employed in binary-logic tasks, e.g. pattern completion and association

Page 4: Professor John Hopfield - poseidon.csd.auth.gr

In the early 1980s, Hopfield published two scientific papers which attracted much interest.

(1982): "Neural networks and physical systems with emergent collective computational abilities". Proceedings of the National Academy of Sciences, 79:2554-2558.

(1984): "Neurons with graded response have collective computational properties like those of two-state neurons". Proceedings of the National Academy of Sciences, 81:3088-3092.

This was the starting point of the new era of neural networks, which continues today

Page 5: Professor John Hopfield - poseidon.csd.auth.gr

"The dynamics of brain computation"

The question: How is one to understand the incredible effectiveness of a brain in tasks such as recognizing a particular face in a complex scene?

Like all computers, a brain is a dynamical system that carries out its computations by the change of its 'state' with time.

Simple models of the dynamics of neural circuits are described that have collective dynamical properties. These can be exploited in recognizing sensory patterns.

Using these collective properties in processing information is effective in that it exploits the spontaneous properties of nerve cells and circuits to produce robust computation.

Page 6: Professor John Hopfield - poseidon.csd.auth.gr

J. Hopfield's quest

While the brain is totally unlike modern computers, much of what it does can be described as computation.

Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation.

His research focuses on understanding how the neural circuits of the brain produce such powerful and complex computations.

Page 7: Professor John Hopfield - poseidon.csd.auth.gr

Olfaction

The simplest problem in olfaction is identifying a known odor.

However, olfaction allows remote sensing, and much more complex computations involving wind direction and fluctuating mixtures of odors must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors.

We have been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals or the analogous circuits of simpler animals.

Page 8: Professor John Hopfield - poseidon.csd.auth.gr

Dynamical systems

Any computer does its computation by its changes in internal state.

In neurobiology, the change of the potentials of neurons (and changes in the strengths of the synapses) with time is what performs the computations.

Systems of differential equations can represent these aspects of neurobiology.

He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.

Page 9: Professor John Hopfield - poseidon.csd.auth.gr

Action potential computation

For much of neurobiology, information is represented by the paradigm of 'firing rates', i.e. information is represented by the rate of generation of action potential spikes, and the exact timing of these spikes is unimportant.

There are cases, for example the binaural auditory determination of the location of a sound source, where information is encoded in the timing of action potentials.

Since action potentials last only about a millisecond, the use of action potential timing seems a powerful potential means of neural computation.

Page 10: Professor John Hopfield - poseidon.csd.auth.gr

Speech

Identifying words in natural speech is a difficult computational task which brains can easily do.

His group uses this task as a test-bed for thinking about the computational abilities of neural networks and neuromorphic ideas

Page 11: Professor John Hopfield - poseidon.csd.auth.gr

Simple (e.g. binary-logic) neurons are coupled in a system with recurrent signal flow

A 2-neuron Hopfield network with continuous states, characterized by 2 stable states

Page 12: Professor John Hopfield - poseidon.csd.auth.gr

A 3-neuron Hopfield network with 2^3 = 8 states, characterized by 2 stable states

Page 13: Professor John Hopfield - poseidon.csd.auth.gr

Wij = Wji

The behavior of such a dynamical system is fully determined by the synaptic weights

And can be thought of as an Energy minimization process

Page 14: Professor John Hopfield - poseidon.csd.auth.gr

Hopfield nets are fully connected, symmetrically weighted networks that extended the ideas of linear associative memories by adding cyclic connections.

Note: no self-feedback !

Page 15: Professor John Hopfield - poseidon.csd.auth.gr

Operation of the network

After the 'teaching stage', in which the weights are defined, the initial state of the network is set (the input pattern) and a simple recurrent rule is iterated till convergence to a stable state (the output pattern)

There are two main modes of operation: synchronous vs. asynchronous updating

To train a Hopfield net as a content-addressable memory, the outer-product rule for storing patterns is used

Page 16: Professor John Hopfield - poseidon.csd.auth.gr

Hebbian Learning

Probe pattern

Dynamical evolution
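The storage and update equations on this page did not survive the text extraction, so here is a minimal NumPy sketch of what the three captions refer to: Hebbian (outer-product) storage of bipolar patterns, a probe pattern used as the initial state, and iterated threshold updates as the dynamical evolution. The 1/N weight normalisation and the choice of a synchronous update inside `recall` are common conventions, not something taken from these slides.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian (outer-product) storage: W = (1/N) sum_p x_p x_p^T, with zero diagonal."""
    X = np.asarray(patterns, dtype=float)      # shape (P, N), entries +1 / -1
    W = X.T @ X / X.shape[1]                   # 1/N scaling is a common choice
    np.fill_diagonal(W, 0.0)                   # no self-feedback
    return W

def recall(W, probe, n_iters=20):
    """Dynamical evolution: start from a probe pattern and iterate threshold updates."""
    y = np.asarray(probe, dtype=float).copy()
    for _ in range(n_iters):
        h = W @ y                              # total input to every neuron
        y_new = np.where(h > 0, 1.0, np.where(h < 0, -1.0, y))  # keep state when input is 0
        if np.array_equal(y_new, y):           # fixed point reached: a stable (stored) state
            break
        y = y_new
    return y
```

For instance, `recall(train_hebbian([[1, -1, 1], [-1, 1, -1]]), [1, 1, 1])` returns the stored pattern [1, -1, 1], anticipating the worked example on the next pages.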

Page 17: Professor John Hopfield - poseidon.csd.auth.gr

A Simple Example

Step_1. Design a network with memorized patterns (vectors) [1, -1, 1] & [-1, 1, -1]

Page 18: Professor John Hopfield - poseidon.csd.auth.gr

There are 8 different states the net can be in, and any of them can be used as its initial state

Step_2. Initialization

#1: y1   #2: y2   #3: y3

Page 19: Professor John Hopfield - poseidon.csd.auth.gr

3 different examples of the net’s flow

(converges immediately)

Schematic diagram of all the dynamical trajectories that correspond to the designed net.

Stored pattern

Step_3. Iterate till convergence - Synchronous Updating -

Page 20: Professor John Hopfield - poseidon.csd.auth.gr

Or, Step_3. Iterate till convergence - Asynchronous Updating -

Each time, select one neuron at random and update its state with the previous rule,

with the usual convention that if the total input to that neuron is 0, its state remains unchanged
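As a concrete check of Steps 1 to 3, here is a small sketch for this very example (the memorized patterns [1, -1, 1] and [-1, 1, -1]) using random asynchronous updates with the zero-input convention above; the number of update steps and the random seed are arbitrary choices.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Step 1: design the network for the memorized patterns [1,-1,1] and [-1,1,-1]
patterns = np.array([[1, -1, 1], [-1, 1, -1]], dtype=float)
W = patterns.T @ patterns / patterns.shape[1]   # outer-product rule, 1/N scaling (a choice)
np.fill_diagonal(W, 0.0)                        # no self-feedback

def evolve_async(W, probe, n_steps=50):
    """Step 3: iterate asynchronous updates until (practically) converged."""
    y = np.asarray(probe, dtype=float).copy()
    for _ in range(n_steps):
        i = rng.integers(len(y))                # each time, pick one neuron at random
        h = W[i] @ y                            # total input to that neuron
        if h != 0:                              # zero input: its state remains unchanged
            y[i] = 1.0 if h > 0 else -1.0
    return y

# Step 2: any of the 2^3 = 8 states can serve as the initial state;
# every one of them flows to one of the two stored patterns.
for state in product([-1, 1], repeat=3):
    print(state, '->', evolve_async(W, np.array(state)))
```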

Page 21: Professor John Hopfield - poseidon.csd.auth.gr

Explanation of the convergence

There is an energy function associated with each state of the Hopfield network

E([y1, y2, …, yn]T) = -Σ Σ wij yi yj, where [y1, y2, …, yn]T is the vector of the neurons' outputs, wij is the weight from neuron j to neuron i, and the double sum is over i and j.

The corresponding dynamical system evolves toward states of lower energy
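A short sketch that evaluates exactly this energy (E = -Σ Σ wij yi yj, with the double sum over i and j) for the three-neuron example above and confirms that it never increases along an asynchronous trajectory; the particular initial state and random seed are arbitrary choices.

```python
import numpy as np

def energy(W, y):
    """E(y) = - sum_i sum_j wij yi yj, as defined on the slide (W symmetric, zero diagonal)."""
    return -float(y @ W @ y)

# The 3-neuron example: stored patterns [1,-1,1] and [-1,1,-1]
patterns = np.array([[1, -1, 1], [-1, 1, -1]], dtype=float)
W = patterns.T @ patterns / 3.0
np.fill_diagonal(W, 0.0)

rng = np.random.default_rng(1)
y = np.array([1.0, 1.0, 1.0])           # an arbitrary initial state
E_trace = [energy(W, y)]
for _ in range(30):                      # asynchronous updates
    i = rng.integers(3)
    h = W[i] @ y
    if h != 0:                           # zero input: state unchanged
        y[i] = 1.0 if h > 0 else -1.0
    E_trace.append(energy(W, y))

# The energy never increases along the trajectory
assert all(b <= a + 1e-9 for a, b in zip(E_trace, E_trace[1:]))
print(E_trace[0], '->', E_trace[-1], 'final state:', y)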

Page 22: Professor John Hopfield - poseidon.csd.auth.gr

States of lowest energy correspond to attractors of Hopfield-net dynamics

Page 23: Professor John Hopfield - poseidon.csd.auth.gr

Capacity of the Hopfield memory

In short, while training the net (via the outer-product rule) we are storing patterns by placing different attractors in the state-space of the system.

While operating, the net searches for the closest attractor.

When this is found, the corresponding activation pattern is given as output

Question: How many patterns can we store in a Hopfield net?

Answer: about 0.15 N, where N is the number of neurons
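One hedged way to see where a figure of roughly 0.15 N comes from is a small numerical experiment: store a growing number of random bipolar patterns and count how many remain exact fixed points of the update rule. The network size N = 100 and the tested pattern counts are arbitrary experimental choices, and the exact-fixed-point criterion used here is stricter than the small-error retrieval criterion behind the usual capacity figure.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100                                    # number of neurons (an arbitrary experimental choice)

def fraction_stable(P):
    """Store P random bipolar patterns and return the fraction that are exact fixed points."""
    X = rng.choice([-1.0, 1.0], size=(P, N))
    W = X.T @ X / N                        # outer-product rule
    np.fill_diagonal(W, 0.0)
    H = X @ W                              # total input to every neuron, for every stored pattern
    return float(np.mean(np.all(np.sign(H) == X, axis=1)))

# Recall quality degrades sharply once the load approaches roughly 0.15 * N patterns
for P in (5, 10, 15, 20, 30):
    print(f"{P:3d} patterns stored -> fraction still stable: {fraction_stable(P):.2f}")
```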

Page 24: Professor John Hopfield - poseidon.csd.auth.gr

A simple Pattern Recognition Example

Computer Experimentation

Class-project

Page 25: Professor John Hopfield - poseidon.csd.auth.gr

Stored Patterns (binary images)

Page 26: Professor John Hopfield - poseidon.csd.auth.gr

Perfect Recall - Image Restoration

Erroneous Recall

Page 27: Professor John Hopfield - poseidon.csd.auth.gr

Irrelevant results

Note: explain the 'negatives' ...

Page 28: Professor John Hopfield - poseidon.csd.auth.gr
Page 29: Professor John Hopfield - poseidon.csd.auth.gr

The continuous Hopfield-Net as optimization machinery

[Tank and Hopfield; IEEE Trans. Circuits Syst. 1986; 33:533-541]: 'Simple "Neural" Optimization Networks: An A/D Converter, Signal Decision Circuit, and a Linear Programming Circuit'.

Hopfield modified his network so as to work with continuous activations and, by adopting a dynamical-systems approach, showed that the resulting system is characterized by a Lyapunov function, which he termed the 'Computational Energy' and which can be used to tailor the net for specific optimizations

Page 30: Professor John Hopfield - poseidon.csd.auth.gr

The system of coupled differential equations describing the operation of the continuous Hopfield net:

dui/dt = -ui + Σj Tij g(uj) + Ii ,   i = 1, …, n   (sum over j = 1…n)

g(u) = (1/2) (1 + tanh(gain · u))

E = -(1/2) Σi Σj Tij g(ui) g(uj) - Σi Ii g(ui) ,   with Tij = Tji and Tii = 0

Writing Vi = g(ui), this becomes

E = -(1/2) Σi Σj Tij Vi Vj - Σi Ii Vi   (all sums over 1…n)

The Computational Energy
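A minimal forward-Euler sketch of these equations on a small random network; the size, gain, step size and random seed are illustrative choices, not from the slides. With a reasonably high gain the printed computational energy settles at a local minimum as the outputs converge, illustrating the Lyapunov role described above.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6                                        # network size (illustrative)
T = rng.normal(size=(n, n))
T = (T + T.T) / 2.0                          # Tij = Tji
np.fill_diagonal(T, 0.0)                     # Tii = 0
I = rng.normal(size=n)
gain = 5.0                                   # a fairly high gain (illustrative)

def g(u):
    """Activation g(u) = 1/2 (1 + tanh(gain * u))."""
    return 0.5 * (1.0 + np.tanh(gain * u))

def computational_energy(u):
    """E = -1/2 sum_ij Tij Vi Vj - sum_i Ii Vi, with Vi = g(ui)."""
    V = g(u)
    return float(-0.5 * V @ T @ V - I @ V)

u = 0.1 * rng.normal(size=n)                 # initial internal states
dt = 0.01                                    # Euler step (illustrative)
for step in range(2001):
    if step % 500 == 0:
        print(f"step {step:4d}  E = {computational_energy(u):.4f}")
    u = u + dt * (-u + T @ g(u) + I)         # dui/dt = -ui + sum_j Tij g(uj) + Ii

print("final outputs V =", np.round(g(u), 3))
```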

Page 31: Professor John Hopfield - poseidon.csd.auth.gr

When Hopfield nets are used for function optimization, the objective function F to be minimized is written as an energy function in the form of the computational energy E.

The comparison between E and F leads to the design, i.e. the definition of links and biases, of the network that can solve the problem.

Page 32: Professor John Hopfield - poseidon.csd.auth.gr

The actual advantage of doing this is that the Hopfield net has a direct hardware implementation, which even enables a VLSI integration of the algorithm performing the optimization task

Page 33: Professor John Hopfield - poseidon.csd.auth.gr

An example: ‘Dominant-Mode Clustering’

Given a set of N vectors {Xi}, find the k among them that form the most compact cluster {Zi}

F({ui}) = Σi Σj ui uj ||Xi - Xj||²   (double sum over i, j = 1…N)

with ui = 1 if Xi ∈ {Z}, ui = 0 if Xi ∉ {Z}, and Σi ui = k

The objective function F can be written easily in the form of computational energy E

Page 34: Professor John Hopfield - poseidon.csd.auth.gr

F({ui}) = Σi Σj ui uj ||Xi - Xj||²

E = -(1/2) Σi Σj Tij Vi Vj - Σi Ii Vi

Comparing F with E (with Vi = ui) gives the network design:

Tij = Tji = -2 ||Xi - Xj||²   and   Ii = 0

With each pattern Xi we associate a neuron in the Hopfield network (i.e. #neurons = N).

The synaptic weights are proportional to the pairwise squared distances (Tij = -2 ||Xi - Xj||²)

If a neuron's activation is '1' when the net converges, the corresponding pattern is included in the cluster.
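A hedged sketch of this design recipe on a small made-up data set: build Tij = -2 ||Xi - Xj||² and Ii = 0, and check that for a binary assignment u the computational energy E coincides with the clustering objective F. How the cardinality constraint Σ ui = k is enforced during the network's evolution is not detailed on these slides, so the sketch simply scans all size-k subsets to find the most compact cluster.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
N, k = 8, 3                                   # small illustrative problem
X = rng.normal(size=(N, 2))                   # the data vectors {Xi}

# Pairwise squared distances D[i, j] = ||Xi - Xj||^2
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

# Network design obtained by comparing F with E:  Tij = Tji = -2 D[i, j],  Ii = 0
T = -2.0 * D
I = np.zeros(N)

def F(u):                                     # clustering objective F({ui})
    return float(u @ D @ u)

def E(V):                                     # computational energy of the designed net
    return float(-0.5 * V @ T @ V - I @ V)

# For every subset of size k the two quantities coincide, so minimizing E solves the problem
best = min(combinations(range(N), k),
           key=lambda idx: F(np.isin(np.arange(N), idx).astype(float)))
u_best = np.isin(np.arange(N), best).astype(float)
assert abs(F(u_best) - E(u_best)) < 1e-9
print("most compact cluster:", best, " F = E =", round(F(u_best), 4))
```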

Page 35: Professor John Hopfield - poseidon.csd.auth.gr

A classical example: 'The Travelling Salesman Problem'

Page 36: Professor John Hopfield - poseidon.csd.auth.gr

The principle

Coding a possible route as a combination of neurons’ firings

Route 5 → 3 → 4 → 1 → 2 → 5, with cost |5-3| + |3-4| + |4-1| + |1-2| + |2-5|
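A tiny illustration of this coding, assuming the usual city-by-position grid of binary neurons and a made-up set of 5 city locations; |a-b| here denotes the distance between cities a and b, as on the slide.

```python
import numpy as np

rng = np.random.default_rng(5)
n_cities = 5
cities = rng.uniform(size=(n_cities, 2))      # made-up city coordinates, labelled 1..5
route = [5, 3, 4, 1, 2]                       # visiting order; the tour returns to city 5

# Coding the route as neuron firings: V[city, position] = 1 if the city is visited
# at that position of the tour, 0 otherwise (an n x n grid of binary neurons).
V = np.zeros((n_cities, n_cities))
for pos, city in enumerate(route):
    V[city - 1, pos] = 1.0

def dist(a, b):
    """|a-b| : distance between cities a and b (1-based labels, as on the slide)."""
    return float(np.linalg.norm(cities[a - 1] - cities[b - 1]))

closed = route + [route[0]]                   # 5 -> 3 -> 4 -> 1 -> 2 -> 5
tour_length = sum(dist(a, b) for a, b in zip(closed, closed[1:]))
print(V.astype(int))
print("tour length |5-3| + |3-4| + |4-1| + |1-2| + |2-5| =", round(tour_length, 3))
```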

Page 37: Professor John Hopfield - poseidon.csd.auth.gr

An example from clinical Encephalography

The problem : The idea :

Page 38: Professor John Hopfield - poseidon.csd.auth.gr

'Hopfield Neural Nets for monitoring Evoked Potential Signals'

[ Electroenc. Clin. Neuroph. 1997;104(2) ]

The solution:

N. Laskaris et al.

Page 39: Professor John Hopfield - poseidon.csd.auth.gr

The Boltzmann Machine

Improving Hopfield nets by simulated annealing and adopting more complex topologies

Page 40: Professor John Hopfield - poseidon.csd.auth.gr

(430-355 BC)

'Let me close here, then . . .

. . . someone else, perhaps, will complete what I was not able to finish'

- Themistogenes of Syracuse, 1st year of the 105th Olympiad

ΕΛΛΗΝΙΚΑ (Hellenica)

Page 41: Professor John Hopfield - poseidon.csd.auth.gr

[Timeline: 1979-1982, Hopfield nets, PNAS (1982)]

Page 43: Professor John Hopfield - poseidon.csd.auth.gr

'The kids in the stands are your only hope ....'