Synaptic Dynamics: Unsupervised Learning


Page 1: Synaptic Dynamics: Unsupervised Learning

Synaptic Dynamics: Unsupervised Learning

Part Ⅱ

Wang Xiumei

Page 2: Synaptic Dynamics: Unsupervised Learning

1. Stochastic unsupervised learning and stochastic equilibrium;
2. Signal Hebbian learning;
3. Competitive learning.

Page 3: Synaptic Dynamics: Unsupervised Learning

1. Stochastic unsupervised learning and stochastic equilibrium

⑴ The noisy random unsupervised learning law;
⑵ Stochastic equilibrium;
⑶ The random competitive learning law;
⑷ The learning vector quantization system.

Page 4: Synaptic Dynamics: Unsupervised Learning

The noisy random unsupervised learning law

The random-signal Hebbian learning law:

$$dm_{ij} = -m_{ij}\,dt + S_i(x_i)\,S_j(y_j)\,dt + dB_{ij} \tag{4-92}$$

$\{B_{ij}(t)\}$ denotes a Brownian-motion diffusion process; each term in (4-92) denotes a separate random process.

Page 5: Synaptic Dynamics: Unsupervised Learning

The noisy random unsupervised learning law

Using the noise relationship $dB_{ij} = n_{ij}\,dt$, we can rewrite (4-92) as

$$\dot{m}_{ij} = -m_{ij} + S_i(x_i)\,S_j(y_j) + n_{ij} \tag{4-93}$$

We assume $\{n_{ij}(t)\}$ is a zero-mean, Gaussian white-noise process, and use the abbreviation

$$f_{ij}(x, y, M) = -m_{ij} + S_i(x_i)\,S_j(y_j)$$

Page 6: Synaptic Dynamics: Unsupervised Learning

The noisy random unsupervised learning law

We then get the noisy random unsupervised learning law

$$\dot{m}_{ij} = f_{ij}(X, Y, M) + n_{ij} \tag{4-94}$$

Lemma:

$$E\!\left[\dot{m}_{ij}^2\right] \geq \sigma_{ij}^2 > 0 \tag{4-95}$$

where $\sigma_{ij}^2$ is the finite variance of the noise process. Proof: P132.
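To make (4-94) concrete, here is a minimal Euler-discretized simulation sketch, assuming a logistic signal function, Gaussian driving noise, and a simple feedforward activation; the field sizes, constants, and names are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def S(v):
    # logistic signal function (an assumed choice)
    return 1.0 / (1.0 + np.exp(-v))

n_x, n_y = 4, 3                      # neurons in fields F_X and F_Y
M = rng.normal(size=(n_x, n_y))      # synaptic matrix of m_ij values
dt, sigma = 0.01, 0.05               # Euler step and noise scale

for _ in range(5000):
    x = rng.normal(size=n_x)                 # stationary input process
    y = S(x) @ M                             # simple feedforward activation
    dB = sigma * np.sqrt(dt) * rng.normal(size=M.shape)
    # Euler step of (4-92): dm_ij = (-m_ij + S_i(x_i) S_j(y_j)) dt + dB_ij
    M += (-M + np.outer(S(x), S(y))) * dt + dB

# At equilibrium M keeps jittering: per the lemma (4-95), E[m_ij'^2]
# stays at least as large as the driving noise variance.
print(M.round(2))
```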

Page 7: Synaptic Dynamics: Unsupervised Learning

The noisy random unsupervised learning law

The lemma implies two points:
1. Stochastic synapses vibrate in equilibrium, and they vibrate at least as much as the driving noise process vibrates;
2. The synaptic vector $m_j$ changes or vibrates at every instant $t$, while $E[m_j]$ equals a constant value: $m_j$ wanders in a Brownian motion about the constant value $E[m_j]$.

Page 8: Synaptic Dynamics: Unsupervised Learning

Stochastic equilibrium

When the synaptic vector $m_j$ stops moving, synaptic equilibrium occurs in "steady state":

$$\dot{m}_j = 0 \tag{4-101}$$

The synaptic vector $m_j$ reaches stochastic equilibrium when only the random noise vector $n_j$ changes it:

$$\dot{m}_j = n_j \tag{4-103}$$

Page 9: Synaptic Dynamics: Unsupervised Learning

The random competitive learning law

The random competitive learning law:

$$\dot{m}_j = S_j(y_j)\,[S(x) - m_j] + n_j$$

The random linear competitive learning law:

$$\dot{m}_j = S_j(y_j)\,[x - m_j] + n_j$$
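A minimal sketch of the random linear competitive learning law, assuming the winner-take-all signal $S_j$ is implemented as a nearest-synaptic-vector indicator; the constants and sampled distribution are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 2, 3                       # input dimension, number of competing neurons
m = rng.normal(size=(p, n))       # synaptic vectors m_j (rows)
c, sigma = 0.05, 0.01             # learning rate and noise scale

for _ in range(2000):
    x = rng.normal(size=n) + np.array([2.0, 0.0])   # sample pattern
    j = np.argmin(((x - m) ** 2).sum(axis=1))       # winner: S_j = 1, others 0
    # random linear competitive law: m_j' = S_j(y_j) [x - m_j] + n_j
    m[j] += c * (x - m[j]) + sigma * rng.normal(size=n)

print(m.round(2))   # winning rows drift toward the sampled cluster
```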

Page 10: Synaptic Dynamics: Unsupervised Learning

The learning vector quantization system

The learning vector quantization (LVQ) update equations:

$$m_j(k+1) = m_j(k) + c_k\,[x_k - m_j(k)] \quad \text{if } x_k \in D_j$$

$$m_j(k+1) = m_j(k) - c_k\,[x_k - m_j(k)] \quad \text{if } x_k \notin D_j$$

$$m_i(k+1) = m_i(k) \quad \text{if } i \neq j$$

where $m_j$ is the winning synaptic vector: the winner is rewarded (moved toward $x_k$) when the sample belongs to its class region $D_j$ and punished (moved away) when it does not. A minimal one-step sketch follows below.
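A sketch of one LVQ step under the equations above; the helper name `lvq_step` and the class-label bookkeeping are hypothetical:

```python
import numpy as np

def lvq_step(m, x, true_class, classes, c_k):
    """One LVQ update. m: (p, n) synaptic vectors; classes[j] is the
    class label assigned to neuron j; c_k is the learning coefficient."""
    j = np.argmin(((x - m) ** 2).sum(axis=1))   # metrical winner
    if classes[j] == true_class:
        m[j] += c_k * (x - m[j])                # reward: move toward x
    else:
        m[j] -= c_k * (x - m[j])                # punish: move away from x
    return m                                    # losers i != j unchanged
```

Losing neurons stay unchanged, matching the third equation above.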

Page 11: Synaptic Dynamics: Unsupervised Learning

The self-organizing map system

The self-organizing map system equations:

$$m_j(k+1) = m_j(k) + C_k\,[x_k - m_j(k)]$$

$$m_i(k+1) = m_i(k) \quad \text{if } i \neq j$$

Page 12: Synaptic Dynamics: Unsupervised Learning

The self-organizing map system

The self-organizing map is an unsupervised clustering algorithm. Compared with traditional clustering algorithms, its centroids can be mapped onto a curve or a plane, and it preserves topological structure.
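A minimal SOM sketch illustrating the topology-preservation claim; the one-dimensional map, the neighborhood of radius 1, and the exponentially decaying rate $C_k$ are assumed choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 20, 2                         # 1-D map of p units over 2-D inputs
m = rng.uniform(size=(p, n))

for k in range(4000):
    x = rng.uniform(size=n)
    j = np.argmin(((x - m) ** 2).sum(axis=1))    # best-matching unit
    C_k = 0.1 * np.exp(-k / 4000)                # decaying learning rate
    for i in range(max(0, j - 1), min(p, j + 2)):
        m[i] += C_k * (x - m[i])                 # winner and map neighbors

# Neighboring units end up with neighboring centroids, so the centroids
# trace a curve through the input space: the topology is preserved.
```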

Page 13: Synaptic Dynamics: Unsupervised Learning

2. Signal Hebbian Learning

⑴ Recency effects and forgetting;
⑵ Asymptotic correlation encoding;
⑶ Hebbian correlation decoding.

Page 14: Synaptic Dynamics: Unsupervised Learning

Signal Hebbian Learning

The deterministic first-order signal Hebbian learning law:

$$\dot{m}_{ij} = -m_{ij}(t) + S_i(x_i(t))\,S_j(y_j(t)) \tag{4-132}$$

with solution

$$m_{ij}(t) = m_{ij}(0)\,e^{-t} + \int_0^t S_i(s)\,S_j(s)\,e^{s-t}\,ds \tag{4-133}$$
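A quick numerical check of (4-133) in the constant-signal case, where the integral reduces to $m(t) = m(0)e^{-t} + s(1 - e^{-t})$ with $s = S_i S_j$; the constants are illustrative:

```python
import numpy as np

m0, s, dt, T = 0.5, 0.9, 1e-3, 5.0
m = m0
for _ in range(int(T / dt)):
    m += (-m + s) * dt                          # Euler step of (4-132)

closed_form = m0 * np.exp(-T) + s * (1 - np.exp(-T))
print(m, closed_form)                           # agree to ~1e-3
```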

Page 15: Synaptic Dynamics: Unsupervised Learning

Recency effects and forgetting

Hebbian synapses learn an exponentially weighted average of sampled patterns. The forgetting term is $-m_{ij}$. The simplest local unsupervised learning law keeps only this term:

$$\dot{m}_{ij} = -m_{ij}$$

Page 16: Synaptic Dynamics: Unsupervised Learning

Asymptotic correlation encoding

The synaptic matrix $M$ of long-term memory traces $m_{ij}$ asymptotically approaches the bipolar correlation matrix $X_k^T Y_k$:

$$M \approx X_k^T Y_k$$

where $X_k$ and $Y_k$ denote the bipolar signal vectors $S(x)$ and $S(y)$.

Page 17: Synaptic Dynamics: Unsupervised Learning

Asymptotic correlation encoding

In practice we use a diagonal fading-memory exponential matrix $W$ that compensates for the inherent exponential decay of learned information:

$$X^T W Y = \sum_{k=1}^{m} w_k\,X_k^T Y_k \tag{4-142}$$
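A sketch of the weighted encoding (4-142); the exponential form of the fading-memory weights is an assumed illustration:

```python
import numpy as np

def weighted_hebbian(X, Y, w):
    """Weighted outer-product encoding (4-142).
    X: (m, n) bipolar rows X_k; Y: (m, p) bipolar rows Y_k;
    w: (m,) fading-memory weights. Returns X^T W Y = sum_k w_k X_k^T Y_k."""
    return X.T @ (w[:, None] * Y)

m = 5
w = np.exp(np.arange(1, m + 1) - m)   # w_k = e^{k-m}: recent pairs weigh more
```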

Page 18: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

First we consider the bipolar correlation encoding of the $m$ bipolar associations $(X_i, Y_i)$, and turn the bipolar associations $(X_i, Y_i)$ into binary vector associations $(A_i, B_i)$ by replacing $-1$s with $0$s.

Page 19: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

The Hebbian encoding of the bipolar associations

$$M = \sum_{i=1}^{m} X_i^T Y_i \tag{4-143}$$

corresponds to the weighted Hebbian encoding scheme (4-142) if the weight matrix $W$ equals the $m \times m$ identity matrix $I$.

Page 20: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

We use the Hebbian synaptic matrix $M$ for bidirectional processing of $F_X$ and $F_Y$ neuronal signals: we pass neural signals through $M$ in the forward direction and through $M^T$ in the backward direction.

Page 21: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

Signal-noise decomposition:

$$X_i M = X_i X_i^T Y_i + \sum_{j \neq i}^{m} X_i X_j^T Y_j = n\,Y_i + \sum_{j \neq i}^{m} c_{ij}\,Y_j$$

since $X_i X_i^T = n$ for $n$-dimensional bipolar vectors. The first term is the amplified signal $n Y_i$; the second is the crosstalk noise.

Page 22: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

Correction coefficients $c_{ij}$:

$$c_{ij} = X_i X_j^T = X_j X_i^T = c_{ji} \tag{4-149}$$

They make each crosstalk vector $c_{ij} Y_j$ resemble the signal $Y_i$ in sign as much as possible. The same correction property holds in the backward direction.

Page 23: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

We define the Hamming distance between binary vectors $A_i$ and $A_j$ as

$$H(A_i, A_j) = \sum_{k=1}^{n} \left| a_k^i - a_k^j \right|$$

Page 24: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

$$c_{ij} = X_i X_j^T = [\text{number of common bits}] - [\text{number of different bits}]$$

$$= \left[n - H(A_i, A_j)\right] - H(A_i, A_j) = n - 2H(A_i, A_j)$$
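A short numeric check of the identity $c_{ij} = n - 2H(A_i, A_j)$ on random binary vectors:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
A_i, A_j = rng.integers(0, 2, n), rng.integers(0, 2, n)
X_i, X_j = 2 * A_i - 1, 2 * A_j - 1           # binary -> bipolar
H = np.sum(A_i != A_j)                        # Hamming distance
assert X_i @ X_j == n - 2 * H                 # c_ij = n - 2 H(A_i, A_j)
```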

Page 25: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

1) Suppose binary vector $A_i$ is close to $A_j$: then $H(A_i, A_j) < n/2$. Geometrically, the two patterns are less than half their space away from each other, so $c_{ij} > 0$. In the extreme case $A_j = A_i$ (and so $Y_j = Y_i$), $c_{ij} = n$, and the crosstalk term $c_{ij} Y_j = n Y_i$ reinforces the signal.

2) In the rare case that $H(A_i, A_j) = n/2$, the result is $c_{ij} = 0$, and those correction coefficients simply drop out of the sum.

Page 26: Synaptic Dynamics: Unsupervised Learning

Hebbian correlation decoding

3) Suppose $A_i$ is far away from $A_j$: then $H(A_i, A_j) > n/2$, so $c_{ij} < 0$. In the extreme case $H(A_i, A_j) = n$ (complementary patterns): $c_{ij} = -n$, and $c_{ij} Y_j = -n Y_j = n Y_j^c$, where $Y_j^c = -Y_j$ denotes the bipolar complement.

Page 27: Synaptic Dynamics: Unsupervised Learning

Hebbian encoding method

Convert each binary vector $A_i$ into a bipolar vector $X_i$, and sum the contiguous correlation-encoded associations:

$$T = \sum_{i=1}^{m-1} X_i^T X_{i+1} + X_m^T X_1$$

Passing patterns through $T$ (forward) and $T^T$ (backward) again decomposes into signal and crosstalk terms with correction coefficients $c_{ij} = n - 2H(A_i, A_j)$:

$$X_i T = n\,X_{i+1} + \sum_{j \neq i} c_{ij}\,X_{j+1}$$

$$X_i T^T = n\,X_{i-1} + \sum_{j \neq i} c_{ij}\,X_{j-1}$$

with the cyclic conventions $X_{m+1} = X_1$ and $X_0 = X_m$.

Page 28: Synaptic Dynamics: Unsupervised Learning

Hebbian encoding method

Example (P144): consider the three-step limit cycle

$$A_1 \to A_2 \to A_3 \to A_1$$

$$A_1 = (1\ 0\ 1\ 0), \quad A_2 = (1\ 1\ 0\ 0), \quad A_3 = (1\ 0\ 0\ 1)$$

Convert the bit vectors to bipolar vectors:

$$X_1 = (1\ {-1}\ 1\ {-1}), \quad X_2 = (1\ 1\ {-1}\ {-1}), \quad X_3 = (1\ {-1}\ {-1}\ 1)$$

Page 29: Synaptic Dynamics: Unsupervised Learning

Hebbian encoding method

Produce the asymmetric TAM matrix $T$:

$$T = X_1^T X_2 + X_2^T X_3 + X_3^T X_1$$

$$T = \begin{pmatrix} 3 & -1 & -1 & -1 \\ -1 & -1 & -1 & 3 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \end{pmatrix}$$

Page 30: Synaptic Dynamics: Unsupervised Learning

Hebbian encoding method

Passing the bit vectors $A_i$ through $T$ in the forward direction produces:

$$A_1 T = (2\ \ 2\ {-2}\ {-2}) \to (1\ 1\ 0\ 0) = A_2$$

$$A_2 T = (2\ {-2}\ {-2}\ \ 2) \to (1\ 0\ 0\ 1) = A_3$$

$$A_3 T = (2\ {-2}\ \ 2\ {-2}) \to (1\ 0\ 1\ 0) = A_1$$

This produces the forward limit cycle $A_1 \to A_2 \to A_3 \to A_1$.
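The whole example can be checked in a few lines; the zero-threshold step function is the assumed decoding rule:

```python
import numpy as np

A = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 1]])
X = 2 * A - 1                                   # bipolar versions X_i

# T = X_1^T X_2 + X_2^T X_3 + X_3^T X_1 (contiguous associations)
T = sum(np.outer(X[i], X[(i + 1) % 3]) for i in range(3))

step = lambda a: (a @ T > 0).astype(int)        # threshold at zero
print(step(A[0]), step(A[1]), step(A[2]))       # -> A_2, A_3, A_1
```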

Page 31: Synaptic Dynamics: Unsupervised Learning

Competitive Learning

The deterministic competitive learning law:

$$\dot{m}_{ij} = S_j(y_j)\,[S_i(x_i) - m_{ij}] \tag{4-165}$$

$$\dot{m}_{ij} = -S_j(y_j)\,m_{ij} + S_i(x_i)\,S_j(y_j) \tag{4-166}$$

We see that the competitive learning law uses the nonlinear forgetting term $-S_j(y_j)\,m_{ij}$.

Page 32: Synaptic Dynamics: Unsupervised Learning

Competitive Learning

The Hebbian learning law uses the linear forgetting term $-m_{ij}$. So the two laws differ in how they forget, not in how they learn. In both cases, when $S_j = 1$ (when the $j$th competing neuron wins) the synaptic value $m_{ij}$ encodes the forcing signal $S_i$, and encodes it exponentially quickly.

Page 33: Synaptic Dynamics: Unsupervised Learning

3. Competitive Learning

⑴ Competition as indication;
⑵ Competition as correlation detection;
⑶ Asymptotic centroid estimation;
⑷ Competitive covariance estimation.

Page 34: Synaptic Dynamics: Unsupervised Learning

Competition as indication

Centroid estimation requires that the competitive signal $S_j$ approximate the indicator function $I_{D_j}$ of the locally sampled pattern class $D_j$:

$$S_j(y_j) = I_{D_j}(x) \tag{4-168}$$

Page 35: Synaptic Dynamics: Unsupervised Learning

Competition as indication

If the sample pattern $x$ comes from region $D_j$, the $j$th competing neuron in $F_Y$ should win, and all other competing neurons should lose. In practice we usually use the random linear competitive learning law and a simple additive model:

$$S_j\!\left(x m_j^T + f_j\right) = I_{D_j}(x) \tag{4-169}$$

Page 36: Synaptic Dynamics: Unsupervised Learning

Competition as indication

The inhibitive-feedback term $f_j$ equals the additive sum of synapse-weighted signals:

$$f_j = \sum_{k=1}^{p} s_{kj}\,S_k(y_k) \tag{4-170}$$

which reduces to $f_j = s_{jj}$ if the $j$th neuron wins, and to $f_j = s_{kj}$ if instead the $k$th neuron wins, since the winning signal equals 1 and all others equal 0.

Page 37: Synaptic Dynamics: Unsupervised Learning

Competition as correlation detection

The metrical indicator function:

$$S_j(y_j) = \begin{cases} 1 & \text{if } \|x - m_j\|^2 = \min_k \|x - m_k\|^2 \\ 0 & \text{if } \|x - m_j\|^2 > \min_k \|x - m_k\|^2 \end{cases} \tag{4-171}$$

If the input vector $x$ is closer to synaptic vector $m_j$ than to all other stored synaptic vectors, the $j$th competing neuron wins.

Page 38: Synaptic Dynamics: Unsupervised Learning

Competition as correlation detection

Using the equinorm property, we can derive the equivalent equalities (P147):

$$\|x - m_j\|^2 = \min_k \|x - m_k\|^2 \tag{4-174}$$

$$\Longleftrightarrow\quad -x m_j^T = \min_k \left(-x m_k^T\right) \tag{4-178}$$

$$\Longleftrightarrow\quad x m_j^T = \max_k\, x m_k^T \tag{4-179}$$

Page 39: Synaptic Dynamics: Unsupervised Learning

Competition as correlation detection

From the above equalities: the $j$th competing neuron wins if and only if the input signal or pattern $x$ correlates maximally with $m_j$. The cosine law:

$$x m_j^T = \|x\|\,\|m_j\|\cos(x, m_j) \tag{4-180}$$
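A numeric check that, for equinorm synaptic vectors, the metrical winner (4-174) and the correlation winner (4-179) coincide; the dimensions and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 5, 3
m = rng.normal(size=(p, n))
m /= np.linalg.norm(m, axis=1, keepdims=True)   # equinorm: ||m_j|| all equal
x = rng.normal(size=n)

by_distance = np.argmin(((x - m) ** 2).sum(axis=1))
by_correlation = np.argmax(m @ x)               # x m_j^T maximal
assert by_distance == by_correlation            # (4-174) <=> (4-179)
```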

Page 40: Synaptic Dynamics: Unsupervised Learning

Asymptotic centroid estimation

The simpler competitive law:

$$\dot{m}_j = I_{D_j}(x)\,[x - m_j] + n_j \tag{4-181}$$

We use the equilibrium condition:

$$\dot{m}_j = 0 \tag{4-182}$$

Page 41: Synaptic Dynamics: Unsupervised Learning

Asymptotic centroid estimation

Taking expectations and solving for the equilibrium synaptic vector:

$$0 = \int_{R^n} I_{D_j}(x)\,(x - \bar{m}_j)\,p(x)\,dx = E[n_j] \tag{4-186}$$

$$\bar{m}_j = \frac{\int_{D_j} x\,p(x)\,dx}{\int_{D_j} p(x)\,dx}$$

This shows that $\bar{m}_j$ equals the centroid of $D_j$.
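A Monte-Carlo sketch of centroid convergence, assuming all samples fall in $D_j$ (so $I_{D_j} = 1$) and suppressing the noise term; the sampled density is illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
m_j, c = np.zeros(2), 0.01
samples = rng.normal(loc=[3.0, -1.0], size=(20000, 2))  # p(x) over D_j

for x in samples:
    m_j += c * (x - m_j)          # (4-181) with I_Dj = 1 and no noise

print(m_j.round(2))               # ~ [3, -1], the centroid of D_j
```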

Page 42: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

Centroids provide a first-order estimate of how the unknown probability density function $p(x)$ behaves in the regions $D_j$; local covariances provide a second-order description.

Page 43: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

We extend the competitive learning laws to asymptotically estimate the local conditional covariance matrices $K_j$:

$$K_j = E\!\left[(x - \bar{x}_j)(x - \bar{x}_j)^T \mid D_j\right] \tag{4-187}$$

$$\bar{x}_j = \int_{D_j} x\,p(x \mid D_j)\,dx = E[x \mid x \in D_j] \tag{4-189}$$

where $\bar{x}_j$ denotes the centroid of $D_j$.

Page 44: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

The fundamental theorem of estimation theory [Mendel 1987]: the conditional mean minimizes the mean-squared error,

$$E\!\left[(y - E[y \mid x])(y - E[y \mid x])^T\right] \leq E\!\left[(y - f(x))(y - f(x))^T\right] \tag{4-190}$$

where $f(x)$ is any Borel-measurable random vector function.
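A Monte-Carlo illustration of (4-190) in the scalar case, where $E[y \mid x] = x$ by construction and any other choice of $f$ does worse; the model and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=100_000)
y = x + 0.5 * rng.normal(size=x.size)        # E[y|x] = x by construction

mse_cond_mean = np.mean((y - x) ** 2)        # f(x) = E[y|x]: MSE ~ 0.25
mse_other = np.mean((y - 0.8 * x) ** 2)      # another Borel f: MSE ~ 0.29
print(mse_cond_mean, mse_other)              # conditional mean wins
```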

Page 45: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

At each iteration we estimate the unknown centroid $\bar{x}_j$ as the current synaptic vector $m_j$; in this sense $K_j$ becomes an error conditional covariance matrix. The stochastic difference-equation algorithm:

$$m_j(k+1) = m_j(k) + c_k\,[x_k - m_j(k)] \tag{4-191}$$

$$K_j(k+1) = K_j(k) + d_k\!\left[(x_k - m_j(k))(x_k - m_j(k))^T - K_j(k)\right] \tag{4-192}$$

Page 46: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

$d_k$ denotes an appropriately decreasing sequence of learning coefficients in (4-192). If the $j$th neuron in $F_Y$ loses the metrical competition:

$$m_j(k+1) = m_j(k)$$

$$K_j(k+1) = K_j(k)$$

A sketch of the full two-equation algorithm follows.
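A sketch of (4-191)-(4-192), simplified to a single always-winning neuron (so $I_{D_j} = 1$) with the decreasing coefficients $c_k = d_k = 1/k$; the sampled density and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
true_cov = np.array([[1.0, 0.3],
                     [0.3, 0.5]])
m = np.zeros(2)                       # synaptic vector m_j
K = np.eye(2)                         # covariance estimate K_j

for k in range(1, 50_001):
    x = rng.multivariate_normal([2.0, 0.0], true_cov)
    c_k = d_k = 1.0 / k               # decreasing learning coefficients
    e = x - m                         # error about the current centroid
    m += c_k * e                      # (4-191): centroid estimate
    K += d_k * (np.outer(e, e) - K)   # (4-192): covariance estimate

print(K.round(2))                     # approaches true_cov
```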

Page 47: Synaptic Dynamics: Unsupervised Learning

Competitive covariance estimation

The algorithm (4-192) corresponds to the stochastic differential equation

$$\dot{K}_j = I_{D_j}(x)\!\left[(x - \bar{x}_j)(x - \bar{x}_j)^T - K_j\right] + N_j \tag{4-195}$$

(in practice with the centroid $\bar{x}_j$ replaced by the current estimate $m_j$). Taking expectations at stochastic equilibrium, $0 = E[\dot{K}_j]$, and solving yields the equilibrium covariance

$$\bar{K}_j = \frac{\int_{D_j} (x - \bar{x}_j)(x - \bar{x}_j)^T\,p(x)\,dx}{\int_{D_j} p(x)\,dx} \tag{4-199}$$