

Chapter 5

A HIPPOCAMPUS-NEOCORTEX MODEL FOR CHAOTIC ASSOCIATION

Takashi Kuremoto,1 Tsuyoshi Eto,2 Kunikazu Kobayashi1 and Masanao Obayashi1
1 Yamaguchi Univ., Tokiwadai 2-16-1, Ube, Japan∗

{wu, koba, m.obayas}@yamaguchi-u.ac.jp

2CEC-OITA Ltd., Kumano Ohira 21-1, Kitsuki, Japan

Abstract To realize a mutual association function, we propose a hippocampus-neocortex model with a multi-layered chaotic neural network (MCNN). The model is based on Ito et al.'s hippocampus-cortex model, which is able to recall temporal patterns and form long-term memory. The MCNN consists of plural chaotic neural networks (CNNs), each CNN layer of which is the classical associative model proposed by Aihara et al. The MCNN realizes mutual association using incremental and relational learning between layers, and it is introduced into CA3 of the hippocampus. This chaotic hippocampus-neocortex model is intended to retrieve multiple related time-series patterns that were stored (experienced) before, when one common pattern is presented. Computer simulations verified the efficiency of the proposed model.

Keywords: chaotic neural network, associative memory, time-series pattern, mutual association, hippocampus, neocortex, long-term memory, episode memory, consolidation

1. Introduction

Experimental physiological and anatomical studies suggest that memory functions of the brain are executed in the neocortex and hippocampus

∗This work was partially supported by a Grant-in-Aid for Scientific Research for the Encouragement of Young Scientists (B) (No. 15700161) from the Ministry of Education, Culture, Sports, Science, and Technology, Japan

T. Kuremoto et al.: A Hippocampus-Neocortex Model for Chaotic Association, Studies in Computational Intelligence (SCI) 35, 111–133 (2007). www.springerlink.com
© Springer-Verlag Berlin Heidelberg 2007


[30, 9, 7, 37]. Although the mechanism of learning and memory is not understood completely, the process of memorization can be considered roughly as: sensory receptor → sensory memory (in primary cortex) → short-term memory (in neocortex) → intermediate-term memory (in a dialogue between the hippocampus and the neocortex) → long-term memory (in neocortex) [40, 30, 7, 37, 19]. Based on this knowledge, Ito et al. proposed a hippocampus-neocortex model for episodic memory [18, 20], and a hippocampus-cortex model for long-term memory [19]. Meanwhile, as chaotic phenomena are observed in neural activity, many chaotic neural networks have been proposed over the decades [8, 38, 3, 2–1, 41, 42, 35, 36, 26, 43, 23]. For chaotic memory systems in particular, there are the chaotic neural networks (CNN) given by Aihara and his colleagues [3, 2], the chaotic memory map given by Tsuda [42], the transient-associative network (TCAN) given by Lee [26], advanced versions of Aihara's models and their applications [35, 36, 23], and so on. These chaotic models provide an auto-associative function, recalling input patterns as short-term memory.

Though the details of the neocortex, the hippocampus and the communication between them are still poorly understood, recent research shows the important role of the hippocampus in the formation of long-term memory in the neocortex [7, 37]. Some neural network models of hippocampal functions have been developed as memory systems [4, 32, 33]. Here, we assume in particular that there is a chaotic circuit in CA3 of the hippocampus, and improve Ito et al.'s model [19] using a multi-layered chaotic neural network (MCNN) [23]. The new chaotic model provides one-to-many retrieval of time-series patterns by its incremental and relational learning between chaotic neural network (CNN) layers [24]. The chaotic hippocampus-neocortex model is designed to realize mutual association of time-series patterns and to form long-term memory; these functions exist in the human brain, but their mechanism is not understood completely.

2. Classical Models

In this section, the hippocampus-cortex model proposed by Ito et al. [19] is introduced first, then the chaotic neural network (CNN) proposed by Aihara et al. [3, 2] is described, and finally the multi-layered chaotic neural network (MCNN) proposed by us [23] is addressed.

Model of Ito et al.

The original hippocampus-cortex model of Ito et al. is presented in Fig. 5.1 [19]. The signal flow of the system is: input patterns (Input layer) → sensory memory (Cortex 1) → short-term memory (Cortex


[Figure: Association Cortex (Input, Cortex 1, Cortex 2) and Hippocampus (DG, CA3, CA1)]

Figure 5.1. Structure of the hippocampus-cortex model proposed by Ito et al. (2000).

2) and intermediate-term memory (DG) → Hebbian learning (CA3) → decoding (CA1) → long-term memory (Cortex 2). The long-term memory is stored in Cortex 2 at last, and as the output of the system, the stored temporal patterns are recalled when one of the patterns is presented as input.

We repeated the computer simulation of this model and obtained the same results as [19]. When we presented an input pattern that was stored in two different time-series patterns, however, the system failed to retrieve the two temporal patterns correctly. The reason may be that the energy of the internal state function dropped to a convergence point corresponding to the input pattern.

Meanwhile, there are many other remarkable theoretical approaches to associative memory [3, 2, 26]. Classical chaotic neural models are able to retrieve stored time-series patterns from an external stimulus. However, the retrieval is a dynamical short-term memory. Considering the hippocampus's ability to exchange short-term memory into long-term memory [30, 9, 7, 37], here we introduce a multi-layered chaotic neural network (MCNN) [23] into the conventional hippocampus-cortex model to realize mutual association of different time-series patterns (one-to-many retrieval). The new chaotic hippocampus-neocortex


model is expected to form long-term memory in the neocortex and to realize mutual association during retrieval.

CNN

One popular conventional chaotic neural network (CNN) for auto-associative memory is given by Aihara and Adachi [3, 2]. The network is a kind of interconnected neural network, i.e., each neuron connects to the others and to itself. Different from the classic associatron [34] and the static association of the Hopfield model [14, 15], the CNN has a recurrent structure, and for a constant external stimulation it gives a chaotic response when the parameter values are adjusted. The CNN model can be simply given by:

x_i(t+1) = f( y_i(t+1) + z_i(t+1) )                        (5.1)
y_i(t+1) = k_r y_i(t) − α x_i(t) + a_i                     (5.2)
z_i(t+1) = k_f z_i(t) + Σ_{j=1}^{n} w_ij x_j(t)            (5.3)

where x_i(t): output value of the ith neuron at time t; n: number of inputs; w_ij: connection weight from the jth neuron to the ith neuron; y_i(t): internal state of the ith neuron relating to refractoriness; z_i(t): internal state of the ith neuron relating to reciprocal action; α: threshold of the ith neuron; k_f, k_r: damping coefficients (when their values equal zero, the network transforms into the Hopfield model); a_i: term whose value is given by the sum of the threshold and the external input. The connection weight w_ij is defined as:

w_ij = (1/m) Σ_{p=1}^{m} x_i^p x_j^p                       (5.4)

where x_i^p: ith element of the pth stored pattern; m: number of stored patterns. The input-output function of Eq. 5.1 is the sigmoid curve given by Eq. 5.5:

f(x) = 1 / (1 + exp(−x/ε))                                 (5.5)

where ε is a steepness parameter.

Using 100 chaotic neurons (i = 1, 2, ..., 100; in Eq. 5.3, n = 100) and 4 stored patterns (in Eq. 5.4, m = 4) similar to the first time-series patterns in Fig. 5.5, Adachi and Aihara [2] simulated the dynamical


[Figure: (a) CNN 1 layer, CNN 2 layer and Output layer; (b) inter-layer connection weights W12_ij and W21_ij between the CNN layers]

Figure 5.2. Proposed structure of the CA3 layer model: (a) multi-layered chaotic neural network (MCNN); (b) connections between MCNN layers.

association of their CNN. In the numerical simulation experiment, with α = 10.0, a_i = 2.0, k_f = 0.2, k_r = 0.9, the network showed "non-periodic associative dynamics", i.e., to an unstored pattern used as an external stimulation, the network responded with a sequence of spatio-temporal output patterns that included the stored patterns and their transient patterns.
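As an illustration, the CNN update of Eqs. 5.1-5.5 can be sketched in a few lines of NumPy. This is a minimal sketch, not the authors' code: the stored patterns are random, ε is an assumed value, and the clipping inside the sigmoid is only a numerical safeguard.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                                     # number of neurons
m = 4                                       # number of stored patterns
patterns = rng.integers(0, 2, size=(m, n)).astype(float)

W = patterns.T @ patterns / m               # autocorrelation weights, Eq. 5.4

alpha, kf, kr, eps = 10.0, 0.2, 0.9, 0.015  # illustrative parameter values
a = np.full(n, 2.0)                         # threshold + external input term

def f(x):
    # sigmoid of Eq. 5.5; clipping avoids overflow for large |x|/eps
    return 1.0 / (1.0 + np.exp(-np.clip(x / eps, -500.0, 500.0)))

x = rng.random(n)
y = np.zeros(n)
z = np.zeros(n)
for t in range(50):
    y = kr * y - alpha * x + a              # refractoriness, Eq. 5.2
    z = kf * z + W @ x                      # recurrent feedback, Eq. 5.3
    x = f(y + z)                            # output, Eq. 5.1
```

With k_f = k_r = 0 the same loop reduces to a Hopfield-style update, as the text notes.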

MCNN

Because real neurons act far more complicatedly than artificial neurons constructed from simple threshold elements, chaotic neural models have also been proposed [3, 2]. To realize a mutual association function, for instance the formation of a conditioned reflex (Ivan Pavlov), we proposed to combine multiple classical CNN layers into an associative model, the MCNN (Fig. 5.2).

In the MCNN, neurons on each CNN layer and between the layers are fully connected, and the dynamics is as follows:

x_i(t+1) = f( y_i(t+1) + z_i(t+1) + γ·v_i(t+1) )           (5.6)
y_i(t+1) = k_r y_i(t) − α x_i(t) + a_i                     (5.7)
z_i(t+1) = k_f z_i(t) + Σ_{j=1}^{n} w_ij x_j(t)            (5.8)
v_i(t+1) = k_e v_i(t) + Σ_{j=1}^{n} W*_ij x*_j(t)          (5.9)


where x_i(t): output value of the ith neuron at time t; n: number of inputs; w_ij: connection weight from the jth neuron to the ith neuron; y_i(t): internal state of the ith neuron relating to refractoriness; z_i(t): internal state of the ith neuron relating to reciprocal action; v_i(t): internal state of the ith neuron relating to reciprocal action from the other layer; α: threshold of the ith neuron; k_f, k_r, k_e: damping rates; a_i: term given by the sum of the threshold and the external input; γ: rate of influence from the other layer; W*_ij: connection weight from the jth neuron of the other layer to the ith neuron; x*_j(t): output value of the jth neuron of the other layer at time t. The connection weight w_ij is defined as:

w_ij = (1/m) Σ_{p=1}^{m} (2x_i^p − 1)(2x_j^p − 1)          (5.10)

where x_i^p: ith element of the pth stored pattern (0 or 1); m: number of stored patterns. The input-output function is as follows:

f(x) = (1 − exp(−x/ε)) / (1 + exp(−x/ε))                   (5.11)

where ε is a constant.

When a new pattern is input to the MCNN, additive storage is executed on each CNN layer through a_i (i = 1, ..., n). After the states of the system hold the pattern, Hebbian learning, ∆w_ij, is executed as:

∆w_ij = (1/m) x_i x_j                                      (5.12)

here, m is the number of stored patterns.

The connection weights W12_ij and W21_ij relate the patterns stored in the different layers of the MCNN. Using relational Hebbian learning, a 2-layer MCNN, for example, stores the time-series patterns as:

∆W12_ij = β · x1_i x2_j,   ∆W21_ij = β · x2_i x1_j         (5.13)

where β is the learning rate, x1_i is the output value of the ith neuron of CNN1, and x2_i is the output value of the ith neuron of CNN2. The input-output function of Eq. 5.6 is also the sigmoid curve given by Eq. 5.14:

f(x) = (1 − exp(−x/ε)) / (1 + exp(−x/ε))                   (5.14)

where ε is a steepness parameter. We use 100 such neurons to construct one CNN layer in the simulation experiment described later.


[Figure: left, arrows show the direction of relational learning between patterns A, B, C, D stored alternately in CNN 1 and CNN 2; right, arrows show the flow of retrieval among the same patterns]

Figure 5.3. Flow of learning (left) and retrieving (right).

Learning (storage) Algorithm. When different time-series patterns are stored in the MCNN, the network is expected to recall them automatically and mutually when one of the patterns is presented (Fig. 5.3). When a new pattern is stored in one CNN layer, the additively stored pattern is input to the CNN layer as an external input through a_i (i = 1, ..., n). After the states of the system hold the stored pattern, Hebbian learning, ∆w_ij, is executed as:

∆w_ij = (1/m) x_i x_j                                      (5.15)

here, m is the number of stored patterns.

The connection weights W12_ij and W21_ij relate the patterns stored in different layers of the MCNN. Using the following relational Hebbian learning, a 2-layer MCNN, for example, stores the time-series patterns by:

∆W12_ij = β · x1_i x2_j                                    (5.16)
∆W21_ij = β · x2_i x1_j                                    (5.17)

where β is the learning rate, x1_i is the output value of the ith neuron of CNN1, and x2_i is the output value of the ith neuron of CNN2.
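The relational update of Eqs. 5.16-5.17 amounts to two outer products. A minimal sketch with illustrative layer outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
beta = 0.1                                  # learning rate

x1 = rng.integers(0, 2, size=n).astype(float)   # illustrative output of CNN1
x2 = rng.integers(0, 2, size=n).astype(float)   # illustrative output of CNN2

W12 = np.zeros((n, n))
W21 = np.zeros((n, n))

W12 += beta * np.outer(x1, x2)              # Eq. 5.16: dW12_ij = beta * x1_i * x2_j
W21 += beta * np.outer(x2, x1)              # Eq. 5.17: dW21_ij = beta * x2_i * x1_j
```

Note that the two increments are transposes of each other, which is what makes the association bidirectional.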

Control Algorithm. When a pattern is recalled by a CNN layer in the MCNN, it acts as an external stimulation to the other layers. So, to execute


[Figure: ∆x(t) = Σ_{i=0}^{n} (x_i(t+1) − x_i(t)) is computed from the internal state of a CNN layer; if ∆x > θ the layer goes to the chaotic state, and if ∆x ≤ θ it goes to the non-chaotic state]

Figure 5.4. Network state control algorithm.

the relational learning between them, the retrieval of that layer must be held constant. Using exponential feedback on the system parameters, Mizutani et al. controlled chaos in a CNN [31]. A discussion of transient chaos is also given by Yang and Yuan [45]. However, considering that "the state of the network never changes once the network retrieves a stored pattern corresponding to a stable equilibrium point of the network dynamics" [3, 2], we can simply control each CNN layer's state with the algorithm shown in Fig. 5.4. The state of a CNN layer is evaluated by ∆x(t), the total temporal change of the internal state x(t); when ∆x(t) ≤ θ we set the parameters k_r, k_f to zero to stop the chaotic association dynamics. After the relational learning of the other CNN layers, we return the network to chaotic states by increasing the values of the parameters k_r, k_f again. The influence of parameter deviations in the CNN has recently been discussed in detail by Adachi [1].
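The control rule of Fig. 5.4 can be sketched as a small helper. The function name and default parameter values are illustrative, following the simulation settings quoted in this chapter (k_r = 0.9, k_f = 0.2, θ = 0.0):

```python
import numpy as np

theta = 0.0  # threshold of the control algorithm (Fig. 5.4)

def control_state(x_prev, x_new, kr0=0.9, kf0=0.2):
    """Return (kr, kf) for the next steps of a CNN layer."""
    dx = float(np.sum(x_new - x_prev))      # total change of internal state
    if dx <= theta:
        return 0.0, 0.0                     # stop chaos: hold a stable retrieval
    return kr0, kf0                         # keep / restore chaotic dynamics
```

Zeroing k_r and k_f collapses the layer to the Hopfield-like stable regime, so the recalled pattern stays fixed while the other layer's relational learning runs.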

Mutual Association Simulation of MCNN. Fig. 5.5 shows the 2 time-series patterns used in our computer simulation. The first time-series patterns, for CNN1, are the same as those used in the association simulation of Adachi and Aihara [2]. The other time-series patterns, for CNN2, are orthogonal ones. The assumed association mechanism is similar to how we remember the sounds and forms of the uppercase English letters A, B, C, D, ..., Z, while also remembering those of the lowercase a, b, c, ..., z. If we hear the pronunciation of one letter, then the figure of an uppercase or a lowercase letter


Time series of memory patterns for CNN1:

Time series of memory patterns for CNN2:

Figure 5.5. Time series patterns for mutual associative memory.

may come to mind naturally, and the next letter (uppercase or lowercase) may also be recollected. The two time-series patterns in Fig. 5.5 were input to the MCNN in the experiment according to the storage order shown in Fig. 5.6. The memorization order was: wave (CNN1) → star (CNN2) → triangle (CNN1) → X (CNN2) → ... (Fig. 5.6). In the control algorithm (Fig. 5.4), the threshold θ = 0.0, and the number of steps in the chaotic and non-chaotic states was 7 and 13, respectively. The number of chaotic neurons on each CNN layer was 100. The parameters of the MCNN were: k_f = 0.2, k_e = 0.2, k_r = 0.0, 0.9, 0.1 (in different modes to control the internal state of the network; when k_r = 0.0, a CNN layer had a stable equilibrium), a_i = 0.6 (initial value, i = 1, 2, ..., 100), threshold of each neuron α = 10.0, and learning rate β = 0.1.

The recall result is shown in Fig. 5.7. In fact, at TIME = 1, 5, 8, 11, ..., relational learning was executed on CNN1 (wave), CNN2 (star), CNN1 (triangle), CNN2 (X), ..., respectively. After 21 learning steps, the MCNN had stored both of the 2 time-series patterns. At TIME = 22, the input pattern wave was presented to CNN1; then at TIME = 23, 31, 33, ..., the patterns that appeared were star (CNN2), triangle (CNN1), X (CNN2), ..., respectively. At last, all stored patterns (Fig. 5.5) were mutually recalled after 40 steps in the order shown in Fig. 5.7.

This result shows that the MCNN retrieved many stored patterns from only one held pattern presented at first. The simulation confirmed the efficiency of the mutual association ability of the proposed MCNN for plural stored time-series patterns.


Figure 5.6. Input order of time series patterns for MCNN.

3. A Chaotic Model of Hippocampus-Neocortex

The hippocampus is considered to serve as an exchange organ between short-term memory and long-term memory [7, 37]. Long-term potentiation (LTP), a phenomenon observed especially in the CA3 layer of the hippocampus, may hold the key to long-term memory formation. Here, we propose a chaotic model of hippocampus-neocortex by introducing the MCNN into CA3 of Ito et al.'s model. Fig. 5.8 shows its structure. Neurons on each layer of the MCNN accept signals from DG, then provide a sparse-representation output from the Output layer to CA1. The dynamics of this model is described first in this section; then a consolidation and mutual association simulation is performed to confirm the performance of the proposed model.

Association Cortex

The dynamics of the association cortex (left of Fig. 5.8) is described in the same way as in Ito et al.'s model [19]:

I_i(t) = 1 (excitatory), 0 (inhibitory)                    (5.18)

x^cx1_i(t) = I_i(t)                                        (5.19)

x^cx2_i(t) = f( Σ_{j=0}^{N} w^{cx2·cx2}_ij x^cx2_j(t−1) + w^{cx2·cx1} x^cx1_i(t) + w^{cx2·c1} x^c1_i(t) − θ^cx )   (5.20)

here, I_i(t): ith element of the input; x^cx1_i(t): output of the ith neuron in CX1; x^cx2_i(t): output of the ith neuron in CX2; w^{cx2·cx2}_ij: weight of the connection


[Figure: association outputs of CNN1 and CNN2 at TIME = 0 to 40]

Figure 5.7. Association results of CNN1 (left column) and CNN2 (right column).


Figure 5.8. Structure of a chaotic model of hippocampus-neocortex.

from the jth to the ith neuron in CX2 (variable); w^{cx2·cx1}: weight of the connection from CX1 to CX2 (fixed); w^{cx2·c1}: weight of the connection from CA1 to CX2 (fixed); x^c1_i(t): output of the ith neuron in CA1 in the hippocampus; θ^cx: threshold; N: number of neurons in CX1 and CX2; f: step function.

The learning of the connection weights in CX2 follows the Hebb rule:

∆w^{cx2·cx2}_ij = α_hc · x^cx2_i(t) x^cx2_j(t−1)           (5.21)

where α_hc is a learning rate.

Hippocampus.

DG
Competitive learning is executed in this layer. The input from the association cortex is transformed into internal states (pattern encoding).

k = argmax_i Σ_{j=0}^{N} w^{dg·cx2}_ij x^cx2_j(t)          (5.22)

x^dg_i(t) = random (initial); f( Σ_{j=0}^{N} w^{dg·cx2}_ij x^cx2_j − θ^dg ) (usual)   (5.23)


The learning rule of the connection weight from CX2 to DG, w^{dg·cx2}_ij, is

∆w^{dg·cx2}_ij = β_hc · x^dg_i(t) x^cx2_j(t).              (5.24)

Here, β_hc is a constant, α_hc < β_hc.
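The winner-take-all encoding of Eq. 5.22 and the Hebbian update of Eq. 5.24 can be sketched as follows; the weights and input pattern here are random placeholders, and the layer sizes follow the parameter list given later (N = 50, n = 30):

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 50, 30                               # cortex size, hippocampus size
W = rng.random((n, N))                      # placeholder for w_dg_cx2 weights
x_cx2 = rng.integers(0, 2, size=N).astype(float)

k = int(np.argmax(W @ x_cx2))               # Eq. 5.22: winner unit
x_dg = np.zeros(n)
x_dg[k] = 1.0                               # sparse internal representation

beta_hc = 1.0
W += beta_hc * np.outer(x_dg, x_cx2)        # Eq. 5.24: strengthen the winner's weights
```

Since x_dg is one-hot, the outer product updates only the winner's row, which is the competitive part of the learning.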

CA3
Feedback connections exist in CA3, and they produce an association function like that of the Hopfield model. Ito et al. noted this; however, they presented the dynamics of CA3 by a step function only. We suppose that chaotic memorization phenomena exist in CA3, and apply the MCNN, which provides mutual association, to the CA3 layer. Through the learning of CA3 (self-feedback connections), the intermediate patterns are formed.

k = argmax_i Σ_{j=0}^{n} w^{c3out·cnn1}_ij (2 x^cnn1_j(t) − 1)   (5.25)

or

k = argmax_i Σ_{j=0}^{n} w^{c3out·cnn2}_ij (2 x^cnn2_j(t) − 1)   (5.26)

x^c3out_i(t) = 1 (i = k), 0 (i ≠ k)                        (5.27)

where
w^{c3out·cnn1}_ij: weight of the connection from the jth neuron of the CNN1 layer to the ith neuron of the Output layer in CA3;
w^{c3out·cnn2}_ij: weight of the connection from the jth neuron of the CNN2 layer to the ith neuron of the Output layer in CA3;
x^cnn1_i(t): output of the ith neuron of the CNN1 layer in CA3 (given by Eqs. 5.6-5.9);
x^cnn2_i(t): output of the ith neuron of the CNN2 layer in CA3 (given by Eqs. 5.6-5.9);
x^c3out_i(t): output of the ith neuron of the Output layer in CA3.

Since the time-series patterns are stored in the MCNN alternately, the CNN1 and CNN2 layers become excited alternately, and Eq. 5.25 and Eq. 5.26 are applied in alternation. This structure is intended to realize mutual association like the successful behavior of the bidirectional associative memory model (BAM) [22, 10, 27].

The learning rule for the self-feedback connection weights is

∆w^{c3·c3}_ij = β_hc · x^c3_i(t) x^c3_j(t−1).              (5.28)


Here, β_hc is the same as in DG, and n is the number of neurons in the hippocampus.

CA1
The internal states of the hippocampus are decoded into output patterns. The input from the association cortex acts as a teacher signal.

x^c1_i(t) = f( Σ_{j=0}^{n} w^{c1·c3}_ij x^c3_j(t) + w^{c1·cx1} x^cx1_j(t) − θ^c1 ).   (5.29)

From CA3 to CA1, the connections w^{c1·c3}_ij learn according to

∆w^{c1·c3}_ij = β_hc · x^c1_i(t) x^c3_j(t).                (5.30)
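The CA1 decoding step of Eq. 5.29, with a step activation and the Cortex 1 teacher signal, followed by the learning of Eq. 5.30, can be sketched as follows; the weight matrix and patterns are illustrative placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 30, 50                               # hippocampus size, cortex size
W = rng.random((N, n))                      # placeholder for w_c1_c3 weights
w_teacher = 1.0                             # fixed teacher-signal weight
theta_c1 = 0.5                              # CA1 threshold

def step(u):
    return (u > 0.0).astype(float)          # step activation f

x_c3 = np.zeros(n)
x_c3[3] = 1.0                               # sparse CA3 output (one active unit)
x_cx1 = rng.integers(0, 2, size=N).astype(float)   # teacher pattern from Cortex 1

x_c1 = step(W @ x_c3 + w_teacher * x_cx1 - theta_c1)   # Eq. 5.29

beta_hc = 1.0
W += beta_hc * np.outer(x_c1, x_c3)         # Eq. 5.30: bind CA3 code to CA1 output
```

Because x_c3 is sparse, the Hebbian update associates a single internal code with the decoded cortical pattern.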

[Figure: Time Series A consists of Patterns A, B, C, D and Time Series B of Patterns A, E, F, G, presented over TIME = 1 to 8; both series begin with the common Pattern A]

Figure 5.9. Time-series patterns for one-to-many retrieval.

Consolidation and Mutual Association Simulation

One-to-many Time-series Patterns

We define one-to-many time-series pattern retrieval as follows: the same pattern exists in different time-series patterns, and by presenting


the pattern to the associative memory model (here, the proposed chaotic model of hippocampus-neocortex), all patterns in the different time series are recalled as the output of the system. In detail, input patterns in two time series were designed for this simulation; the first pattern of each time series is a common Pattern A (Fig. 5.9). Each time series (Time Series A and Time Series B) includes 4 orthogonal patterns. An input pattern, which is composed of 50 binary neurons, is presented temporally to the model. When more than 1 interval breaks the input temporally, the input of one time-series pattern is finished, and the next input will be a new time-series pattern (Fig. 5.10 - Fig. 5.12).

Simulation Process

The purpose and process of the computer simulation of the proposed model are described as follows:

Whether and how input time-series patterns are processed in the model: external stimuli, the time-series patterns described in the last section, are given to the Cortex 1 layer. DG transforms the sensory pattern into an internal state in the hippocampus. CA3 (MCNN) compresses the signals of DG, stores the internal states in its dynamical networks, and outputs them in simple forms. CA1 decodes the signals from the Output layer of the MCNN (CA3).

Whether and how long-term memory is formed in the model: a holding pattern, which is a common pattern existing in different time series, is input repeatedly to form intermediate-term memory in CA3 and Cortex 2, and long-term memory comes to be stored in Cortex 2 at last. The repeated stimulation can be considered an LTP-like phenomenon, as observed in the brain.

One-to-many time-series pattern retrieval result: after different time-series patterns have been presented and a common pattern is re-presented, whether the proposed chaotic model retrieves all patterns or not.

Parameters

The parameters of the proposed model in the simulation are given as follows:


[Figure: signal flow Input → DG → CA3 (Output layer of MCNN; CNN1 and CNN2) → CA1 → Cortex 2 over TIME = 0 to 85, stages I-V; Time Series A and B are input, chaotic retrieval of Time Series A follows, and Time Series A is stored as long-term memory]

Figure 5.10. Simulation result in case 1 (Time Series A is retrieved).


[Figure: same signal flow as Fig. 5.10; chaotic retrieval of Time Series B follows, and Time Series B is stored as long-term memory]

Figure 5.11. Simulation result in case 2 (Time Series B is retrieved).


[Figure: same signal flow as Fig. 5.10; the same encoding appeared for different input patterns (confusion: the same internal states), deficient patterns were stored as long-term memory, and retrieval failed]

Figure 5.12. Simulation result in case 3 (failed retrieval).


N = 50 : number of neurons in the association cortex
n = 30 : number of neurons in the hippocampus
w^{cx2·cx1}_ij = 1.0 : weight of the connection from Cortex 1 to Cortex 2
w^{cx2·c1}_ij = 1.0 : weight of the connection from the hippocampus to Cortex 2
α_hc = 0.0015 : learning rate in the association cortex
β_hc = 1.0 : learning rate in the hippocampus
θ^c3 = 0.5 : threshold of neurons in the CA3 layer
θ^c1 = 0.5 : threshold of neurons in the CA1 layer
S = 0.07 : maximum correlation between random patterns
θ^dg = 5.5 : threshold of neurons in DG
ε = 0.15 : slope of the sigmoid function
γ = 0.3 : influence rate from the other CNN layer in CA3

Results and Analysis

Fig. 5.10 shows the memory storage processes and retrieval results of the proposed model in the computer simulation. Chaotic retrieval and consolidation (LTP) could be observed when an input pattern was presented continuously. There were three kinds of retrieval results in 86 iterations when the orthogonal patterns were presented in the order shown in Fig. 5.9. The different results are as follows:

Result in case 1 (Fig. 5.10): Time Series A was stored as a long-term memory and retrieved when the common pattern was presented. The process of encoding and decoding is also shown in Fig. 5.10.

Result in case 2 (Fig. 5.11): Time Series B was stored as a long-term memory and retrieved when the common pattern was presented.

Result in case 3 (Fig. 5.12): deficient time-series patterns were stored confusedly, and the model failed to retrieve the correct time-series pattern when the common pattern was presented. The cause of the confusion can be observed at the TIME I stage in CA3 (Output layer of the MCNN), where the same encoding occurred for different time-series pattern signals from DG.

The ratio of these different kinds of retrieval is shown in Tab. 5.1. We also repeated the computer simulation of Ito et al.'s model using the same time-series patterns (Tab. 5.1). Time Series B could not be stored as long-term memory because of confusion with Time Series A, which was input before.

4. Conclusion

By combining plural layers of classical chaotic neural networks, a multi-layered chaotic neural network (MCNN) was constructed and introduced into a conventional hippocampus-cortex model. The proposed chaotic

Table 5.1. Simulation result: retrieval rate for one-to-many time-series patterns (unit: %)

Model                   1: Time Series A   2: Time Series B   3: Failed
Ito et al. [19]               0.0                 0.0           100.0
Kuremoto et al. [23]         60.0                 6.0            34.0

hippocampus-neocortex model is able not only to convert short-term memory into long-term memory, but also to realize mutual memorization and association for one-to-many time-series patterns. Computer simulations verified the efficiency of the proposed model.

Two problems with the model remain: the first is the investigation of its sensitivity to noise patterns; the second is the possibility of its application to context-dependent learning. Our artificial model of the memory system is expected to become more biologically plausible in the future, as developments in neurophysiology and neuroanatomy continue to provide a wealth of knowledge about the hippocampus and its related areas.

References

[1] M. Adachi: Influence of Parameter Deviations in an AssociativeChaotic Neural Network. In: Proceedings of IEEE InternationalConference on Neural Networks. Vol. 2 (2004) 921–924

[2] M. Adachi, K. Aihara: Associative dynamics in a chaotic neural network. Neural Networks. Vol. 10 (1997) 83–98

[3] K. Aihara, T. Takabe, M. Toyoda: Chaotic neural networks.Physics Letters A. Vol. 144 (6, 7) (1990) 333–340

[4] O. Araki, K. Aihara: Dynamical neural network model for hip-pocampal memory. IEICE Transactions on Fundamentals of Elec-tronics, Communications and Computer Science. Vol. E81A (9)(1998) 1824–1832

[5] L. Berthouze, A. Tijsseling: A neural model for context-dependent sequence learning. Neural Processing Letters. Vol. 23 (2006) 27–45

[6] S. Du, Z. Chen, Z. Yuan, X. Zhang: Sensitivity to noise inbidirectional associative memory (BAM). IEEE Transactions onNeural Networks. Vol. 16 (4) (2005) 887–898

[7] N. J. Fortin, S. P. Wright, H. Eichenbaum: Recollection-likememory retrieval in rats dependent on the hippocampus. Nature,Vol. 431, (2004) 188–191

[8] W. J. Freeman: Simulation of chaotic EEG patterns with a dynamicmodel of the olfactory system. Biological Cybernetics. Vol. 56 (1987)139–150

[9] R. E. Hampson, J. D. Simeral, S. A. Deadwyler: Distribution ofspatial and nonspatial information in dorsal hippocampus. Nature.Vol. 402 (1999) 610–614

[10] M. Hattori, M. Hagiwara: Multimodule associative memory formany-to-many associations. Neurocomputing. 19 (1998) 99–119

[11] Z. -Y. He, Y. -F. Zhang, L. -X. Yang: The Study of ChaoticNeural Network and Its Applications in Associative Memory. NeuralProcessing Letters. Vol. 9 (1999) 160–175

[12] D. O. Hebb: The Organization of Behavior. John Wiley & Sons, New York (1949)

[13] O. Heekuck and C. Suresh: Adaptation of the relaxation methodfor learning in bidirectional associative memory. IEEE Transactionson Neural Networks. Vol. 5 (1994) 576–583

[14] J. J. Hopfield: Neural Networks and Physical Systems with Emer-gent Collective Computational Abilities. In: Proceedings of the Na-tional Academy of Sciences U.S.A.. Vol. 79 (1982) 2554–2558

[15] J. J. Hopfield: Neurons with Graded Response Have CollectiveComputational Properties Like Those of Two-State Neurons. In:Proceedings of the National Academy of Sciences U.S.A.. Vol. 81(1984) 3088–3092

[16] M. Inoue, A. Nagayoshi: A Chaos Neuro-Computer. PhysicsLetters A. Vol. 158 (8) (1991) 373–376

[17] S. Ishii, K. Fukumizu, S. Watanabe: A Network of ChaoticElements for Information Processing. Neural Networks. Vol. 9 (1)(1996) 25–40

[18] M. Ito, J. Kuroiwa, S. Miyake: A model of hippocampus-neocortexfor episodic memory. In: Proceedings of The 5th International Con-ference on Neural Information Processing. 1P-16 (1998) 431–434

[19] M. Ito, S. Miyake, S. Inawashiro, J. Kuroiwa, Y. Sawada: Long-term memory of temporal patterns in a hippocampus-cortex model.(in Japanese) Technical Report of IEICE. NLP2000-18 (2000) 25–32

[20] M. Ito, J. Kuroiwa, S. Miyake: A neural network model of memorysystem using hippocampus. Electronics and Communications inJapan. Part 3, Vol. 86 (6) (2003) 86–97

[21] K. Kaneko: Clustering, Coding, Switching, Hierarchical Ordering,And Control in a Network of Chaotic Elements. Physica D. Vol. 41(2) (1990) 137–172

[22] B. Kosko: Bidirectional associative memories. IEEE Transactions on Systems, Man, and Cybernetics. Vol. 18 (1) (1988) 49–60

[23] T. Kuremoto, T. Eto, M. Obayashi, K. Kobayashi: A Multi-layered Chaotic Neural Network for Associative Memory. In Pro-ceedings of SICE Annual Conference, (2005) 1020–1023

[24] T. Kuremoto, T. Eto, M. Obayashi, K. Kobayashi: A ChaoticModel of Hippocampus-Neocortex. L. Wang, K. Chen, and Y. S.Ong (Eds.): Lecture Notes in Computer Science. Vol. 3610 (2005)439–448

[25] M. Kushibe, Y. Liu, J. Ohtsubo: Associative Memory with Spa-tiotemporal Chaos Control. Physical Review E. Vol. 53 (5) (1996)4502–4508

[26] R. S. T. Lee: A transient-chaotic autoassociative network (TCAN)based on Lee Oscillators. IEEE Transactions on Neural Networks.Vol. 15 (2004) 1228–1243

[27] C. S. Leung: Optimum learning for bidirectional associative memory in the sense of capacity. IEEE Transactions on Systems, Man, and Cybernetics. Vol. 24 (5) (1994) 791–796

[28] J. E. Lisman, L. M. Talamini, A. Raffone: Recall of memorysequences by interaction of the dentate and CA3: A revised modelof the phase precession. Neural Networks. Vol. 18 (2005) 1191–1201

[29] J. L. McClelland, N. H. Goddard: Considerations arising from acomplementary learning systems perspective on hippocampus andneocortex. Hippocampus. Vol. 6 (1996) 654–665

[30] B. Milner, S. Corkin, H. L. Teuber: Further analysis of thehippocampal amnesic syndrome: 14-year follow-up study of H. M..Neuropsychologia. Vol. 6 (1968) 317–338

[31] S. Mizutani, T. Sano, T. Uchiyama, N. Sonehara: ControllingChaos in Chaotic Neural Networks. Electronics and Communica-tions in Japan, Part III: Fundamental Electronics Science. Vol. 81(8) (1998) 73–82

[32] M. Morita: Computational study on the neural mechanism of se-quential pattern memory. Cognitive Brain Research. Vol. 5 (1–2)(1996) 137–146

[33] M. Morita: Hippocampus model of associative memory. (in Japanese) Transactions of IEICE. Vol. J72-D-II (2) (1989) 279–288

[34] K. Nakano: Associatron – a model of associative memory. IEEETransactions on Systems, Man, and Cybernetics. Vol. 2 (3) (1972)380–388

[35] M. Obayashi, K. Watanabe, K. Kobayashi: Chaotic neural networks with radial basis functions and its application to memory search problem. (in Japanese) Transactions of the Institute of Electrical Engineers of Japan. Vol. 120-C (2000) 1441–1446

[36] M. Obayashi, K. Yuda, R. Omiya, K. Kobayashi: Associative memory and mutual information in a chaotic neural network introducing function typed synaptic weights. (in Japanese) IEEJ Transactions on Electronics, Information and Systems. Vol. 123 (2003) 1631–1637

[37] M. Remondes, E. M. Schuman: Role for a cortical input to hip-pocampal area CA1 in the consolidation of a long-term memory.Nature. Vol. 431 (2004) 699–703

[38] C. A. Skarda, W. J. Freeman: How brains make chaos in order to make sense of the world. Behavioral and Brain Sciences. Vol. 10 (1987) 161–195

[39] D. S. Rizzuto, M. J. Kahana: An autoassociative neural networkmodel of paired-associate learning. Neural Computation. Vol. 13 (9)(2001) 2075–2092

[40] E. T. Rolls: A theory of hippocampal function in memory. Hip-pocampus. Vol. 6 (1996) 601–620

[41] I. Tsuda: Can stochastic renewal of maps be a model for cere-bral cortex? Physica D: Nonlinear Phenomena. Vol. 75, Issues 1–3,(1994) 165–178

[42] I. Tsuda: Dynamic link of memory – chaotic memory map innonequilibrium neural networks. Neural Networks. Vol. 5 (1992)313–326

[43] L. P. Wang, S. Li, F. Y. Tian, X. J. Fu: A noisy chaotic neural network for solving combinatorial optimization problems: stochastic chaotic simulated annealing. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics. Vol. 34 (2004) 2119–2125

[44] M. Yoshida, H. Hayashi, K. Tateno, S. Ishizuka: Stochasticresonance in the hippocampal CA3–CA1 model: a possible memoryrecall mechanism. Neural Networks. Vol. 15 (2002) 1171–1183

[45] X. S. Yang, Q. Yuan: Chaos and transient chaos in simple Hopfieldneural networks. Neurocomputing. Vol. 69 (2005) 232–241