modular neuron comprises of memristor-based synapse

Article in Neural Computing and Applications · September 2015. DOI: 10.1007/s00521-015-2047-0


REVIEW

Modular neuron comprises of memristor-based synapse

Jafar Shamsi1 • Amirali Amirsoleimani2 • Sattar Mirzakuchaki1 • Majid Ahmadi2

Received: 4 September 2014 / Accepted: 19 August 2015

© The Natural Computing Applications Forum 2015

Abstract The design of an analog modular neuron based on memristors is proposed here. Since neural networks are built by repeating basic blocks called neurons, modular neurons are essential for neural network hardware. In this work, modularity of the neuron is achieved through a distributed neuron structure. The major challenges in implementing synaptic operation are weight programmability, multiplication of the weight by the input signal, and nonvolatile weight storage. The memristor bridge synapse addresses all of these challenges. The proposed neuron is a modular neuron based on the distributed neuron structure that uses the memristor bridge synapse for its synaptic operations. To verify correct operation, the proposed neuron is used in a real-world neural network application, trained with the off-chip method. The results show 86.7 % correct classification and a mean square error of about 0.0695 for a 4-5-3 neural network built from the proposed modular neuron.

Keywords Artificial neural network · Synaptic operation · Distributed neuron · Memristor bridge synapse

1 Introduction

Digital, analog and mixed-signal circuits are the possible approaches for implementing artificial neural network (ANN) hardware [1]. Although digital implementation offers higher accuracy, it is slower and consumes more power and area. Analog implementation, on the other hand, is fast and meets the criteria of minimal area; its main drawback is low accuracy due to device mismatch [2]. Beyond low area and low power consumption, analog circuits implement some mathematical operations through inherent physical processes, such as the collection of currents at a node acting as summation. In other words, by Kirchhoff's current law (KCL), the output current of a node is the sum of its input currents, which can be used to implement the summation operation in analog circuits.

One of the main challenges in analog ANN circuits is the implementation of programmable synapses, whose weight values must be adjustable through programming. Resistors, capacitors and floating gates have been used to implement synaptic weights in analog circuits [3], but each of these elements has drawbacks. A resistor's static behavior makes it impossible to change or train. A capacitor cannot store weights for long periods because of its permanent charge leakage. Floating-gate transistors are a successful alternative for synapses in analog circuits and have been used to implement Hebbian-based and STDP learning rules; low area consumption in STDP implementations is a further advantage. Their main drawbacks are their high nonlinearity, which complicates weight multiplication, and the extra circuitry they require to drive tunneling and hot-electron injection when adjusting the synaptic weight [4]. The emergence of the memristor as a nonvolatile resistor opens a new vision for neural network hardware implementation. The memristor was postulated by Chua in 1971 [5], and in 2008 this missing element was realized at HP Labs [6]. The memristor, or memory resistor, is a two-port element whose memristance is proportional to the amount and polarity of the current that has passed through it. Several memristor-based neural network circuits have been presented in the literature since its discovery, which highlights the impact of the memristor's emergence on neural network hardware implementation. In [7] the similarity between the memristor and a biological synapse was investigated. In [8] a cellular neural network (CNN) based on memristors was implemented, and in [9] a hybrid CMOS-memristor CNN was designed. The memristor bridge circuit presented in [10] comprises four memristors and three transistors; it can produce negative, zero and positive synaptic weights and is used in this work as the synapse of the proposed modular neuron.

Corresponding author: Amirali Amirsoleimani, [email protected]
1 Iran University of Science and Technology, Tehran, Iran
2 University of Windsor, Windsor, Canada

Neural Comput & Applic
DOI 10.1007/s00521-015-2047-0

Moreover, in the distributed neurons introduced in [11] there is an activation function for each input signal rather than one activation function for all of a neuron's inputs. Each activation function is implemented by a nonlinear load, and the outputs are connected together to form a multi-input neuron. Modularity is one of the significant features of distributed neurons [11, 12]; since a neural network is built by replicating the basic elements of neuron and synapse, modular neurons are essential for neural network hardware.

In this paper, by combining the benefits of the memristor bridge circuit and the distributed neuron, a modular neuron comprising a memristor-based synapse is proposed. The memristor bridge circuit is used for the synaptic operation, and the distributed neuron structure makes the circuit modular. The paper is organized as follows. Memristor theory and its basic relations are covered in Sect. 2. The proposed circuit and its HSPICE® simulations are presented in Sect. 3. In Sect. 4, the off-chip method is used to train a neural network based on the proposed modular neuron, and its simulation results are presented. Concluding remarks follow in Sect. 5.

2 Memristor

The memristor is a metal–insulator–metal (MIM) structure. The HP memristor consists of two platinum (Pt) electrodes with a titanium dioxide (TiO2) layer between them. One side of the insulator layer is doped and consists of oxygen-deficient titanium dioxide (TiO2−x); this side has higher conductivity than the other. The linear model of the HP memristor structure is shown in Fig. 1a. The internal state variable x of the memristor marks the position of the boundary between the doped and undoped zones; as x decreases, the conductance of the channel decreases proportionally.

The memristor establishes a relationship between flux and charge:

M(q) = dφ/dq        (1)

where M(q) is the memristance of the memristor. When the charge–flux relation is flux-controlled, the equation becomes

W(φ) = dq/dφ        (2)

where W(φ) is the memductance of the memristor. Therefore, the current–voltage relationship is determined by

V = M(q) · I        (3)
I = W(φ) · V        (4)

In [13] a linear boundary drift model is introduced by applying a window function. In this model, the velocity of the boundary between the doped and undoped zones of the TiO2 thin film is given by

Fig. 1 a HP's Pt/TiO2/Pt memristor structure for the linear model in [13]. The thin-film length is D, and the position of the boundary region along the TiO2 film is x. The resistance of the doped region (TiO2−x) is represented by an equivalent resistor RON and that of the undoped region (TiO2) by an equivalent resistor ROFF. b The Pt/TiO2/Pt memristor structure for the physical model in [21]. The tunneling distance is denoted w

v_D = (η·μ_D·R_ON / D) · i(t) · f(x)        (5)

where η and μ_D are the polarity of the memristor and the mobility of dopants, respectively, and D and R_ON are the thickness and the lowest resistance of the memristor device. The current i(t) is the current passing through the memristor, and the window function f(x) introduces nonlinearity into the memristor's behavior [14]. The window function is

f(x) = 1 − (2x/D − 1)^(2p)        (6)

where p is a parameter that sets the nonlinearity of the memristor model. The memristance of the memristor is defined by

M(x) = R_ON·(x/D) + R_OFF·(1 − x/D)        (7)
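As a concrete illustration, Eqs. (5)–(7) can be integrated with a simple forward-Euler loop. The sketch below is not from the paper: the drive current and time step are illustrative, while the device parameters follow the Fig. 2 caption.

```python
import math

# Sketch of the linear boundary drift memristor model (Eqs. 5-7),
# integrated with forward Euler. Drive and time step are illustrative;
# D, R_on, R_off, mu_D, p follow the Fig. 2 caption.
def simulate_linear_drift(i_of_t, dt, steps,
                          D=3e-9, R_on=100.0, R_off=200e3,
                          mu_D=1e-14, eta=1, p=2, x0=None):
    """Return lists of boundary position x(t) and memristance M(t)."""
    x = D / 2 if x0 is None else x0                # boundary position (m)
    xs, Ms = [], []
    for n in range(steps):
        i = i_of_t(n * dt)                         # device current (A)
        f = 1.0 - (2.0 * x / D - 1.0) ** (2 * p)   # window function, Eq. (6)
        dxdt = eta * mu_D * R_on / D * i * f       # boundary velocity, Eq. (5)
        x = min(max(x + dxdt * dt, 0.0), D)        # keep x inside [0, D]
        M = R_on * (x / D) + R_off * (1.0 - x / D) # memristance, Eq. (7)
        xs.append(x)
        Ms.append(M)
    return xs, Ms

# Example: sinusoidal current drive at 1 kHz (illustrative amplitude)
xs, Ms = simulate_linear_drift(lambda t: 1e-4 * math.sin(2 * math.pi * 1e3 * t),
                               dt=1e-6, steps=2000)
```

Note that the window function vanishes at both film boundaries, which is what keeps the state x pinned inside [0, D].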

Although the model above captures the basic characteristics of a memristor device, it cannot reproduce real device behavior [20]. In [21] a model based on the tunneling phenomenon is presented, with the structure shown in Fig. 1b. While this model agrees more accurately with experimental data extracted from a real memristor device, it is computationally complicated. A simplified model based on the same physical assumptions is presented in [22]; it considers a threshold for the memristor device. In this model the state variable is defined by

dw(t)/dt = k_OFF·(i(t)/i_OFF − 1)^α_OFF · f_OFF(w)   for 0 < i_OFF < i
dw(t)/dt = 0                                         for i_ON < i < i_OFF
dw(t)/dt = k_ON·(i(t)/i_ON − 1)^α_ON · f_ON(w)       for i < i_ON < 0
                                                     (8)

where i_ON and i_OFF are the current thresholds and k_OFF, k_ON, α_OFF and α_ON are constants. The effective electric tunnel width, which serves as the internal state variable, is represented by w. The window functions f_ON and f_OFF limit the internal state variable to the [w_ON, w_OFF] interval; as in [21], to capture the asymmetric dependence on the state variable, f_ON and f_OFF may take different forms. The i–v relationship in this model is

v(t) = R_ON · e^{(λ/(w_OFF − w_ON))·(w − w_ON)} · i(t)        (9)

where λ = ln(R_OFF/R_ON) is a fitting parameter, the logarithm of the ratio of the highest-resistance state (HRS) to the lowest-resistance state (LRS) of the device. The Pt/TiO2/Pt memristor device simulated in [23] is used for simulating the memristor bridge synapse of the modular neuron. The i–v curve and memristance of the simulated memristor are shown in Fig. 2 for both the linear memristor model [13] with window function [14] and the TEAM model [22], computed in MATLAB®.
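The TEAM state update of Eqs. (8)–(9) can be sketched the same way. This is an assumption-laden sketch: the smooth window functions f_ON/f_OFF of [22] are replaced here by hard clipping of w to [w_ON, w_OFF], the constants follow the Fig. 2 caption, and i_ON is taken negative per the i < i_ON < 0 branch of Eq. (8).

```python
import math

# Sketch of the TEAM memristor model (Eqs. 8-9), forward Euler.
# Window functions replaced by hard clipping of w (simplification);
# i_on is negative per the i < i_ON < 0 branch of Eq. (8).
def team_step(w, i, dt,
              k_on=-8e-13, k_off=8e-13, a_on=3, a_off=3,
              i_on=-8.9e-6, i_off=115e-6,
              w_on=0.0, w_off=3e-9):
    """Advance the internal state variable w by one time step, Eq. (8)."""
    if i > i_off:
        dwdt = k_off * (i / i_off - 1.0) ** a_off
    elif i < i_on:
        dwdt = k_on * (i / i_on - 1.0) ** a_on
    else:
        dwdt = 0.0                          # below threshold: no state change
    return min(max(w + dwdt * dt, w_on), w_off)

def team_voltage(w, i, R_on=100.0, R_off=200e3, w_on=0.0, w_off=3e-9):
    """Device voltage for state w and current i, Eq. (9)."""
    lam = math.log(R_off / R_on)            # fitting parameter lambda
    return R_on * math.exp(lam / (w_off - w_on) * (w - w_on)) * i
```

The sub-threshold branch returning zero is what gives the TEAM device its nonvolatility: read currents between i_ON and i_OFF leave the state untouched.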

3 Proposed design

Artificial neural networks contain massive numbers of interconnected neurons. The McCulloch–Pitts model of a neuron is shown in Fig. 3a. Its main parts are: (1) synapses, which multiply their weights by the input signals x_i; (2) a summation operation, which sums the weighted input signals; and (3) an activation function, which is applied to the result of the summation. The neuron output relation is

O = Σ_{i=1}^{n} w_i·x_i + θ        (10)

where θ is the bias, w_i is the synapse weight, x_i is the input signal, and O is the output of the neuron. The equivalent block diagram of the distributed neuron structure is shown in Fig. 3b. An N-input neuron consists of N neurons with one input and one output each, with the outputs connected together [11]. Modularity is an obvious feature of the distributed neuron, which consists of N identical sub-neurons. All signals in this block diagram are voltages or currents. The synapse block multiplies the weights by the input signals, and the voltage-to-current convertor changes the weighted input voltage signals into currents; this conversion is chosen because summing currents is trivially simple to implement. At this stage the current signals are summed and then driven into a nonlinear load that behaves like an activation function.

To implement the synapse of the proposed modular neuron, the memristor bridge circuit [10] is used. This circuit resembles the famous Wheatstone bridge. As shown in Fig. 4, it is built from four memristors with predefined polarities.

Synaptic operation requires the ability to program the synaptic weights, to store them nonvolatilely, and to multiply them by the input signals. All three characteristics are fulfilled by the memristor bridge synapse. To test the programmability of this synapse circuit, an input signal is applied for a time proportional to the weight desired for the synapse; since the memristor is a history-dependent device, the allocated weight is stored in the memristor bridge circuit. The relationship between the output and input voltages of the proposed synapse circuit is given by [10]:

v_out = β · V_in        (11)

β = M2/(M1 + M2) − M4/(M3 + M4)        (12)
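Equation (12) is just the difference of two voltage dividers, which makes the attainable weight range easy to check numerically (the resistance values below are illustrative LRS/HRS figures, not measurements from the paper):

```python
# Synaptic weight of the memristor bridge, Eq. (12):
#   beta = M2/(M1+M2) - M4/(M3+M4), which lies in (-1, 1).
def bridge_weight(M1, M2, M3, M4):
    return M2 / (M1 + M2) - M4 / (M3 + M4)

R_on, R_off = 100.0, 200e3            # illustrative LRS/HRS values

# M1, M4 in LRS and M2, M3 in HRS -> weight close to +1
w_pos = bridge_weight(R_on, R_off, R_off, R_on)
# M1, M4 in HRS and M2, M3 in LRS -> weight close to -1
w_neg = bridge_weight(R_off, R_on, R_on, R_off)
# all memristances equal -> zero weight
w_zero = bridge_weight(R_on, R_on, R_on, R_on)
```

With equal memristances both dividers sit at V_in/2, so the differential output, and hence the weight, is exactly zero.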


Fig. 2 a The linear boundary drift model [13] is used to simulate the I–V curve (top) and the memristance versus voltage (bottom) of the Pt/TiO2/Pt memristor. Model parameters: D = 3 nm, Rinit = 100 kΩ, RON = 100 Ω, ROFF = 200 kΩ, μD = 10^-14 m^2 s^-1 V^-1, p = 2; the curves are for a 1 V, 1 kHz triangular input voltage. b The TEAM model [22] is used to simulate the I–V curve (top) and the memristance versus voltage (bottom). Model parameters: αON = 3, αOFF = 3, kON = −8e−13 nm/s, kOFF = 8e−13 nm/s, iON = 8.9 μA, iOFF = 115 μA; the curves are for a 3 mA, 1 Hz sinusoidal input current

Fig. 3 Neuron and synapse. a McCulloch–Pitts model. b Equivalent block diagram of the distributed neuron

where β is the synaptic weight and M1, M2, M3 and M4 are the memristances of the memristors. In the training phase, different input signals are used to program the synaptic weight, depending on the memristor model. With the linear boundary drift model, the width of the voltage pulse applied to the synapse is proportional to the desired weight, whereas with the TEAM model it is not. The memristance behavior and the weight of the proposed synapse during the programming phase are shown in Fig. 5: Fig. 5a, b shows simulations based on the linear boundary drift model and the TEAM model, respectively. As can be seen in Fig. 5, for the bridge synapse simulated with the linear model of the Pt/TiO2/Pt device used in [13], M1 and M4 switch from HRS to LRS while M2 and M3 switch from LRS to HRS; in this case a 2 V voltage pulse is applied as the input signal. In the other case, with a ramp input current of slope 0.16e−03 applied to a bridge simulated with the TEAM model, the resistive states of M2 and M3 change very little. Although the memristances of M2 and M3 barely change, the change in M1 and M4 is enough to sweep the weight (β) of the bridge from −1 to 1.

As the input current ramp is applied to the memristor bridge circuit, M1 and M4 begin to change their memristances once the current through them reaches i_ON (8.9 μA). While the current is between i_ON and i_OFF (115 μA), M1 and M4 keep changing but M2 and M3 do not; for the current to reach i_OFF, M1 and M4 must change their resistive states completely. The branch currents then exceed i_OFF, which causes M2 and M3 to change their memristances as well. This method has a drawback in the initial moments of programming with the TEAM model: as can be seen in Fig. 5b, after 2 s the voltages across M1 and M4 exceed 100 V, which is not practicable and would likely cause breakdown and device failure. To solve this problem, in order to program the memristors that satisfy

Fig. 4 Memristor bridge synapse circuit [10]

Fig. 5 a Programming of the memristor bridge synapse based on the linear boundary drift model [13] with window function [14]; the input signal is a 2 V voltage pulse. b Programming of the memristor bridge synapse based on the TEAM model [22]; the input signal is a ramp current with a slope of 0.16e−03. The upper plots show the applied input signal, the middle plots show the memristance of M1 and M4 versus M2 and M3, and the lower plots show the weight of the bridge synapse


the TEAM model, another approach is utilized. With the TEAM model, in the initial moments of programming, while the memristances of M1 and M4 start to decrease, the weight of the synapse does not change; in addition, a high voltage is needed to obtain the desired weight change at this stage, as can be seen in the red zone of Fig. 5b. Omitting this zone from the programming procedure therefore both reduces the programming time and removes the need for a considerably high programming voltage.

To omit this zone from the programming procedure, the initial memristances of all memristors are first set to RON by a RESET procedure. Then, to generate a positive or negative weight in the synapse, a positive or negative ramp current, respectively, is applied to it. The proposed programming technique is shown in Fig. 6a, b for negative and positive weight programming of the TEAM-model synapse, respectively.

In the RESET procedure, to initialize all memristor devices to RON, the same voltage is applied across all memristors: a negative voltage (−5.5 V) is applied to node A and a positive voltage (+5.5 V) to node B of the memristor bridge synapse in Fig. 4, with the input node of the synapse connected to ground. In this way 5.5 V DC is applied across each of the four memristors. The RESET procedure is performed just once, as the first step of each programming procedure. Figure 7 shows the RESET procedure for TEAM-model devices with different initial memristances; all devices reach RON by the end of the applied DC voltage.

In the operation phase, a narrow pulse is applied to the memristor bridge synapse so that the memristor's state is not changed. Here the pulse width is taken as 10 ns, which leaves the memristor's state unchanged. Under such a narrow pulse the memristor acts as a resistor, so in the operation phase a resistor can be used in place of each memristor.

The operation of the memristor bridge synapse is simulated for five different input signals in Fig. 8. The linear relationship between the output and input voltages of the synapse can be seen for the different weights.

Considering the neuron structure in Fig. 3b, after the synaptic weight is multiplied by the input signal, the voltage must be converted to a current by the voltage-to-current convertor illustrated in Fig. 9. Its input is differential because the synapse output is differential. The CMOS transistor parameters in 180 nm CMOS technology, the resistances and the voltage source values are listed in Table 1. The output current is related to the input voltage by:

Fig. 6 a Programming the memristor bridge synapse to a negative weight with the proposed technique, using the TEAM model [22]. b Programming the memristor bridge synapse to a positive weight with the proposed technique, using the TEAM model [22]. The input signal is a ramp current with a slope of 0.16e−03. The dashed green curves in the top plots show the applied input current; the middle plots show the memristance of M1 and M4 versus M2 and M3; the lower plots show the weight of the bridge synapse (color figure online)


I_out = (g_m / (1 + g_m·R)) · v_in        (13)

where R is the resistance at the sources of Q1 and Q2 and g_m is the transconductance of Q1 and Q2. A nonlinear load is used as the output load; the combination of the voltage-to-current convertor and the nonlinear load serves as an activation function.

The activation function is one of the most important ingredients of a neural network: it defines the output of the neuron for a given weighted input. After the signal conversion, the nonlinear load driven by this current acts as the activation function. Several activation functions are used in neurons, such as the hyperbolic tangent, the step function and the sigmoid; the sigmoid and hyperbolic tangent are S-shaped nonlinear functions. In [12] a nonlinear load is used to implement the activation function of a distributed neuron; it comprises two PMOS/NMOS transistors and one resistor and requires four bias voltages. In [15] the nonlinear load of [12] was revised, replacing two resistors with two transistors; it consumes less power and needs three bias voltages to implement a sigmoid function. In [16] the number of sources is decreased to one for a sigmoid activation function. Here a novel nonlinear load is used to implement a function similar to the hyperbolic tangent; the circuit is shown in Fig. 10, and its transistor parameters for 180 nm CMOS technology are given in Table 2. The proposed nonlinear load is fed by its input current, which is the output of the voltage-to-current convertor unit, so it requires no bias voltage. The proposed neuron operates in the four regions summarized in Table 3; the current in each region is determined by the operation of the transistors in that region.

In region III, a low positive input current produces a low positive output voltage; as a result, Q6 and Q7 are in the triode and off regions, respectively. The relation between current and output voltage in this region is:

Fig. 7 RESET procedure for the Pt/TiO2/Pt device simulated by the TEAM model [23] with different initial memristances (Rinit = ROFF, 0.9·ROFF, 0.8·ROFF, 0.7·ROFF and 0.6·ROFF)

Fig. 8 Vout versus Vin of the memristor bridge synapse for synaptic weights of 1, 0.5, 0, −0.5 and −1; for each input voltage a narrow pulse (10 ns wide) is used in the operation phase

Fig. 9 Voltage to current convertor circuit


I_in = V_out / R_T        (14)

where R_T is the resistance of Q6 in the triode region. Region II is analogous, with Q7 in triode and Q6 off, respectively.

The output voltage increases with the input current until it reaches Vtn (the threshold voltage of Q6). A further increase of the input current then moves the operation from region III to region IV, where Q6 is saturated and Q7 is off, and the current is defined as follows:

I_in = (1/2)·k_n·S_n·(V_out − V_tn)^2·(1 + λ·V_out)        (15)

where S_n is defined as (W6/L6).

Decreasing the input current to a negative value (reversing the current direction) until the output voltage reaches Vtp (the threshold voltage of Q7) moves the circuit from region II to region I. In this region Q7 is saturated and Q6 is off, and the current is defined as follows:

I_in = (1/2)·k_p·S_p·(V_out − V_tp)^2·(1 + λ·V_out)        (16)

where S_p is defined as (W7/L7).

To solve (14)–(16) for V_out, the channel-length modulation effect is ignored (λ = 0) for ease of calculation. The following equations give the output voltage in each region [Eqs. (17)–(19) correspond to regions I, II (or III) and IV, respectively]:

V_out = V_tp + √(2·I_in/(k_p·S_p))        (17)

V_out = R_T · I_in        (18)

V_out = V_tn + √(2·I_in/(k_n·S_n))        (19)

The output voltage of the activation function versus the input current is simulated in Fig. 11.
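A piecewise sketch of the nonlinear load's transfer characteristic, per Table 3 and Eqs. (17)–(19) with λ = 0, is given below. The threshold voltages, R_T and the kS products are illustrative placeholders, the operating region is selected from the triode estimate, and in region I the square root is taken of the current magnitude so that V_out falls below V_tp for a negative input current.

```python
import math

# Piecewise activation of the nonlinear load (Table 3, Eqs. 17-19),
# channel-length modulation ignored (lambda = 0).
# All parameter values are illustrative placeholders.
V_tn, V_tp = 0.4, -0.4          # NMOS / PMOS threshold voltages (V)
R_T = 4e4                       # triode resistance of Q6/Q7 (ohm)
kn_Sn = 5e-4                    # k_n * (W6/L6)
kp_Sp = 5e-4                    # k_p * (W7/L7)

def activation(i_in):
    """Output voltage of the nonlinear load for input current i_in."""
    v_lin = R_T * i_in                       # regions II/III, Eq. (18)
    if V_tp < v_lin < V_tn:
        return v_lin                         # triode estimate picks the region
    if i_in > 0:                             # region IV: Q6 saturated, Eq. (19)
        return V_tn + math.sqrt(2 * i_in / kn_Sn)
    # region I: Q7 saturated, Eq. (17) with |I_in| so V_out < V_tp
    return V_tp - math.sqrt(2 * abs(i_in) / kp_Sp)
```

The linear middle segment and square-root outer branches give the S-shaped, tanh-like characteristic the paper describes.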

As can be seen in Fig. 12, connecting the units described above yields a modular neuron comprising a memristor-based synapse. The input voltage is weighted by the memristor bridge synapse and transformed into a current by the voltage-to-current convertor; the resulting current then drives the nonlinear load. Figure 13 shows the output of the proposed neuron for a pulsed 0.5 V input voltage of 10 ns width with zero, negative and positive weights. This

Table 1 Parameters of the transistors, resistors and voltage sources of the voltage-to-current convertor circuit

Transistor sizes W/L (μm): W1/L1 = 1.5/0.2, W2/L2 = 1.5/0.2, W3/L3 = 1.7/0.3, W4/L4 = 1.7/0.3, W5/L5 = 9/0.5
Resistors (kΩ): RS1 = 2, RS2 = 2
Voltage sources (V): VDD = 0.9, VSS = −0.9

Fig. 10 Proposed passive nonlinear load

Table 2 Transistor parameters of the nonlinear load

W6/L6 = 4.5/0.18 μm, W7/L7 = 10/0.18 μm

Table 3 Operation regions of the neuron circuit

Region   Vout               Q6           Q7
I        Vout < Vtp         OFF          Saturation
II       Vtp < Vout < 0     OFF          Triode
III      0 < Vout < Vtn     Triode       OFF
IV       Vtn < Vout         Saturation   OFF

Fig. 11 Nonlinear load I–V curve as an activation function


neuron has one input and one output. Modularity is one of the main features of this type of neuron: to create a multiple-input, one-output neuron, the modular neuron is repeated in parallel and the outputs of the copies are connected together, producing a distributed neuron. The output current, the summation of all the modular neurons' output currents, is applied to each nonlinear load. It is determined by:

I_SUM = Σ_{k=1}^{N} (g_m/(1 + g_m·R)) · w_k · v_in,k        (20)

where I_SUM is the collected current at the output node, N is the number of modular neurons connected together, w_k is the synapse weight and v_in,k is the input voltage of the kth modular neuron.
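Equation (20) amounts to a weighted sum scaled by the convertor gain of Eq. (13); a minimal sketch follows (g_m and R are illustrative values, not the Table 1 design values):

```python
# Sketch of the distributed-neuron output current, Eq. (20):
#   I_SUM = sum_k gm/(1 + gm*R) * w_k * v_in_k
# gm and R are illustrative placeholder values.
def i_sum(weights, v_inputs, gm=1e-3, R=2e3):
    g_eff = gm / (1.0 + gm * R)      # V-to-I gain of each convertor, Eq. (13)
    return sum(g_eff * w * v for w, v in zip(weights, v_inputs))

# Example: a 4-input distributed neuron
current = i_sum([0.5, -1.0, 0.25, 0.0], [0.5, 0.5, -0.5, 0.5])
```

Because every modular neuron shares the same convertor gain, the summed node current is simply the dot product of weights and inputs scaled by that gain, which is exactly the KCL-based summation the introduction motivates.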

4 Training and simulation results

To test the operation of the proposed neuron, it is used in a real neural network application. A data set that models psychological experimental results is used as a classification problem to test applicability in real neural networks [17]. The data set has three classes and four attributes: each sample is classified as having the balance scale tip to the right, tip to the left, or balanced, and the attributes are the left weight, the left distance, the right weight and the right distance. The 625 samples are divided into 500 training samples and 125 test samples.

The proposed neural network structure has four inputs, five neurons in the hidden layer and three neurons in the output layer, as shown in Fig. 14. Each neuron is a distributed-structure modular neuron, so each neuron contains N modular neurons, where N is its number of inputs. This gives 20 modular neurons in the hidden layer and 15 in the output layer.

There are three methods for training a hardware neural network (HNN): on-chip, off-chip and chip-in-the-loop [18]. In on-chip training, extra circuitry on the chip calculates and updates the synaptic weights; its drawback is the area consumed by circuits that are needed only during the infrequent training phase. In the off-chip method, all training calculations are performed on a computer and the final calculated weights are downloaded to the chip.

Fig. 12 Proposed modular neuron

Fig. 13 Output of the modular neuron simulated for a 1 V input pulse of 10 ns duration, with synapse weights from −1 to 1 in steps of 0.25 (positive, zero and negative values)


Because computer calculations are far more precise than the analog hardware, the chip then performs worse than expected. In the chip-in-the-loop method, initial weights are downloaded to the synapses on the chip and the outputs of the on-chip neural network are returned to the computer, where the synaptic weights are recalculated and updated based on the achieved chip outputs. This procedure is repeated until the expected performance of the on-chip neural network is achieved.

For training the neural network, the off-chip method is applied: the whole training procedure is performed on the host computer, using the training data set. As mentioned, chip performance degradation is the drawback of off-chip training. To overcome this problem, a training method based on modified chip-in-loop training [3] and HSPICE–MATLAB co-simulation [19] is used here. Modified chip-in-loop training has three main steps: (1) the neural network is trained on the computer and the synaptic weights are downloaded to the chip, (2) the outputs of all neurons are captured from the chip and stored in computer memory, and (3) each layer of the neural network is trained separately in chip-in-loop fashion (using the host computer and the chip). The major drawback of this method is that the outputs of all of the chip's neurons must be accessible, which needs special consideration when designing the chip. This algorithm is used in this work, but HSPICE® simulation results are used instead of chip outputs; in other words, HSPICE–MATLAB co-simulation is used to train the neural network.
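The three steps above, with the HSPICE® simulation standing in for the chip, form a loop that can be sketched as follows. Every callable below is a placeholder for an actual MATLAB/HSPICE routine, which the paper does not spell out:

```python
def modified_chip_in_loop(train_on_computer, map_weights,
                          simulate_chip, train_layers, good_enough,
                          max_epochs=50):
    """Skeleton of the modified chip-in-loop procedure.

    Placeholder callables for the MATLAB/HSPICE routines:
      train_on_computer()       -- initial back-propagation training
      map_weights(w)            -- scale weights into [-1, 1]
      simulate_chip(w)          -- HSPICE run; returns all neuron outputs
      train_layers(w, outputs)  -- layer-by-layer back propagation
      good_enough(outputs)      -- expected-performance check
    """
    weights = map_weights(train_on_computer())                 # step 1
    for epoch in range(1, max_epochs + 1):
        outputs = simulate_chip(weights)                       # step 2
        if good_enough(outputs):
            return weights, epoch
        weights = map_weights(train_layers(weights, outputs))  # step 3
    return weights, max_epochs
```

In the paper's setup, `good_enough` corresponds to the expected-performance check of Fig. 15.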

In the first epoch, the neural network is trained in MATLAB® using the back-propagation algorithm. The synaptic weight values are then mapped to the [-1, 1] range so that they can be applied to the memristor bridge synapse circuits. An HSPICE® simulation is performed, and the outputs of all neurons of the network are returned to MATLAB®. After this step, in chip-in-loop fashion, the back-propagation algorithm is applied to each layer of the network separately; the recalculated synaptic weights are mapped to [-1, 1] and applied to the memristor bridge synapses again. This procedure is repeated in each epoch until the expected performance of the chip is achieved. Figure 15 displays the steps of the training method. Since the whole training procedure is performed on the computer, this can be considered an off-chip training method. Using this method, the proposed neural network is trained in thirty-eight epochs.
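The exact form of the [-1, 1] mapping is not specified; a minimal sketch, assuming a simple normalization by the largest absolute weight (one of several reasonable choices):

```python
def map_to_unit_interval(weights):
    """Scale synaptic weights into [-1, 1] by dividing by the largest
    absolute value (an assumed normalization; the paper does not give
    the exact mapping)."""
    peak = max(abs(w) for w in weights)
    if peak == 0:
        return list(weights)
    return [w / peak for w in weights]

print(map_to_unit_interval([0.5, -2.0, 1.0]))  # -> [0.25, -1.0, 0.5]
```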

After training is complete, the resulting weights are applied to the proposed neural network modeled in HSPICE®, and the test data set is used. To analyze the network's performance, the outputs extracted from the HSPICE® simulation are fed back to MATLAB® and compared with the expected outputs in the data set. The mean square error (MSE) of the simulation is 0.0695, and the recognition rate is 86.7 %. A summary of the proposed neural network and its simulation results is given in Table 4.
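Both reported figures follow the usual definitions; a sketch of how they can be computed from the simulated outputs once they are back in the host environment (the one-hot vectors below are toy values, not the paper's data):

```python
def evaluate(outputs, targets):
    """Mean square error over all output values, and the fraction of
    samples whose largest output picks the correct class."""
    n_values = sum(len(t) for t in targets)
    mse = sum((o - t) ** 2
              for out, tgt in zip(outputs, targets)
              for o, t in zip(out, tgt)) / n_values
    hits = sum(out.index(max(out)) == tgt.index(max(tgt))
               for out, tgt in zip(outputs, targets))
    return mse, hits / len(targets)

# two toy samples: the first classified correctly, the second not
mse, rate = evaluate([[0.9, 0.1, 0.0], [0.2, 0.7, 0.1]],
                     [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
print(round(mse, 4), rate)  # -> 0.2267 0.5
```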

Fig. 14 ANN architecture for classification (four inputs, five hidden neurons N1–N5 and three output neurons N6–N8 corresponding to the Right, Balanced and Left classes)

Fig. 15 Training flowchart of the HSPICE–MATLAB co-simulation: train using the back-propagation algorithm in MATLAB; map the weights to [-1, 1], apply them to the memristor bridge synapses and simulate in HSPICE; return the simulation results to MATLAB; run the back-propagation algorithm in MATLAB for each layer separately; repeat until the expected performance is reached, then training is complete


5 Conclusion

In this paper, a modular neuron is designed using a memristor-based synapse and applied in a real neural network application. The proposed neural cell includes three main units: a memristor bridge synapse, a voltage-to-current converter and a nonlinear load. The memristor bridge synapse enables the neuron to produce negative, zero and positive weights. The voltage-to-current converter changes the weighted voltage signals to currents, which are then summed and passed through the nonlinear load, which acts as the activation function. The off-chip learning method is used to test the applicability of the proposed neuron in a real neural network. The results show an 86.7 % recognition rate and a mean square error of 0.0695.
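The sign-changing weight comes from the voltage-divider structure of the memristor bridge synapse [3, 10]: with memristances M1–M4, the output is Vout = (M2/(M1+M2) - M4/(M3+M4)) · Vin, so the effective weight lies in (-1, 1). A small sketch of this relation (the resistance values are illustrative only):

```python
def bridge_weight(m1, m2, m3, m4):
    """Effective synaptic weight of the memristor bridge: the
    difference of the two voltage dividers formed by (M1, M2)
    and (M3, M4).  Positive, zero and negative weights are all
    reachable by programming the memristances."""
    return m2 / (m1 + m2) - m4 / (m3 + m4)

# equal dividers give zero weight; skewed dividers give +/- weights
print(bridge_weight(10e3, 10e3, 10e3, 10e3))  # -> 0.0
print(bridge_weight(10e3, 30e3, 30e3, 10e3))  # -> 0.5
```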

References

1. Misra J, Saha I (2010) Artificial neural networks in hardware: a

survey of two decades of progress. Neurocomputing 74:239–255

2. Eberhardt S, Duong T, Thakoor A (1989) Design of parallel

hardware neural network systems from custom analog VLSI

‘building block’ chips. In: Conference on neural networks 1989

IJCNN international joint

3. Adhikari SP, Yang C, Kim H, Chua LO (2012) Memristor bridge

synapse-based neural network and its learning. IEEE Trans

Neural Netw Learn Syst 23(9):1426–1435

4. Rahimi Azghadi M, Iannella N, Al-Sarawi SF, Indiveri G, Abbott

G (2014) Spike-based synaptic plasticity in silicon: design,

implementation, application, and challenges. In: Proceedings of

the IEEE, vol 102(5), pp 717–737

5. Chua LO (1971) Memristor—the missing circuit element. IEEE

Trans Circuit Theory 18(5):507–519

6. Strukov DB, Snider GS, Stewart DR, Williams RS (2008) The

missing memristor found. Nature 453:80–83

7. Jo SH, Chang T, Ebong I, Bhadviya BB, Mazumder P, Lu W

(2010) Nanoscale memristor device as synapse in neuromorphic

systems. Nano Lett 10:1297–1301

8. Lehtonen E, Laiho M (2010) CNN using memristors for neigh-

borhood connections. In: 12th international workshop on cellular

nanoscale networks and their applications CNNA 2010, pp 1–4

9. Laiho M, Lehtonen E (2010) Cellular nanoscale network cell with

memristors for local implication logic and synapses. In: Pro-

ceedings of 2010 IEEE international symposium on circuits and

systems ISCAS, pp 2051–2054

10. Kim H, Sah MP, Yang C, Roska T, Chua LO (2012) Memristor

bridge synapses. Proc IEEE 100(6):2061–2070

11. Satyanarayana S, Tsividis Y, Graf HP (1989) Analogue neural

networks with distributed neurons. Electron Lett 25(5):302

12. Satyanarayana S, Tsividis YP, Graf HP (1989) A reconfigurable

VLSI neural network. In: 1989 Proceedings of international

conference on wafer scale integration, vol 27, pp 67–81

13. Biolek Z, Biolek D, Biolkova V (2009) SPICE model of mem-

ristor with nonlinear dopant drift. Radioengineering 18:210–214

14. Joglekar YN, Wolf SJ (2008) The elusive memristor: signatures

in basic electrical circuits. Physics 30(4):1–22

15. Djahanshahi H, Ahmadi M, Jullien GA, Miller WC (1996) A

unified synapse-neuron building block for hybrid VLSI neural

networks. In: 1996 IEEE international symposium on circuits and

systems. Circuits and systems connecting the world. ISCAS 96,

vol 3, pp 483–486

16. Khodabandehloo G, Mirhassani M, Ahmadi M (2012) Analog implementation of a novel resistive-type sigmoidal neuron. IEEE Trans Very Large Scale Integr (VLSI) Syst 20(4):750–754

17. Blake CL, Merz CJ (1998) UCI machine learning repository.

University of California, Irvine, School of Information and

Computer Sciences [Online]. http://archive.ics.uci.edu/ml

18. Draghici S (2000) Neural networks in analog hardware—design

and implementation issues. Int J Neural Syst 10:19–42

19. Aggarwal A, Hamilton B (2012) Training artificial neural networks with memristive synapses: HSPICE-MATLAB co-simulation. In: 11th symposium on neural network applications in electrical engineering, pp 101–106

20. Linn E, Siemon A, Waser R, Menzel S (2014) Applicability of well-established memristive models for simulations of resistive switching devices. IEEE Trans Circuits Syst I 20(4):750–754

21. Pickett MD, Strukov DB, Borghetti JL, Yang JJ, Snider GS,

Stewart DR, Williams RS (2009) Switching dynamics in titanium

dioxide memristive devices. J Appl Phys 106(7):074508

22. Kvatinsky S, Friedman EG, Kolodny A, Weiser UC (2013)

TEAM: threshold adaptive memristor model. IEEE Trans Cir-

cuits Syst I Regul Papers 60(1):211–221

23. Kvatinsky S, Talisveyberg K, Fliter D, Friedman EG, Kolodny A,

Weiser UC (2012) Models of memristors for SPICE simulations.

In: Proceedings of the IEEE convention of electrical and elec-

tronics engineers in Israel, pp 1–5

Table 4 Simulation results of the trained ANN in HSPICE®

Features                    Value
Number of modular neurons   45
Mean square error           0.0695
Correct classification      86.7 %
Number of training epochs   38
