
[IEEE 2010 International Conference on Artificial Intelligence and Computational Intelligence (AICI) - Sanya, China (2010.10.23-2010.10.24)]

Quantum Self-Organization Feature Mapping Networks with Application

Panchi Li School of Computer & Information Technology

Northeast Petroleum University Daqing, Heilongjiang, China

E-mail: [email protected]

Kaoping Song, Erlong Yang School of Petroleum Engineering Northeast Petroleum University

Daqing, Heilongjiang, China E-mail: [email protected]

Abstract—A quantum self-organization feature mapping network model based on quantum neurons is presented in this paper. Both the inputs and the weights of the model are represented by quantum bits, and the outputs are represented by real numbers. The model is composed of an input layer and a competitive layer. First, the samples are transformed into quantum states and submitted to the input layer, and the similarity coefficients between the input quantum states and the weights are computed. Secondly, the implicit pattern characteristics of the clustering samples are extracted in the competitive layer, and the clustering results are shown. The quantum states of the weights are updated by quantum rotation gates. The network is trained by an algorithm that combines unsupervised and supervised learning. Finally, two experiments demonstrate that the model and algorithm are evidently superior to the general self-organization feature mapping network.

Keywords- quantum computing; quantum neuron; quantum self-organization networks; quantum clustering algorithm

I. INTRODUCTION

Quantum computation is a novel interdisciplinary subject that combines quantum mechanics and information science. Since P.W. Shor gave the first quantum algorithm for factoring very large integers in 1994 and L.K. Grover proposed a quantum algorithm for searching a marked state in an unordered list in 1996, quantum computation has attracted wide attention and become a challenging research area. Fuzzy logic, evolutionary computation, and neural networks are regarded as the three most promising areas of artificial intelligence; together they compose intelligent computation (soft computing) and share many similarities with quantum computation. Their amalgamation therefore promises fruitful lines of theoretical research. At present, an elementary amalgamation of quantum computation and evolutionary computation already exists and has produced valuable results [1-3]; however, as yet there is little understanding of the essential components of artificial neural networks (ANNs) based on quantum-theoretical concepts and techniques. Quantum perceptrons were proposed by Lewestein [4], where a unitary operator, instead of classical weights, maps inputs to outputs; during training the unitary operator is adjusted to find the correct mapping. The use of quantum-theoretical techniques in ANNs for cognitive modeling has also been suggested [5], but these are predominantly theoretical proposals arguing the need for quantum neural computing. Quantum learning was proposed by Chrisley; while he outlines practical concerns for implementing such a process, he does not test the success of quantum learning by simulation. Menneer and Narayanan [6,7] extended Chrisley's design in their proposal for single-pattern quantum neural networks. Recently, Ezhov [8] argued that current approaches to quantum neural networks based on such single-pattern networks are consistent with the Everett parallel-universe interpretation of quantum mechanics. Overall, it appears that, apart from Menneer and Narayanan [6,7], no experimentation or simulation has yet been done on quantum neural networks to determine their architecture and function, or even their power over their classical counterparts [9-11].

The research presented in this paper (1) constructs a quantum neuron and a quantum self-organization feature mapping neural network model, (2) proposes a training algorithm for this model, and (3) designs two experiments whose results demonstrate that the model and the algorithm are efficient.

II. QUANTUM SELF-ORGANIZATION NETWORKS

A. Quantum Neuron Model

A neuron can be described as a four-tuple of inputs, weights, transform function, and outputs, where the inputs and outputs are the outer attributes of the neuron, and the weights and transform function are its inner attributes. Therefore, different neuron models can be constructed by modifying the types of the weights and the transform function. According to this viewpoint, for the quantum neuron proposed in this paper, the inputs and the weights are represented by qubits, and the transform function is represented by a linear operator. At the same time, the difference from the traditional neuron is that the quantum neuron carries a group of single-bit quantum gates that modify the phases of the weight qubits. The quantum neuron model is presented in Fig. 1.

In the quantum neuron model, the inputs and the weights are represented by the qubits |x_i⟩ and |φ_i⟩, respectively. A qubit can be described as follows:

|φ_i⟩ = α_i|0⟩ + β_i|1⟩, (1)

2010 International Conference on Artificial Intelligence and Computational Intelligence

978-0-7695-4225-6/10 $26.00 © 2010 IEEE

DOI 10.1109/AICI.2010.61



where α_i and β_i are complex numbers; |α_i|² and |β_i|² give the probabilities that the qubit |φ_i⟩ will be found in the basis state |0⟩ or |1⟩, respectively, and they satisfy the normalization condition

|α_i|² + |β_i|² = 1. (2)

The numbers α_i and β_i satisfying (2) are called the probability amplitudes of the corresponding basis states. Therefore, a qubit can also be described by its probability amplitudes as [α_i, β_i]^T.
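For intuition, a qubit satisfying (1) and (2) can be simulated classically by its probability-amplitude pair [α_i, β_i]; with real amplitudes parameterized by an angle, the normalization condition holds automatically. A minimal sketch (the helper name is illustrative):

```python
import math

def qubit(theta):
    """Represent a qubit by real probability amplitudes [cos(theta), sin(theta)]."""
    return [math.cos(theta), math.sin(theta)]

q = qubit(math.pi / 6)
# Normalization condition (2): |alpha|^2 + |beta|^2 = 1
assert abs(q[0] ** 2 + q[1] ** 2 - 1.0) < 1e-12
```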

Figure 1. Quantum Neuron Model.

Let |X⟩ = [|x₁⟩, |x₂⟩, …, |x_n⟩]^T = [[α_{x_1}, β_{x_1}]^T, [α_{x_2}, β_{x_2}]^T, …, [α_{x_n}, β_{x_n}]^T]^T and |W⟩ = [|w₁⟩, |w₂⟩, …, |w_n⟩]^T = [[α_{w_1}, β_{w_1}]^T, [α_{w_2}, β_{w_2}]^T, …, [α_{w_n}, β_{w_n}]^T]^T. The in-out relation of the quantum neuron can be described as follows:

y = f(⟨W|X⟩) = f( ∑_{i=1}^{n} ⟨w_i|x_i⟩ ) = f( ∑_{i=1}^{n} (α_{w_i} α_{x_i} + β_{w_i} β_{x_i}) ), (3)

where f is a linear operator. This operator, as the transform function, transforms the inputs of the quantum neuron into a real number. U_i is a quantum rotation gate that modifies the phase of |w_i⟩.
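With qubits stored as amplitude pairs, the in-out relation (3) is just a sum of pairwise inner products passed through the linear operator f. A minimal sketch, assuming f is the identity and the names below are illustrative:

```python
def neuron_output(inputs, weights, f=lambda v: v):
    """Eq. (3): y = f(sum_i <w_i|x_i>) with real amplitude pairs [alpha, beta]."""
    s = sum(aw * ax + bw * bx for (ax, bx), (aw, bw) in zip(inputs, weights))
    return f(s)

# If every weight qubit equals the matching input qubit, each <w_i|x_i> = 1.
x = [(1.0, 0.0), (0.0, 1.0)]
print(neuron_output(x, x))  # -> 2.0
```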

B. Quantum Self-Organization Feature Mapping Networks Model

The quantum self-organization feature mapping neural network (QSOFMNN) is a two-layer network comprising an input layer and a competition layer. The difference from the general SOFMNN is that the QSOFMNN is composed of the quantum neurons described in Fig. 1. The QSOFMNN model is shown in Fig. 2, where the input layer and the competition layer include n and m quantum neurons, respectively. The in-out relation of the QSOFMNN can be described as follows:

y_j = f(⟨W_j|X⟩) = f( ∑_{i=1}^{n} ⟨w_{ji}|x_i⟩ ), (4)

where i = 1, 2, …, n and j = 1, 2, …, m.

Figure 2. Quantum Self-Organization Feature Mapping Neural Networks.

III. THE QSOFMNN CLUSTERING ALGORITHM

Suppose the clustering sample set is {X₁, X₂, …, X_P}, where X_k = [x₁^k, x₂^k, …, x_n^k]^T, k = 1, 2, …, P. All samples come from d different patterns. Let M_j (j = 1, 2, …, d) be the sample set belonging to pattern j, and let D_j (j = 1, 2, …, d) be the set of serial numbers of the victorious competition neurons for the samples in M_j.

A. The Quantum Description of the Clustering Samples

For a clustering sample described in the n-dimensional Euclidean space, X = [x₁, x₂, …, x_n]^T, the transform formula is defined as follows:

|X⟩ = [|x₁⟩, |x₂⟩, …, |x_n⟩]^T, (5)

where |x_i⟩ = cos( 2π / (1 + e^{−x_i}) )|0⟩ + sin( 2π / (1 + e^{−x_i}) )|1⟩.

Applying (5), the clustering samples can be transformed from real vectors to quantum states.
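The transform (5) can be sketched as follows: each real component x_i determines the angle 2π/(1 + e^{−x_i}), whose cosine and sine become the amplitudes of the corresponding qubit (the function name is illustrative):

```python
import math

def to_quantum_state(x):
    """Eq. (5): map a real vector to amplitude pairs via theta_i = 2*pi / (1 + exp(-x_i))."""
    return [(math.cos(2 * math.pi / (1 + math.exp(-xi))),
             math.sin(2 * math.pi / (1 + math.exp(-xi)))) for xi in x]

state = to_quantum_state([0.0, 1.5, -2.0])
# Each pair is automatically normalized: cos^2 + sin^2 = 1.
assert all(abs(a * a + b * b - 1.0) < 1e-12 for a, b in state)
```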

B. Competition Learning Rules

Definition: Suppose |X⟩ = [|x₁⟩, |x₂⟩, …, |x_n⟩]^T and |Y⟩ = [|y₁⟩, |y₂⟩, …, |y_n⟩]^T are n-dimensional quantum state vectors. The similarity coefficient between |X⟩ and |Y⟩ is defined as

r = ⟨X|Y⟩ = ∑_{i=1}^{n} ⟨x_i|y_i⟩ / (⟨x_i|x_i⟩⟨y_i|y_i⟩).

According to this definition, the similarity coefficient between the k-th input sample |X_k⟩ and the j-th weight |W_j⟩ can be described as follows:


r_{kj} = ⟨X_k|W_j⟩ = ∑_{i=1}^{n} ⟨x_{ki}|w_{ji}⟩ / (⟨x_{ki}|x_{ki}⟩⟨w_{ji}|w_{ji}⟩). (6)

For the k-th input sample |X_k⟩, if the node j* in the competition layer is the victorious neuron, namely,

r_{kj*} = max_{j∈{1,2,…,m}} { r_{kj} }, (7)

then |W_{j*}⟩ is updated and made to close up to |X_k⟩. Hence, the node j* can represent the pattern of |X_k⟩.
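Because states produced by (5) are normalized, the denominators in (6) equal 1 and the similarity coefficient reduces to a sum of pairwise inner products; the competition (7) is then an arg-max over the competition-layer neurons. A minimal sketch with illustrative names:

```python
def similarity(x_state, w_state):
    """Eq. (6) for normalized qubits: r_kj = sum_i <x_ki|w_ji>."""
    return sum(ax * aw + bx * bw for (ax, bx), (aw, bw) in zip(x_state, w_state))

def winner(x_state, all_weights):
    """Eq. (7): index j* of the competition neuron with maximal similarity."""
    return max(range(len(all_weights)), key=lambda j: similarity(x_state, all_weights[j]))

x = [(1.0, 0.0)]
weights = [[(0.0, 1.0)], [(1.0, 0.0)]]
print(winner(x, weights))  # -> 1
```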

C. QSOFMNN Clustering Algorithm

The general SOFMNN usually adopts an unsupervised algorithm, in which the clustering results are d groups of competition nodes in the competition layer; that is, a center node together with the nodes in a certain neighborhood of it represents one pattern. To make the clustering results precise and exclusive, an updating algorithm for the quantum weights is proposed that includes two phases, unsupervised (Steps 1-6) and supervised (Steps 7-10). Applying this algorithm, each pattern corresponds to a unique victorious neuron in the competition layer.

Step 1. Initialize the quantum weights according to (8):

|w_{ji}⟩ = cos(θ_{ji})|0⟩ + sin(θ_{ji})|1⟩, (8)

where θ_{ji} = 2π × rnd, and rnd represents a random number in (0, 1).

Step 2. Initialize the maximum number of iterations Max, the initial learning rate a₀, and the initial neighborhood radius r₀. Set the current iteration number s = 0.

Step 3. Compute the learning rate and the neighborhood radius according to (9) and (10):

a(s) = a₀(1 − s/Max), (9)
r(s) = ⌈r₀(1 − s/Max)⌉. (10)

Step 4. For each sample in the training set, compute the serial number j* of the victorious neuron.

Step 5. In the neuron array of the competition layer, determine the neighborhood of neuron j*, and update the weights according to (11):

|W_j(s+1)⟩ = { [U_{j1}|w_{j1}(s)⟩, …, U_{jn}|w_{jn}(s)⟩]^T,  j ∈ ψ(j*, r(s)),
            { |W_j(s)⟩,                                     j ∉ ψ(j*, r(s)),    (11)

where ψ(j*, r(s)) denotes the neighborhood of neuron j* with radius r(s).

Here the rotation gate U_{ji} and its rotation angle θ_{ji} are

U_{ji} = [ cos(a(s)θ_{ji})  −sin(a(s)θ_{ji}) ]
         [ sin(a(s)θ_{ji})   cos(a(s)θ_{ji}) ],

θ_{ji} = sgn(α_{w_{ji}} β_{x_{ki}} − α_{x_{ki}} β_{w_{ji}}) · arccos( ⟨x_{ki}|w_{ji}⟩ / (⟨x_{ki}|x_{ki}⟩⟨w_{ji}|w_{ji}⟩) ),

where α_{x_{ki}}, β_{x_{ki}} and α_{w_{ji}}, β_{w_{ji}} are the probability amplitudes of |x_{ki}⟩ and |w_{ji}⟩, respectively.

Step 6. Is s < Max? If yes, set s = s + 1 and go to Step 3. If no, set s = 0 and go to Step 7.

Step 7. For each pattern set M_j (j = 1, 2, …, d),

determine the center sample |X_{j*}⟩ according to (12) and (13):

|X̄_j⟩ = (1/n_j) ∑_{i=1}^{n_j} |X_i⟩,  |X_i⟩ ∈ M_j,  n_j = |M_j|, (12)

⟨X_{j*}|X̄_j⟩ = max_{i∈{1,2,…,n_j}} ⟨X_i|X̄_j⟩. (13)

Step 8. Compute the learning rate by (14):

a(s) = a₀(1 − s/Max). (14)

Step 9. For each pattern set M_j (j = 1, 2, …, d), let d_j* be the serial number of the victorious neuron corresponding to the center sample |X_{j*}⟩, and update the weights according to (15), where θ is a certain small positive number:

|W_i(s+1)⟩ = { [U⁺_{i1}|w_{i1}(s)⟩, …, U⁺_{in}|w_{in}(s)⟩]^T,  i = d_j*, |X_{j*}⟩ ∈ M_j;
            { [U⁻_{i1}|w_{i1}(s)⟩, …, U⁻_{in}|w_{in}(s)⟩]^T,  i ≠ d_j*, i ∈ D_j, ⟨X_{j*}|W_{d_j*}⟩ − ⟨X_{j*}|W_i⟩ < θ;
            { |W_i(s)⟩,                                        i ≠ d_j*, i ∈ D_j, ⟨X_{j*}|W_{d_j*}⟩ − ⟨X_{j*}|W_i⟩ ≥ θ,    (15)

where

U±_{ik} = [ cos(a(s)θ±_{ik})  −sin(a(s)θ±_{ik}) ]
          [ sin(a(s)θ±_{ik})   cos(a(s)θ±_{ik}) ],

θ±_{ik} = ± sgn(α_{w_{ik}} β_{x_k} − α_{x_k} β_{w_{ik}}) · arccos( ⟨x_k|w_{ik}⟩ / (⟨x_k|x_k⟩⟨w_{ik}|w_{ik}⟩) ),

and α_{x_k}, β_{x_k} and α_{w_{ik}}, β_{w_{ik}} are the probability amplitudes of |x_k⟩ and |w_{ik}⟩, respectively; U⁺ rotates a weight qubit toward the center sample, and U⁻ rotates it away.

Step 10. Is s < Max? If yes, set s = s + 1 and go to Step 7. If no, save the results and stop.
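Putting (5), (6), (7), and (11) together, the unsupervised phase (Steps 1-6) amounts to repeatedly rotating each winning neighborhood's weight qubits toward the input qubits by a decaying fraction of their angular distance. A minimal single-qubit sketch of this rotation-gate update (the neighborhood bookkeeping and the learning-rate schedule are omitted; the function name and the convergence loop are illustrative):

```python
import math

def rotate_toward(w, x, rate):
    """Apply a 2x2 rotation gate as in Eq. (11): turn weight qubit w toward input
    qubit x by rate * (signed angular distance between them)."""
    aw, bw = w
    ax, bx = x
    theta = math.atan2(bx, ax) - math.atan2(bw, aw)  # signed angle from w to x
    c, s = math.cos(rate * theta), math.sin(rate * theta)
    return (c * aw - s * bw, s * aw + c * bw)

w = (1.0, 0.0)
x = (math.cos(1.0), math.sin(1.0))
for _ in range(50):
    w = rotate_toward(w, x, 0.3)
assert abs(w[0] * x[0] + w[1] * x[1] - 1.0) < 1e-6  # w has converged onto x
```

Each application shrinks the angular gap by the factor (1 − rate), which mirrors how a decaying a(s) pulls the winning weights onto the cluster centers.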


D. QSOFMNN Classification Algorithm

After the clustering process is finished, for a certain sample X to be recognized, suppose the j*-th competition neuron is victorious; then the pattern of sample X can be determined by the following steps.

Step 1. If j* = d̃ ∈ {d₁*, d₂*, …, d_d*}, then X falls into the pattern represented by the competition neuron d̃.

Step 2. If j* ∉ {d₁*, d₂*, …, d_d*}, then compute the classification result by (16) so as to obtain the victorious neuron d̃ that belongs to {d₁*, d₂*, …, d_d*} and is closest to j*:

⟨W_{d̃}|W_{j*}⟩ = max_{j∈{d₁*, d₂*, …, d_d*}} ⟨W_j|W_{j*}⟩  and  ⟨W_{d̃}|W_{j*}⟩ < θ, (16)

where θ is a certain small positive number. Then X may fall into the pattern represented by the competition neuron d̃.

Step 3. If X cannot fall into any pattern existing at present, then X falls into an unknown pattern.
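The three classification steps can be sketched as follows, modeling the acceptance test of Step 2 as a simple similarity threshold (the helper names, the toy similarity function, and the threshold form are illustrative assumptions, not the paper's exact Eq. (16)):

```python
def classify(winner_idx, pattern_winners, weight_sim, threshold):
    """Steps 1-3: map the victorious neuron j* to a pattern, or report 'unknown'.

    pattern_winners: dict pattern -> serial number d* of its victorious neuron.
    weight_sim(i, j): similarity coefficient between weight vectors |W_i> and |W_j>.
    """
    # Step 1: j* is itself some pattern's victorious neuron.
    for pattern, d in pattern_winners.items():
        if winner_idx == d:
            return pattern
    # Step 2: otherwise take the pattern winner most similar to j*,
    # accepted only if it passes the threshold test.
    best = max(pattern_winners, key=lambda p: weight_sim(pattern_winners[p], winner_idx))
    if weight_sim(pattern_winners[best], winner_idx) > threshold:
        return best
    # Step 3: no existing pattern fits.
    return "unknown"

sim = lambda i, j: 1.0 if abs(i - j) <= 1 else 0.0  # toy similarity for illustration
print(classify(2, {"A": 0, "B": 5}, sim, 0.5))  # -> "unknown"
```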

IV. SIMULATION COMPARISONS

A. The IRIS Samples Clustering

In this simulation, the famous IRIS sample set is employed. All samples in IRIS are four-dimensional data and fall into three patterns: setosa, versicolor, and virginica. Each pattern contains 50 samples. The network parameters are presented in Table 1. The first 40 samples of each pattern are taken as the training set to train the network and obtain the pattern information, and the last 10 samples of each pattern are taken as the testing set to test the classification capability of the proposed algorithm. The experimental results are shown in Table 2.

TABLE I. THE PARAMETERS OF QSOFMNN AND SOFMNN

Input nodes: 4;  Competition nodes: 100
Unsupervised phase: max. steps 300, learning rate 1.0, neighborhood radius 3
Supervised phase: max. steps 100, learning rate 0.5

TABLE II. THE SIMULATION RESULTS COMPARISON

Algorithm    Clustering: right number / right ratio    Classification: right number / right ratio
SOFMNN       107 / 89.2%                               27 / 90.0%
QSOFMNN      115 / 95.8%                               29 / 96.7%

B. The Intersecting Samples Clustering

The intersecting data set with linear prototypes, shown in Fig. 3, is employed in this simulation:

C₁: y₁ = 10x₁ + ε, 0 ≤ x₁ ≤ 1;  C₂: y₂ = 10 − 10x₂ + ε, 0 ≤ x₂ ≤ 1,

where the data noise ε ~ N(0, σ²), σ = 0.1, and the sampling interval Δx = 0.01. First, 101 sample points are generated as the training set, and then another 101 sample points are generated as the testing set. Since C₁ and C₂ intersect near x = 0.5, when x ∈ [0.4, 0.6] both the clustering and the classification results are regarded as correct. The network parameters are presented in Table 3, and the experimental results comparison is shown in Table 4.

Figure 3. Scatter clustering plots of experiment 2.
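The intersecting training set described above can be generated as follows (a sketch under the stated parameters σ = 0.1 and Δx = 0.01; how the 101 points are divided between C₁ and C₂ is an assumption here):

```python
import random

def make_intersecting_set(sigma=0.1, dx=0.01, seed=0):
    """Generate the two linear prototypes C1: y = 10x and C2: y = 10 - 10x
    on [0, 1] with Gaussian noise, one point per sampled x value."""
    rng = random.Random(seed)
    samples = []
    for k in range(101):  # x = 0.00, 0.01, ..., 1.00
        x = k * dx
        cls = rng.choice([1, 2])  # draw each point from one of the two lines
        y = 10 * x if cls == 1 else 10 - 10 * x
        samples.append((x, y + rng.gauss(0.0, sigma), cls))
    return samples

train = make_intersecting_set(seed=1)
assert len(train) == 101
```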

TABLE III. THE PARAMETERS OF QSOFMNN AND SOFMNN

Input nodes: 2;  Competition nodes: 64
Unsupervised phase: max. steps 300, learning rate 1.0, neighborhood radius 2
Supervised phase: max. steps 100, learning rate 0.5

TABLE IV. THE SIMULATION RESULTS COMPARISON

Algorithm    Clustering: right number / right ratio    Classification: right number / right ratio
SOFMNN       89 / 88.1%                                81 / 80.2%
QSOFMNN      95 / 94.1%                                87 / 86.1%

In conclusion, the two simulations show that the QSOFMNN is superior to the general SOFMNN in the right ratios of both clustering and classification, as a result of adding quantum computation to the network. The powerful search capability gives the QSOFMNN better performance in both capturing patterns and generalizing them.

V. CONCLUSIONS

A novel QSOFMNN is proposed in this paper. By defining the similarity coefficient between the quantum inputs and the quantum weights, a clustering algorithm is introduced. The proposed algorithm includes two phases, unsupervised and supervised. First, the clustering samples are transformed into quantum vectors by the transform formula; then the samples are submitted to the QSOFMNN to perform clustering. Applying the pattern information acquired in the clustering process, the QSOFMNN can accomplish classification. The experimental results show that the QSOFMNN model and algorithm are efficient.

ACKNOWLEDGMENT

This work was supported by the China Postdoctoral Science Foundation (Grant No. 20090460864), the Heilongjiang Province Postdoctoral Science Foundation of China (Grant No. LBH-Z09289), and the Scientific Research Foundations of the Heilongjiang Provincial Education Department (Grant Nos. 11551015 and 11551017).

REFERENCES

[1] A. Narayanan, M. Moore. Quantum inspired genetic algorithms. Proc. of the 1996 IEEE International Conference on Evolutionary Computation (ICEC96). Nagoya: IEEE Press, 1996, pp. 41-46.
[2] K.H. Han. Genetic quantum algorithm and its application to combinatorial optimization problem. Proc. of the 2000 Congress on Evolutionary Computation. San Diego: IEEE Press, 2000, pp. 1354-1360.
[3] J.A. Yang. Research of quantum genetic algorithm and its application in blind source separation. ACTA Electronica Sinica (China), 2003, 20(1): 62-68.
[4] M. Lewestein. Quantum perceptrons. Journal of Modern Optics, 1994, 41(12): 2491-2501.
[5] S.C. Kak. Quantum neural computing. Advances in Imaging and Electron Physics, 1995, 94: 259-314.
[6] T. Menneer, A. Narayanan. Quantum inspired neural networks. Department of Computer Science, University of Exeter, UK, http://www.dcs.ex.ac.uk/reports/reports.html, 1995.
[7] T. Menneer, A. Narayanan. Quantum artificial neural networks vs classical artificial neural networks: experiments in simulation. Proc. of the 5th Joint Conference on Information Sciences, 2000, 1: 757-759.
[8] A. Ezhov. Spurious memory, single-class and quantum neural networks. Proc. of the 5th Joint Conference on Information Sciences, 2000, 1: 767-770.
[9] A. Narayanan, T. Menneer. Quantum artificial neural network architectures and components. Information Sciences, 2000, 128: 231-255.
[10] P.C. Li, S.Y. Li. Learning algorithm and application of quantum BP neural networks based on universal quantum gates. Journal of Systems Engineering and Electronics, 2008, 19(1): 167-174.
[11] F. Shafee. Neural networks with quantum gated nodes. Engineering Applications of Artificial Intelligence, 2007, 20(4): 429-437.
