
Republic of Iraq

Ministry of Higher Education

And Scientific Research

Baghdad University

College of Science

Project Name

Pattern recognition by using neural network

A Project Report Submitted to the College of Science, Baghdad

University in Partial Fulfillment of the Requirements for the

BSc Degree in Computer Science

BY

Suha Salman Hussein

SUPERVISED BY

LECTURER

Dr. Asmaa Qasim Shareef

2011-2012


In the name of Allah, the Most Gracious, the Most Merciful

"Allah will raise those who have believed among you and those who were given knowledge by degrees, and Allah is Aware of what you do." (Surat Al-Mujadila, 11)


Dedication

Praise be to Allah who granted us success in this work; we could not have reached it without His grace upon us.

I dedicate this humble work to my dear mother and father, may Allah preserve them, who stayed up and toiled over my education until this work was completed.

To all my friends and loved ones, without exception.

To my esteemed teachers and all my fellow students.

Finally, I ask Allah Almighty to make this work useful to all students approaching graduation.


Acknowledgments

Praise be to Allah, who lit for us the path of knowledge, helped us carry out this duty, and granted us success in completing this work.

We extend our sincere thanks and gratitude to everyone who helped us, from near or far, to finish this work and to overcome the difficulties we faced. We especially thank the supervisor, Dr. Asmaa Qasim Shareef, who generously gave us her guidance and valuable advice, which were a great help in completing this research.

Supervisor's Certification


I certify that this project was prepared under my supervision in the Department of Computer Science, College of Science, University of Baghdad, as part of the requirements for obtaining the BSc degree in Computer Science.

Supervisor's name:

Signature:

Date:

_____________________________________________________________

Certification of the Examining Committee

We certify that we have read this project as members of the examining committee, have examined the student Suha Salman Hussein in its contents, and recommend that it be accepted as a graduation project for the BSc degree in Computer Science.

Name and signature:        Name and signature:        Name and signature:        Name and signature:

Abstract


A neural network is a machine that is designed to model the way in which the brain performs a particular task or function of interest. The network is usually implemented using electronic components or is simulated in software on a digital computer. “A neural network is a massively parallel distributed processor made up of simple processing units which has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:

1) Knowledge is acquired by the network from its environment through a learning process.

2) Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.”

In this project, we propose a system capable of recognizing characters or symbols entered on a drawing board. The system first provides a means of training on the input characters; it can then recognize patterns or symbols similar to those used in training.

Contents

Chapter 1


1.1 Introduction.

1.2 Neural network (NN).

1.2.1 Network layers.

1.3 Uses of neural network.

1.4 Back propagation (BP).

Chapter 2

2.1 Using backpropagation in a neural network.

2.2 The algorithm.

2.3 How the training works.

2.4 Running the algorithm.

2.5 End of training.

Chapter 3

3.1 The program interface.

3.2 How to run the program?

Chapter 4

4.1 Conclusions and recommendations.

References

Chapter 1

1.1 Introduction:


Among the various traditional approaches to pattern recognition, the statistical approach has been the most intensively studied and used in practice. More recently, artificial neural network techniques have also been receiving significant attention.

The design of a recognition system requires careful attention to the following

issues: definition of pattern classes, sensing environment, pattern

representation, feature extraction and selection, cluster analysis, classifier

design and learning, selection of training and test samples, and performance

evaluation. New and emerging applications, such as data mining, retrieval of

multimedia data, face recognition, and cursive handwriting recognition, require

robust and efficient pattern recognition techniques.

The processing of information in a computer is done in a fundamentally different way than in a human or animal brain. Whereas the computer is strongly influenced by the von Neumann architecture, the brain is a gigantic parallel processing system: billions of processors, the neurons, process the incoming stimuli according to inherent processing patterns. The processing speed of an individual neuron, however, compares with that of a computer roughly as the speed of a snail compares with that of a car. On the other hand, the number of processors in a brain has not yet been matched by the computers of today.

This basic difference leads to the fact that computer and brain each have their own strengths and weaknesses in applications. The superiority of a computer in processing numbers and doing mathematical calculations is obvious. On the other hand, there are activities of the brain which cannot be achieved with computers. As a simple example, consider the recognition of a person. One can feed the physiognomy of a face into the computer in the form of bit patterns, but when this person appears before the computer in a slightly changed form, or even when only a photograph is presented, the computer will not recognize him. Hence it is natural to attempt to build computers which copy the activities of the brain. This was also the basic idea behind the conception of neural networks.

1.2 Neural network (NN):

Artificial neural networks, also called simulated neural networks (SNN), are coherent sets of virtual neurons, created either as computer programs that resemble the work of biological neurons or built as electronic designs that mimic the action of neurons, using a mathematical model of information processing based on a connectionist approach to computation. In general, a neural network consists of simple processing elements; the work of each element is simple, but the overall behaviour of the network is determined by the connections between these various elements, which are here called neurons, and by the parameters of the elements.

The idea of neural networks came from the mechanism of action of the brain's neurons, which can be likened to biological electrical networks that process information for the brain. Donald Hebb suggested that the neural synapse plays a key role in guiding this processing, and this led to the idea of connectivity and of artificial neural networks. An artificial neural network thus consists of the neurons, or processing units, mentioned above, connected together to form a network of nodes; each connection between these nodes carries a value, called a weight, which contributes to determining the value produced by each processing element from the values arriving at it.

1.2.1 Network layers:

The commonest type of artificial neural network consists of three groups, or

layers, of units: a layer of "input" units is connected to a layer of "hidden"

units, which is connected to a layer of "output" units (see Figure 1.1).

Figure 1.1, an example of a simple feedforward network.


The activity of the input units represents the raw information that is fed

into the network.

The activity of each hidden unit is determined by the activities of the

input units and the weights on the connections between the input and the

hidden units.

The behavior of the output units depends on the activity of the hidden

units and the weights between the hidden and output units.

The weight parameters in the neural network represent the weighted connections between neurons. The network tries to find the best weights, depending on the input and output values (sometimes the input only), by being trained. After training, the weighted connections capture and encode the problem information contained in the raw training data.
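As a minimal sketch of this computation (illustrative code, not the project's program; the parameter names are assumptions), the activity of a single hidden or output unit is just the weighted sum of the activities feeding into it, passed through a squashing function:

' Illustrative sketch: the activity of one unit is the weighted sum of the
' activities of the units feeding into it, squashed by a sigmoid into (0, 1).
' The parameter names here are assumptions, not taken from the project's code.
Public Function UnitActivity(activities() As Double, weights() As Double) As Double
    Dim sum As Double, i As Integer
    For i = LBound(activities) To UBound(activities)
        sum = sum + weights(i) * activities(i)   ' weighted connection
    Next i
    UnitActivity = 1 / (1 + Exp(-sum))           ' sigmoid squashing function
End Function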

1.3 Uses of neural network:

- Artificial intelligence.

- Function approximation.

- Identification of persons.

- Identification of positions.

- Voice and image recognition, etc.

- Recognition of fonts and handwriting.

- Control.

- Simulation of systems.

- Modeling.

- Filtering.


1.4 Back propagation (BP):

Backpropagation is one of the methods of training neural networks; it works by propagating information in the direction opposite to the flow of the original (input) information.

This method follows the principle of supervised learning: at the training stage it needs specific data for the network to learn from, provided as input data together with the desired output data. The network first performs a feed-forward pass on the input data to compute the value of the network output, and the computed output is then compared with the desired output. If they do not match, the network computes the difference for each neuron of the output layer, which represents that neuron's error value. Then comes the backward propagation of the errors (backpropagation), in which the network computes an error value for each neuron of the hidden layers. Finally comes the weight-update stage, in which the network recomputes all the weights using the newly calculated values.
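A minimal sketch of one such supervised training step for a single sigmoid output neuron is shown below (illustrative names, and a deliberate simplification of the full multi-layer procedure detailed in Chapter 2):

' Sketch of one supervised training step for a single sigmoid neuron:
' feed forward, compare with the desired output, then update the weights.
' Names are illustrative; the full multi-layer procedure is given in Chapter 2.
Public Sub TrainStep(inputs() As Double, weights() As Double, ByVal target As Double, ByVal eta As Double)
    Dim sum As Double, out As Double, delta As Double, i As Integer
    For i = LBound(inputs) To UBound(inputs)     ' 1) feed-forward pass
        sum = sum + weights(i) * inputs(i)
    Next i
    out = 1 / (1 + Exp(-sum))
    delta = (target - out) * out * (1 - out)     ' 2) error term for this neuron
    For i = LBound(weights) To UBound(weights)   ' 3) weight update
        weights(i) = weights(i) + eta * delta * inputs(i)
    Next i
End Sub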

Chapter 2

2.1 Using backpropagation in a neural network:

Backpropagation is a technique discovered by Rumelhart, Hinton and

Williams in 1986 and it is a supervised algorithm that learns by first computing

the output using a feedforward network, then calculating the error signal and

propagating the error backwards through the network.

2.2 The algorithm:

Most people would consider the Back Propagation network to be the

quintessential Neural Net. Actually, Back Propagation is the training or learning

algorithm rather than the network itself. The networks used are generally of the simple type, called Feed-Forward Networks or occasionally Multi-Layer Perceptrons (MLPs).


A Back Propagation network learns by example. You give the algorithm

examples of what you want the network to do and it changes the network's

weights so that, when training is finished, it will give you the required output

for a particular input. Back Propagation networks are ideal for simple Pattern

Recognition. As just mentioned, to train the network you need to give it examples of the output you want (called the Target) for a particular input, as shown in Figure 2.1.

Figure 2.1, a Back Propagation training set.


So, if we put in the first pattern to the network, we would like the output to be

0 1 as shown in figure 2.2 (a black pixel is represented by 1 and a white by 0 as

in the previous examples). The input and its corresponding target are called a

Training Pair.

Figure 2.2, applying a training pair to a network.

Once the network is trained, it will provide the desired output for any of the input

Patterns.
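As a small sketch of how such a training pair can be held in memory (an illustrative 2x2 example; the program in Chapter 3 keeps the same information in its inpt and target arrays):

' Sketch: one training pair for a tiny 2x2 pattern (illustrative values).
' The drawn pattern becomes a bit string and the desired answer a target vector.
Public Sub MakeTrainingPair()
    Dim pattern(0 To 3) As Integer   ' pixel values: 1 = black, 0 = white
    Dim target(0 To 1) As Integer    ' the output we want for this pattern
    pattern(0) = 1: pattern(1) = 0
    pattern(2) = 0: pattern(3) = 1
    target(0) = 0: target(1) = 1     ' i.e. we want the network to answer "0 1"
End Sub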

2.3 How the training works:

The network is first initialized by setting up all its weights to be small random

numbers between –1 and +1. Next, the input pattern is applied and the output

calculated (this is called the forward pass). The calculation gives an output which

is completely different to what you want (the Target), since all the weights are

random. We then calculate the Error of each neuron, which is essentially: Target

– Actual Output (i.e. what you want – what you actually get). This error is then

used mathematically to change the weights in such a way that the error will get

smaller. In other words, the Output of each neuron will get closer to its Target (this

part is called the reverse pass). The process is repeated again and again until the

error is minimal.


Here is an example with an actual network to see how the process works:

Figure 2.3, all the calculations for a reverse pass of Back Propagation.

1. Calculate the errors of the output neurons:

δ_α = out_α (1 − out_α) (Target_α − out_α)
δ_β = out_β (1 − out_β) (Target_β − out_β)

2. Change the output layer weights:

W⁺_Aα = W_Aα + η δ_α out_A      W⁺_Aβ = W_Aβ + η δ_β out_A
W⁺_Bα = W_Bα + η δ_α out_B      W⁺_Bβ = W_Bβ + η δ_β out_B
W⁺_Cα = W_Cα + η δ_α out_C      W⁺_Cβ = W_Cβ + η δ_β out_C


3. Calculate (back-propagate) the hidden layer errors:

δ_A = out_A (1 − out_A) (δ_α W_Aα + δ_β W_Aβ)
δ_B = out_B (1 − out_B) (δ_α W_Bα + δ_β W_Bβ)
δ_C = out_C (1 − out_C) (δ_α W_Cα + δ_β W_Cβ)

4. Change the hidden layer weights:

W⁺_λA = W_λA + η δ_A in_λ      W⁺_ΩA = W_ΩA + η δ_A in_Ω
W⁺_λB = W_λB + η δ_B in_λ      W⁺_ΩB = W_ΩB + η δ_B in_Ω
W⁺_λC = W_λC + η δ_C in_λ      W⁺_ΩC = W_ΩC + η δ_C in_Ω

The constant η (called the learning rate, and nominally equal to one) is put in to

speed up or slow down the learning if required.
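As a small worked check of steps 1 and 2 with illustrative numbers (not values from the report): take out_A = 0.6, out_α = 0.8, Target_α = 1, W_Aα = 0.3 and η = 1. Then

\[
\delta_\alpha = 0.8\,(1 - 0.8)\,(1 - 0.8) = 0.032, \qquad
W^{+}_{A\alpha} = 0.3 + 1 \times 0.032 \times 0.6 = 0.3192 .
\]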

2.4 Running the algorithm:

Now that we’ve seen the algorithm in detail, let’s look at how it’s run with a

large data set. Suppose we wanted to teach a network to recognize the first four

letters of the alphabet on a 5x7 grid, as shown in figure 2.4.

Figure 2.4, the first four letters of the alphabet.

The correct way to train the network is to apply the first letter and change ALL the

weights in the network ONCE. Next apply the second letter and do the same, then

the third and so on. Once you have done all four letters, return to the first one again

and repeat the process until the error becomes small (which means that it is

recognizing all the letters). Figure 2.5 summarizes how the algorithm should work.


Figure 2.5, the correct running of the algorithm.
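In outline, this schedule can be written as a self-contained training loop like the sketch below (a single-layer simplification with made-up example patterns, not the project's code; the project's actual LEARN routine in Chapter 3 follows the same shape, with a hidden layer added):

' Self-contained sketch of the schedule above for a single-layer sigmoid
' network: apply each pattern in turn, update ALL the weights once per
' pattern, and repeat the whole pass until the total error is small.
' The two patterns and their targets are made-up illustrative values.
Public Sub TrainingScheduleSketch()
    Const nPatterns As Long = 2, nInputs As Long = 3
    Dim inputs(0 To nPatterns - 1, 0 To nInputs - 1) As Double
    Dim target(0 To nPatterns - 1) As Double
    Dim w(0 To nInputs - 1) As Double
    Dim k As Long, i As Long, sum As Double, out As Double
    Dim delta As Double, totalError As Double
    inputs(0, 0) = 1: inputs(0, 1) = 0: inputs(0, 2) = 1: target(0) = 1
    inputs(1, 0) = 0: inputs(1, 1) = 1: inputs(1, 2) = 0: target(1) = 0
    Do
        totalError = 0
        For k = 0 To nPatterns - 1                 ' apply each pattern in turn
            sum = 0
            For i = 0 To nInputs - 1               ' forward pass
                sum = sum + w(i) * inputs(k, i)
            Next i
            out = 1 / (1 + Exp(-sum))
            delta = (target(k) - out) * out * (1 - out)
            For i = 0 To nInputs - 1               ' change ALL the weights once
                w(i) = w(i) + 0.5 * delta * inputs(k, i)
            Next i
            totalError = totalError + (target(k) - out) ^ 2
        Next k
    Loop While totalError > 0.01                   ' repeat until the error is small
End Sub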

2.5 End of training

When do we stop the training? We could stop it once the network can recognize

all the letters successfully, but in practice it is usual to let the error fall to a lower

value first. This ensures that the letters are all being well recognized. You can

evaluate the total error of the network by adding up all the errors for each

individual neuron and then for each pattern in turn to give you a total error as

shown in figure 2.6.


Figure 2.6, total error for network.

In other words, the network keeps training all the patterns repeatedly until the total

error falls to some pre-determined low target value and then it stops. When the

network has been trained, it should be able to recognize not just the perfect

patterns, but also corrupted or noisy versions.
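A sketch of that bookkeeping, using the squared difference between target and actual output for each neuron (illustrative names; the LEARN routine in Chapter 3 accumulates essentially the same quantity, averaged, in its err1 and ter variables):

' Sketch: total network error summed over every pattern and every output
' neuron, using the squared (target - output) difference for each neuron.
' The two arrays are assumed to be dimensioned (pattern, neuron) elsewhere.
Public Function TotalError(targets() As Double, outputs() As Double) As Double
    Dim k As Long, i As Long, sum As Double
    For k = LBound(outputs, 1) To UBound(outputs, 1)       ' each pattern
        For i = LBound(outputs, 2) To UBound(outputs, 2)   ' each output neuron
            sum = sum + (targets(k, i) - outputs(k, i)) ^ 2
        Next i
    Next k
    TotalError = sum
End Function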


Chapter 3

3.1 The program interface:

1: A board for drawing the pattern.

2: A list to which pattern names are added.

3: Combo lists to resize the board.

4: A text box for writing the pattern name.

5: Command “ADD”: adds the pattern name to the list and stores the pattern shape in the input array.

6: Command “LEARN”: after preparing the input array and determining the number of inputs and outputs, the neural network is trained.


7: Command “RECOGNIZE”: after the training stage, the neural network will be able to recognize any pattern shape that is similar to one of the input patterns, even with some distortion.

8: Command “CLEAR”: clears the drawing board and the text box.

3.2 How to run the program?

To run the program, follow these steps:

Step 1: Set the pattern size by choosing numbers from the combo lists. The left combo represents the height and the right combo represents the width.

Private Sub Combo1_Click()
    ' number of rows (the height) chosen from the combo
    a = Combo1.List(Combo1.ListIndex)
    noinp = a * b                      ' total number of cells = number of network inputs
    order                              ' re-arrange the drawing board to the new size
    flagedim = True: flagew = True     ' input array must be re-dimensioned, weights re-initialized
End Sub

Private Sub Combo2_Click()
    ' number of columns (the width) chosen from the combo
    b = Combo2.List(Combo2.ListIndex)
    noinp = a * b
    order
    flagedim = True: flagew = True
End Sub


Public Sub order()
    ' Resize and lay out the a x b grid of picture boxes P1() that form the drawing board.
    h = Frame1.Height / a: w = Frame1.Width / b
    For i = 0 To 80                          ' the board has at most 81 cells (indices 0 to 80)
        P1(i).Height = h: P1(i).Width = w: P1(i).BackColor = vbWhite
    Next i
    ' place the cells row by row inside the frame
    topp = 100: leftt = 0: t1 = 0: t2 = b
    For j = 1 To a
        For i = t1 To t2
            P1(i).Top = topp
            P1(i).Left = leftt
            leftt = leftt + w
        Next i
        t1 = b * j                           ' first index of the next row
        t2 = t1 + (b - 1)                    ' last index of the next row
        leftt = 0
        topp = (h * j) + 100
    Next j
End Sub


Step 2: Draw the pattern by holding down the left mouse button and moving the mouse to draw a line, as in the figure.

To clear cells, hold down the right mouse button and move the mouse:

Private Sub P1_MouseDown(Index As Integer, Button As Integer, Shift As Integer, x As Single, Y As Single)
    t = Button                                   ' remember which button is held (1 = left, 2 = right)
End Sub

Private Sub P1_MouseMove(Index As Integer, Button As Integer, Shift As Integer, x As Single, Y As Single)
    If t = 1 Then
        P1(Index).BackColor = RGB(255, 10, 40)   ' left button: paint the cell (draw)
    ElseIf t = 2 Then
        P1(Index).BackColor = vbWhite            ' right button: clear the cell
    End If
End Sub


Step 3: Draw the pattern you want, write the pattern name, then press the “ADD” button to add the pattern to the list of the training group.

Private Sub Command1_Click()
    ' "ADD" button: store the drawn pattern and its name as one training pattern.
    If c = 0 Then                               ' first pattern: lock the board size
        Combo1.Enabled = False: Combo2.Enabled = False
        Command4.Enabled = True: s.Enabled = True   ' enable the learning controls
    End If
    If flagedim Then                            ' (re)create the input array for up to 21 patterns
        ReDim inpt(20, noinp)
        flagedim = False
    End If
    If Text1.Text = "" Then
        MsgBox "you must enter the pattern name", vbOKOnly, "error"
    Else
        List1.AddItem Text1.Text                ' add the pattern name to the list
        inpt(c, 0) = 1                          ' first input is a constant bias of 1
        For i = 1 To noinp - 1                  ' copy the board into the input array
            If P1(i).BackColor = vbWhite Then
                inpt(c, i) = 0                  ' white cell -> 0
            Else
                inpt(c, i) = 1                  ' drawn cell -> 1
            End If
        Next i
        c = c + 1                               ' number of stored patterns
    End If
End Sub


Step 4: After entering all the patterns you want to train on, press the “LEARN” button to begin the network training.

Private Sub Command4_Click()
    ' "LEARN" button: train the network with backpropagation on all stored patterns.
    Frame1.Enabled = False
    Picture2.Scale (1, 1)-(4000, 2)               ' coordinate system for the progress bar
    Picture2.BackColor = vbWhite
    Form1.Cls
    If flagew Then                                ' first run: create the initial weights
        Command3.Enabled = True
        Command1.Enabled = False
        o.Enabled = False
        initweight
    End If
    hidden(0) = 1                                 ' first hidden node is a constant bias of 1
    ter = 1
    noitr = 0
    ' keep passing over the training set until the error is tiny or enough iterations are done
    While ((ter > 0.00005) And (noitr < 1000 * nooup))
        ter = 0
        For k = 0 To nooup - 1                    ' one forward + reverse pass per pattern
            err1 = 0
            ' hidden layer: hidden = 1 / (1 + exp(-sum(weight1 * input)))
            For i = 1 To noinp - 1
                sum = 0
                For j = 0 To noinp - 1
                    sum = sum + (weight1(j, i) * inpt(k, j))
                Next j
                hidden(i) = 1 / (1 + Exp(-(sum)))
            Next i
            ' output layer: output = 1 / (1 + exp(-sum(weight2 * hidden)))
            For i = 0 To nooup - 1
                sum = 0
                For j = 0 To noinp - 1
                    sum = sum + (weight2(j, i) * hidden(j))
                Next j
                outpt(k, i) = 1 / (1 + Exp(-(sum)))
            Next i
            ' output layer errors: delta2 = (target - output) * (1 - output) * output
            For i = 0 To nooup - 1
                delta2(i) = (target(k, i) - outpt(k, i)) * (1 - outpt(k, i)) * outpt(k, i)
            Next i
            ' update output layer weights: w2+ = w2 + (0.5 * delta2 * hidden)
            For j = 0 To nooup - 1
                For i = 0 To noinp - 1
                    weight2(i, j) = weight2(i, j) + (0.5 * delta2(j) * hidden(i))
                Next i
            Next j
            ' hidden layer errors (step 3 of Chapter 2):
            ' delta1 = hidden * (1 - hidden) * sum(delta2 * weight2)
            For i = 0 To noinp - 1
                sum = 0
                For j = 0 To nooup - 1
                    sum = sum + (delta2(j) * weight2(i, j))
                Next j
                delta1(i) = hidden(i) * (1 - hidden(i)) * sum
            Next i
            ' update hidden layer weights: w1+ = w1 + (0.5 * delta1 * input)
            For j = 0 To noinp - 1
                For i = 0 To noinp - 1
                    weight1(i, j) = weight1(i, j) + (0.5 * delta1(j) * inpt(k, i))
                Next i
            Next j
            ' accumulate this pattern's squared error
            For i = 0 To nooup - 1
                err(i) = target(k, i) - outpt(k, i)
                err1 = err1 + (err(i) * err(i))
            Next i
            err1 = Abs(err1 / nooup)
            ter = ter + err1
        Next k
        ter = Abs(ter / nooup)
        noitr = noitr + 1
        Picture2.Line (1, 1)-((noitr / nooup), 2), RGB(255, 10, 40), BF   ' draw the progress bar
    Wend
    Picture2.BackColor = RGB(255, 10, 40)
    ' report how well each stored pattern is now recognized
    For k = 0 To nooup - 1
        For i = 0 To nooup - 1
            If k = i Then
                knw = (Int(outpt(i, k) * 10000) / 100)
                MsgBox "known " & List1.List(i) & " " & knw & "%", vbOKOnly
            End If
        Next i
    Next k
    err1 = (Int(err1 * 1000000)) / 1000000
    x = MsgBox("total error is " & err1 & " do you want to learn again ", vbYesNo)
    If x = 6 Then Command4.Value = True           ' 6 = vbYes: run the training again
    Frame1.Enabled = True
End Sub

******************************************************************

After training is complete, a message will appear showing the recognition percentage for each pattern:


Then a message will appear showing the total error.

Step 5: After completion of training, the neural network will be able to recognize any pattern similar to the patterns it was trained on.

To test the neural network, draw a pattern with some distortion and press the “RECOGNIZE” button; a message will appear showing the name of the pattern and the recognition percentage.


Private Sub Command3_Click()
    ' "RECOGNIZE" button: feed the drawn pattern through the trained network
    ' and report the pattern name with the highest output activation.
    Form1.Cls
    p(0) = 1                                     ' constant bias input
    For i = 1 To noinp - 1                       ' copy the board into the input vector
        If P1(i).BackColor = RGB(255, 10, 40) Then
            p(i) = 1
        Else
            p(i) = 0
        End If
    Next i
    ' hidden layer: hidden = 1 / (1 + exp(-sum(weight1 * input)))
    For i = 1 To noinp - 1
        sum = 0
        For j = 0 To noinp - 1
            sum = sum + (weight1(j, i) * p(j))
        Next j
        hidden(i) = 1 / (1 + Exp(-(sum)))
    Next i
    ' output layer: output = 1 / (1 + exp(-sum(weight2 * hidden)))
    For i = 0 To nooup - 1
        sum = 0
        For j = 0 To noinp - 1
            sum = sum + (weight2(j, i) * hidden(j))
        Next j
        ot(i) = 1 / (1 + Exp(-(sum)))
    Next i
    ' pick the output neuron with the largest activation
    max = ot(0): dx = 0
    For i = 1 To nooup - 1
        If ot(i) > max Then
            max = ot(i)
            dx = i
        End If
    Next i
    ot(dx) = Int(ot(dx) * 10000) / 100           ' convert to a percentage
    MsgBox "it is " & List1.List(dx) & " " & ot(dx) & " " & "%", vbOKOnly
End Sub


Chapter 4

4.1 Conclusions and recommendations.

In light of the analysis of the simulation experiments and the practical application, and of the conclusions reached, some of the conclusions and proposed recommendations are given below:

1 - Failure of the network in pattern recognition is due to poor selection of samples and insufficient training of the network; therefore the samples must be chosen carefully and the network must be given enough training time for the best results.

2 - The final values of the network weights depend mainly on the initial weights that were generated; if the required results are not reached, new weights must be generated and the network trained again.

3 - A database for storing samples should be prepared to facilitate the work and to avoid re-entering the same data every time.


References

1. Asmaa Q. Shareef, "Neural Networks for System Identification", 1995.

2. Rumelhart, D. E., McClelland, J. L., and the PDP Research Group, "Parallel Distributed Processing": Computational Models of Cognition and Perception, Vol. 2, 1988.

3. Jayanta Kumar Basu, Debnath Bhattacharyya, and Tai-hoon Kim, "Use of Artificial Neural Network in Pattern Recognition".

4. P. E. Ross, "Flash of Genius", Forbes, pp. 98-104, Nov. 1998.

5. Christos Stergiou and Dimitrios Siganos, "Neural Networks".

6. http://www.ieee.cz/knihovna/Zhang/Zhang100-ch03.pdf

7. http://www4.rgu.ac.uk/files/chapter3%20-%20bp.pdf

8. http://en.wikipedia.org/wiki/Neural_network

9. http://en.wikipedia.org/wiki/Backpropagation


Republic of Iraq

Ministry of Higher Education

and Scientific Research

University of Baghdad

College of Science

Project Name

Pattern recognition by using neural network

A project report submitted to the College of Science, University of Baghdad, in partial fulfillment of the requirements for the BSc degree in Computer Science

By the student

Suha Salman Hussein

Supervisor

Dr. Asmaa Qasim Shareef

2011-2012