
Computer Science Department FMIPA IPB 2003 Neural Computing Yeni Herdiyeni Computer Science Dept. FMIPA IPB


Page 1:

Neural Computing

Yeni Herdiyeni, Computer Science Dept. FMIPA IPB

Page 2:

Neural Computing: The Basics

Artificial Neural Networks (ANN)

Mimics How Our Brain Works

Machine Learning

Neural Computing = Artificial Neural Networks (ANNs)

Page 3:

Machine Learning: Overview

ANNs are used to automate complex decision making

Neural networks learn from past experience and improve their performance levels

Machine learning: methods that teach machines to solve problems or to support problem solving, by applying historical cases

Page 4:

Neural Networks and Expert Systems

Different technologies complement each other

Expert systems: logical, symbolic approach

Neural networks: model-based, numeric and associative processing

Page 5:

Expert Systems

Good for closed-system applications (literal and precise inputs, logical outputs)

Reason with established facts and pre-established rules

Page 6:

Major Limitations of Expert Systems

Experts do not always think in terms of rules

Experts may not be able to explain their line of reasoning

Experts may explain incorrectly

Sometimes difficult or impossible to build knowledge base

Page 7:

Neural Computing Uses:

Neural Networks in Knowledge Acquisition

Fast identification of implicit knowledge by automatically analyzing cases of historical data

ANN identifies patterns and relationships that may lead to rules for expert systems

A trained neural network can rapidly process information to produce associated facts and consequences

Page 8:

Benefits of Neural Networks

Pattern recognition, learning, classification, generalization and abstraction, and interpretation of incomplete and noisy inputs

Character, speech and visual recognition

Can provide some human problem-solving characteristics

Can tackle new kinds of problems

Robust

Fast

Flexible and easy to maintain

Powerful hybrid systems

Page 9:

Biological Analogy: The Biological Neural Network

Neurons: brain cells
– Nucleus (at the center)
– Dendrites provide inputs
– Axons send outputs

Synapses increase or decrease connection strength and cause excitation or inhibition of subsequent neurons

Page 10:

Biological Analogy: The Biological Neural Network

[Diagram: data signals enter Neuron #1 through a dendrite (input wire) and leave along its axon (output wire); a synapse, which controls the flow of electrochemical fluids, passes the signal with connection weight W1,2 to a dendrite of Neuron #2, whose axon in turn connects to Neuron #3.]

Page 11:

What Is a Neural Network?

A neural network is a network of many simple processors, each possibly having a small amount of local memory.

The processors are connected with communication channels (synapses).

Page 12:

Neural Network (Haykin*)

A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use.

* Simon Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, New Jersey, 1999.

Page 13:

Neural Net = Brain?

1. Knowledge is acquired by the network through a learning process.

2. Inter-neuron connection strengths known as synaptic weights are used to store the knowledge.

Page 14:

Neural Network Fundamentals

Components and Structure
– Processing Elements
– Network
– Structure of the Network

Processing Information by the Network
– Inputs
– Outputs
– Weights
– Summation Function

Page 15:

Processing Information in an Artificial Neuron

[Diagram: inputs x1, x2, …, xi are multiplied by weights w1j, w2j, …, wij and summed by neuron j; the summation Σ wij·xi is passed through a transfer function to produce the output Yj.]
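The summation and transfer steps of an artificial neuron can be sketched in code. This is an illustrative Python sketch, not from the original slides; the sigmoid transfer function and the example values are assumptions:

```python
import math

def neuron_output(inputs, weights,
                  transfer=lambda s: 1.0 / (1.0 + math.exp(-s))):
    """Weighted sum of the inputs followed by a transfer function.

    The default transfer function is the logistic sigmoid; any other
    transfer function can be passed in instead."""
    s = sum(w * x for w, x in zip(weights, inputs))  # summation: sum_i wij * xi
    return transfer(s)

# Example: three inputs x_i with weights w_ij feeding neuron j.
x = [1.0, 0.5, -0.5]
w = [0.4, 0.3, 0.9]
y_j = neuron_output(x, w)  # sigmoid(0.4*1.0 + 0.3*0.5 + 0.9*(-0.5)) ≈ 0.525
```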

Page 16:

Learning: Three Tasks

1. Compute Outputs

2. Compare Outputs with Desired Targets

3. Adjust Weights and Repeat the Process

Page 17:

Training The Network

Present the training data set to the network

Adjust weights to produce the desired output for each of the inputs
– Several iterations of the complete training set are needed to obtain a consistent set of weights that works for all the training data

Page 18:

Testing

Test the network after training

Examine network performance: measure the network's classification ability

Black-box testing: do the inputs produce the appropriate outputs?

Not necessarily 100% accurate, but may be better than human decision makers

Test plan should include
– Routine cases
– Potentially problematic situations

May have to retrain

Page 19:

ANN Application Development Process

1. Collect Data
2. Separate into Training and Test Sets
3. Define a Network Structure
4. Select a Learning Algorithm
5. Set Parameters, Values, Initialize Weights
6. Transform Data to Network Inputs
7. Start Training, and Determine and Revise Weights
8. Stop and Test
9. Implementation: Use the Network with New Cases

Page 20:

Data Collection and Preparation

Collect data and separate into a training set and a test set

Use training cases to adjust the weights

Use test cases for network validation
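The split described above can be sketched as follows. This is illustrative Python, not from the slides; the 70/30 split ratio, the fixed seed, and the toy data are assumptions:

```python
import random

def train_test_split(cases, test_fraction=0.3, seed=42):
    """Shuffle the historical cases and hold out a fraction for validation.

    Returns (training set, test set). The seed makes the split
    reproducible across runs."""
    cases = list(cases)
    random.Random(seed).shuffle(cases)
    n_test = int(len(cases) * test_fraction)
    return cases[n_test:], cases[:n_test]

# Toy input/target pairs: training cases adjust the weights,
# test cases are kept aside for network validation.
data = [(x, 2 * x) for x in range(10)]
train_set, test_set = train_test_split(data)  # 7 training cases, 3 test cases
```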

Page 21:

Single-Layer Perceptron

Page 22:

Each pass through all of the training input and target vectors is called an epoch.
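As an illustration of epochs, here is a minimal perceptron trainer in Python. It is a sketch, not from the slides; the learning rate, the hard-limit output, and the AND-gate training data are assumptions made for the example:

```python
def train_perceptron(samples, lr=0.1, max_epochs=100):
    """Perceptron learning rule: adjust weights after each misclassified sample.

    One pass through all training input/target pairs is one epoch.
    Returns (weights, bias, number of epochs used)."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for epoch in range(max_epochs):
        errors = 0
        for x, target in samples:  # target is 0 or 1
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out
            if err:
                errors += 1
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
        if errors == 0:  # a whole epoch with no mistakes: converged
            return w, b, epoch + 1
    return w, b, max_epochs

# Logical AND is linearly separable, so the perceptron converges
# after a handful of epochs:
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b, epochs = train_perceptron(and_data)
```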

Page 23:

Example :

Page 24:

Page 25:

Page 26:

Disadvantage of the Perceptron

Perceptron networks can only solve linearly separable problems

See Marvin Minsky and Seymour Papert's book Perceptrons [10].

[10] M.L. Minsky, S.A. Papert, Perceptrons: An Introduction to Computational Geometry, MIT Press, 1969.

See XOR problem

Page 27:

Multilayer Perceptrons (MLP)

Page 28:

MLP

An MLP has the ability to learn complex decision boundaries

MLPs are used in many practical computer vision applications involving classification (or supervised segmentation).
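To see how a multilayer network handles a boundary that a single-layer perceptron cannot (the XOR problem mentioned on the previous slide), here is a hand-wired two-layer network in Python. The weights are chosen by hand for illustration, not learned; the OR/NAND/AND decomposition is one standard construction:

```python
def step(s):
    """Hard-limit transfer function."""
    return 1 if s > 0 else 0

def mlp_xor(x1, x2):
    """Two-layer network computing XOR, which is not linearly separable.

    Hidden unit 1 acts as OR, hidden unit 2 as NAND, and the output
    unit ANDs them together: XOR = OR AND NAND."""
    h1 = step(1 * x1 + 1 * x2 - 0.5)    # OR of the inputs
    h2 = step(-1 * x1 - 1 * x2 + 1.5)   # NAND of the inputs
    return step(1 * h1 + 1 * h2 - 1.5)  # AND of h1 and h2

# Reproduces the XOR truth table: (0,0)->0, (0,1)->1, (1,0)->1, (1,1)->0
```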

Page 29:

Backpropagation

Page 30:

Page 31:

X = -1 : 0.1 : 1;
Y = [-0.960 -0.577 -0.073  0.377  0.641  0.660  0.461 ...
      0.134 -0.201 -0.434 -0.500 -0.393 -0.165  0.099 ...
      0.307  0.396  0.345  0.182 -0.031 -0.219 -0.320];

% Normalization:
pr = [-1 1]; m1 = 5; m2 = 1;
net_ff = newff(pr, [m1 m2], {'logsig' 'purelin'});
net_ff = init(net_ff);   % Default Nguyen-Widrow initialization

% Training:
net_ff.trainParam.goal = 0.02;
net_ff.trainParam.epochs = 350;
net_ff = train(net_ff, X, Y);

% Simulation:
X_sim = -1 : 0.01 : 1;
Y_nn = sim(net_ff, X_sim);

Page 32:

Backpropagation

Backpropagation (back-error propagation)

Most widely used learning algorithm

Relatively easy to implement

Requires training data for conditioning the network before using it for processing other data

Network includes one or more hidden layers

Network is considered a feedforward approach

Page 33:

Externally provided correct patterns are compared with the neural network output during training (supervised training)

Feedback adjusts the weights until all training patterns are correctly categorized

Page 34:

Error is backpropagated through the network layers

Some error is attributed to each layer

Weights are adjusted

A large network can take a very long time to train

May not converge
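These steps can be shown numerically with a tiny network. This is an illustrative Python sketch, not from the slides; the 1-1-1 architecture, the initial weights, the single training pattern, and the learning rate are all assumptions. The error is computed at the output, a share of it is attributed to each layer, and the weights of both layers are adjusted:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# One input -> one hidden unit -> one output, trained on a single pattern.
w1, b1, w2, b2 = 0.5, 0.0, 0.5, 0.0
x, target, lr = 1.0, 0.8, 0.5

errors = []
for _ in range(200):
    # Feedforward pass
    h = sigmoid(w1 * x + b1)
    y = sigmoid(w2 * h + b2)
    errors.append(0.5 * (y - target) ** 2)
    # Backpropagate: an error term is attributed to each layer
    d_out = (y - target) * y * (1 - y)   # output-layer error term
    d_hid = d_out * w2 * h * (1 - h)     # hidden-layer error term
    # Adjust the weights of both layers by gradient descent
    w2 -= lr * d_out * h
    b2 -= lr * d_out
    w1 -= lr * d_hid * x
    b1 -= lr * d_hid
# The recorded error shrinks toward zero as training proceeds.
```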

Page 35:

Next Time …

ANFIS Neural Network
By Ir. Agus Buono, M.Si, M.Komp