
Page 1

ARTIFICIAL NEURAL NETWORKS

Sai Anjaneya

Page 2

WHAT ARE NEURAL NETWORKS?

• In machine learning and cognitive science, artificial neural networks (ANNs) are a family of models inspired by biological neural networks (the central nervous systems of animals, in particular the brain). They are used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown.

• The brain consists of a densely interconnected set of nerve cells, or basic information-processing units, called neurons.

Page 3

Neural Networks in the Brain

• Biological neural network: in the human brain, a neuron is an information-processing unit which receives several input signals from the environment, computes a new activation level, and sends an output signal to other neurons or body muscles through output links.

Page 4

• Neural networks exhibit plasticity.

• In response to a stimulation pattern, neurons demonstrate long-term changes in the strength of their connections.

• Neurons can also form new connections with other neurons. Even entire collections of neurons may sometimes migrate from one place to another.

• These mechanisms form the basis for learning in the brain.

Page 5

MACHINE LEARNING

• In general, machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example, and learn by analogy.

• Learning capabilities can improve the performance of an intelligent system over time.

• Machine learning mechanisms form the basis for adaptive systems.

Page 6

WHY MACHINE LEARNING?

• The techniques we have seen before rely on expert knowledge to set the rules. Once the rules are set, decision making is automated.

• What happens if the rules become obsolete or new information is gathered? The change then needs to happen at a very basic level and must be made manually.

• ANNs attempt to automate this process.

• The objective is to come up with a model that predicts a set of outputs Y = <y1, y2, …, yn> from a given set of inputs X = <x1, x2, …, xm>, given a training dataset with records of the form (X, Y).

• The result must be a function f(X) that approximates Y for values of X not in the dataset.

Page 7

THE PERCEPTRON

Analogous to a neuron in the human brain, the PERCEPTRON is the simplest form of a neural network. It consists of a single "neuron" which computes an output function by assigning a weight to each of the links to the n inputs.

Page 8

How does the Perceptron learn from experience?

Weights (w1, w2, …) are initially assigned to the inputs of the perceptron in the range [-0.5, 0.5] and are then updated to obtain an output consistent with the training examples. At each training step, the linear combiner computes the weighted sum of the inputs, and the weights are adjusted whenever the output is wrong.

Page 9

Hard Limiters: Step and Sign
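Written out, with X = x1 × w1 + x2 × w2 + … + xn × wn − θ the linear combiner output minus the threshold θ, the two standard hard limiters are:

Ystep = 1 if X ≥ 0, and Ystep = 0 if X < 0
Ysign = +1 if X ≥ 0, and Ysign = −1 if X < 0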

Page 10

UPDATE RULES

• Error for the pth training example: e(p) = Yd(p) − Y(p), where p = 1, 2, 3, …

• Update: wi(p+1) = wi(p) + α × xi(p) × e(p), where α is the learning rate, a positive constant less than unity.

• Only works on linearly separable data.
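A minimal sketch of this training procedure in Python, assuming a step hard limiter and the logical AND function as a linearly separable training set (the function and variable names are illustrative, not from the slides):

    import random

    def step(x):
        # Hard limiter: fire (1) when the weighted sum reaches the threshold.
        return 1 if x >= 0 else 0

    def train_perceptron(examples, alpha=0.1, epochs=100):
        # Initialise weights and threshold in [-0.5, 0.5], as on the slides.
        n = len(examples[0][0])
        w = [random.uniform(-0.5, 0.5) for _ in range(n)]
        theta = random.uniform(-0.5, 0.5)
        for _ in range(epochs):
            for x, y_d in examples:
                # Linear combiner followed by the hard limiter.
                y = step(sum(wi * xi for wi, xi in zip(w, x)) - theta)
                e = y_d - y  # e(p) = Yd(p) - Y(p)
                # wi(p+1) = wi(p) + alpha * xi(p) * e(p)
                w = [wi + alpha * xi * e for wi, xi in zip(w, x)]
                # The threshold is treated as a weight on a constant -1 input.
                theta = theta - alpha * e
        return w, theta

    # Logical AND is linearly separable, so the perceptron converges.
    and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    w, theta = train_perceptron(and_data)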

Page 11

LIMITATIONS

A single perceptron can divide the input space only with a linear boundary (a hyperplane), so it cannot represent linearly inseparable functions such as exclusive-OR.

Page 12

MULTILAYERED NETWORKS

A multilayer network places one or more hidden layers between the input layer and the output layer, which lets it learn functions that a single perceptron cannot.

Page 13

BACKPROPAGATION

• The indices i, j and k here refer to neurons in the input, hidden and output layers, respectively.

• Input signals x1, x2, …, xn are propagated through the network from left to right, and error signals e1, e2, …, el from right to left. The symbol wij denotes the weight of the connection between neuron i in the input layer and neuron j in the hidden layer, and wjk the weight between neuron j in the hidden layer and neuron k in the output layer.

• The error signal at the output of neuron k at iteration p is defined by ek(p) = yd,k(p) − yk(p), where yd,k(p) is the desired output. Update rule: wjk(p+1) = wjk(p) + α × yj(p) × δk(p), where δk(p) is the error gradient at neuron k.

Page 14

Output layer: for sigmoid neurons, δk(p) = yk(p) × [1 − yk(p)] × ek(p).

Hidden layer(s): δj(p) = yj(p) × [1 − yj(p)] × Σk δk(p) × wjk(p), with the update wij(p+1) = wij(p) + α × xi(p) × δj(p).
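A compact sketch of the two passes in Python, assuming sigmoid activations, one hidden layer of two neurons, and XOR (the classic linearly inseparable problem) as training data; all names and constants are illustrative:

    import math
    import random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    random.seed(0)
    # 2-input, 2-hidden, 1-output network; thresholds act as weights on -1 inputs.
    w_ij = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(2)]
    w_jk = [random.uniform(-0.5, 0.5) for _ in range(2)]
    th_j = [random.uniform(-0.5, 0.5) for _ in range(2)]
    th_k = random.uniform(-0.5, 0.5)
    alpha = 0.5

    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    for epoch in range(5000):
        for x, y_d in data:
            # Forward pass: signals propagate left to right.
            y_j = [sigmoid(sum(w_ij[i][j] * x[i] for i in range(2)) - th_j[j])
                   for j in range(2)]
            y_k = sigmoid(sum(w_jk[j] * y_j[j] for j in range(2)) - th_k)
            # Backward pass: error gradients propagate right to left.
            e_k = y_d - y_k                      # ek(p) = yd,k(p) - yk(p)
            d_k = y_k * (1 - y_k) * e_k          # output-layer gradient
            d_j = [y_j[j] * (1 - y_j[j]) * d_k * w_jk[j] for j in range(2)]
            # Weight and threshold updates.
            for j in range(2):
                w_jk[j] += alpha * y_j[j] * d_k
                for i in range(2):
                    w_ij[i][j] += alpha * x[i] * d_j[j]
                th_j[j] += alpha * (-1) * d_j[j]
            th_k += alpha * (-1) * d_k
    # After training, the forward pass should map each input close to its
    # XOR target (success can depend on the random initialisation).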

Page 15

WHEN TO STOP

Training is typically stopped when the sum of squared errors over an entire epoch of training examples falls below a chosen threshold, or after a fixed maximum number of epochs.
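A sketch of that criterion in Python; the threshold value 0.001 is a common textbook choice, not taken from the slides:

    def sum_squared_errors(outputs, desired):
        # Sum of squared errors accumulated over one epoch.
        return sum((y_d - y) ** 2 for y, y_d in zip(outputs, desired))

    # Stop once the epoch error is small enough.
    if sum_squared_errors([0.01, 0.99], [0, 1]) < 0.001:
        print("stop training")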

Page 16

IMPROVEMENTS

• A multilayer network generally learns much faster when the sigmoidal activation function is replaced by a hyperbolic tangent, Ytanh = 2a / (1 + e^(−bX)) − a, where a and b are constants (values around a = 1.716 and b = 0.667 are commonly suggested).

• We can also accelerate training by including a momentum term in the earlier expression for the weight change: Δwjk(p) = β × Δwjk(p−1) + α × yj(p) × δk(p), where β (0 ≤ β < 1) is the momentum constant, typically about 0.95. According to the observations made in Watrous (1987) and Jacobs (1988), the inclusion of momentum in the back-propagation algorithm has a stabilising effect on training.
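A minimal sketch of the momentum update in Python (the function name and default values are illustrative):

    def momentum_update(w, prev_delta, gradient, alpha=0.1, beta=0.95):
        # Generalised delta rule: carry forward a fraction beta of the
        # previous weight change to smooth oscillations during training.
        delta = beta * prev_delta + alpha * gradient
        return w + delta, delta

    # Repeated updates with a steady gradient accelerate smoothly.
    w, delta = 0.0, 0.0
    for _ in range(10):
        w, delta = momentum_update(w, delta, gradient=1.0)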

Page 17

LINKS

• Intro
• Forward Propagation [until 3:45]
• Gradient Descent [Cost, Curse of Dimensionality not covered]
• Backpropagation

Page 18

SELF-ORGANISING MAPS (LEARNING WITHOUT A 'TEACHER')

• Hebb's Law:

• If two neurons on either side of a connection are activated synchronously, then the weight of that connection is increased.

• If two neurons on either side of a connection are activated asynchronously, then the weight of that connection is decreased.

Forgetting factor (to allow for the possibility of decreasing connection strength): the Hebbian update becomes Δwij(p) = α × yj(p) × xi(p) − φ × yj(p) × wij(p), where α is the learning rate and φ the forgetting factor.

Page 19

HOW TO CHOOSE THE FORGETTING FACTOR

• The forgetting factor specifies the weight decay in a single learning cycle. It usually falls in the interval between 0 and 1.

• If the forgetting factor is 0, the neural network is capable only of strengthening its synaptic weights, and as a result, these weights grow towards infinity. On the other hand, if the forgetting factor is close to 1, the network remembers very little of what it learns.

• Therefore, a rather small forgetting factor should be chosen, typically between 0.01 and 0.1, to allow only a little ‘forgetting’ while limiting the weight growth.
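A minimal sketch of Hebbian learning with a forgetting factor in Python, using φ = 0.05 from the recommended 0.01–0.1 range (the function name is illustrative):

    def hebbian_update(w, x, y, alpha=0.1, phi=0.05):
        # Strengthen the connection when input x and output y are active
        # together, while a small fraction phi of the weight decays, so
        # the weight cannot grow towards infinity.
        return w + alpha * y * x - phi * y * w

    # Repeated co-activation drives w towards a bounded value (alpha/phi).
    w = 0.0
    for _ in range(50):
        w = hebbian_update(w, x=1.0, y=1.0)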
