
Page 1: PSY105 Neural Networks 5/5

PSY105 Neural Networks 5/5

5. “Function – Computation – Mechanism”

Page 2: PSY105 Neural Networks 5/5

Lecture 1 recap

• We can describe patterns at one level of description that emerge due to rules followed at a lower level of description.

• Neural network modellers hope that we can understand behaviour by creating models of networks of artificial neurons.

Page 3: PSY105 Neural Networks 5/5

Lecture 2 recap

• Simple model neurons:
  – Transmit a signal of (or between) 0 and 1
  – Receive information from other neurons
  – Weight this information

• Can be used to perform any computation (a minimal sketch follows below)
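A minimal sketch of one such model neuron in Python (the function name and the logistic squashing choice are illustrative assumptions; the lecture only requires an output between 0 and 1):

    import math

    def neuron_output(inputs, weights):
        """Weight each incoming signal, sum, and squash into the range (0, 1)."""
        net = sum(x * w for x, w in zip(inputs, weights))
        return 1 / (1 + math.exp(-net))  # logistic squash: one common choice

    # Two input neurons firing at 0.8 and 0.3, weighted +1.5 and -0.5
    print(neuron_output([0.8, 0.3], [1.5, -0.5]))  # ~0.74

Networks built from units like this, given enough of them and suitable weights, can compute any function, which is the sense of the second bullet.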

Page 4: PSY105 Neural Networks 5/5

Lecture 3 recap

• Classical conditioning is a simple form of learning that can be understood as an increase in the weight (‘associative strength’) between two stimuli, one of which already evokes an ‘unconditioned response’

Page 5: PSY105 Neural Networks 5/5

Lecture 4 recap

• A Hebb Rule for weight change between two neurons is:
  – Δ weight = activity 1 × activity 2 × learning rate constant

• To use this rule to associate two stimuli that are separated in time, the neuron activity evoked by each stimulus must persist in time.
  – This can be implemented as an ‘eligibility trace’ (sketched in code below)
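A minimal Python sketch of the rule with eligibility traces (all names and parameter values are illustrative assumptions, not lecture code): each stimulus tops up a trace that then decays exponentially, and the weight change at every time step is the product of the two traces and the learning rate.

    LEARNING_RATE = 0.1  # illustrative value
    TRACE_DECAY = 0.8    # fraction of a trace surviving each time step (illustrative)

    def run_trial(stim1, stim2, weight=0.0):
        """stim1, stim2: lists of 0/1 values, one entry per time step."""
        trace1 = trace2 = 0.0
        for s1, s2 in zip(stim1, stim2):
            # A stimulus tops up its trace; otherwise the trace decays
            trace1 = max(s1, trace1 * TRACE_DECAY)
            trace2 = max(s2, trace2 * TRACE_DECAY)
            # Hebb Rule applied to the traces rather than to raw activity
            weight += LEARNING_RATE * trace1 * trace2
        return weight

    # CS at step 0, UCS at step 2: the traces overlap, so the weight grows
    print(run_trial([1, 0, 0, 0], [0, 0, 1, 0]))  # ~0.105

Because the trace bridges the gap, the two stimuli become associated even though they are never simultaneously present.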

Page 6: PSY105 Neural Networks 5/5

• I present you with a robot which uses a simple neural network to acquire classically conditioned responses. It can, for example, learn to associate a warning stimulus with an upcoming wall and hence turn around before it reaches the wall. Describe an experiment which you would do to test the details of how the robot learns. Say what you would do, and what aspect(s) of the robot's learning the results would inform you of, and why.

Page 7: PSY105 Neural Networks 5/5

The problem of continuous time

[Figure: Stimulus 1 and Stimulus 2 shown as brief events at different points on a continuous timeline]

Page 8: PSY105 Neural Networks 5/5

Traces

[Figure: Stimulus 1 and Stimulus 2, each followed by a decaying eligibility trace]

Page 9: PSY105 Neural Networks 5/5

Consequences of this implementation

• Intensity of CS stimulus
• Duration of CS stimulus
• Intensity of UCS stimulus
• Duration of UCS stimulus
• Separation in time of CS and UCS
• The order in which the CS and UCS occur
  – (cf. Rescorla-Wagner discrete time model)
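The separation effect, for instance, falls straight out of the trace sketch above: a longer CS-UCS gap means the CS trace has decayed further by the time the UCS arrives, so less weight is learned. Re-using the illustrative run_trial function defined earlier:

    # Weight learned in one trial as the CS-UCS gap grows from 1 to 4 steps
    for gap in range(1, 5):
        cs = [1] + [0] * 6
        ucs = [0] * gap + [1] + [0] * (6 - gap)
        print(gap, round(run_trial(cs, ucs), 3))  # weight shrinks as the gap grows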

Page 10: PSY105 Neural Networks 5/5

We have designed an information processing system that learns associations

[Figure: association strength plotted against trials, rising with continued CS-UCS pairing]

Sutton, R. S., & Barto, A. G. (1990). Time-derivative models of Pavlovian reinforcement. In M. Gabriel & J. Moore (Eds.), Learning and Computational Neuroscience: Foundations of Adaptive Networks (pp. 497-537). MIT Press.

Page 11: PSY105 Neural Networks 5/5

Without continued pairing

[Figure: association strength plotted against trials; pairing stops partway through the run]

Page 12: PSY105 Neural Networks 5/5

Without continued pairing → extinction

[Figure: association strength plotted against trials; after pairing stops, the association declines back towards baseline (extinction)]
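Note that the pure Hebb Rule sketched earlier can only ever increase a weight (Δ weight is zero whenever either trace is zero), so extinction is usually captured with an error-correcting rule. A minimal discrete-time sketch in the spirit of the Rescorla-Wagner model that the deck cites for comparison (parameter values are illustrative assumptions):

    ALPHA_BETA = 0.3  # combined salience / learning-rate parameter (illustrative)

    def rescorla_wagner(n_paired, n_unpaired):
        """Association V over trials: delta-V = alpha * beta * (lambda - V)."""
        v, history = 0.0, []
        for trial in range(n_paired + n_unpaired):
            lam = 1.0 if trial < n_paired else 0.0  # UCS present only while pairing
            v += ALPHA_BETA * (lam - v)
            history.append(round(v, 3))
        return history

    # Association rises over 10 paired trials, then decays once pairing stops
    print(rescorla_wagner(10, 10))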

Page 13: PSY105 Neural Networks 5/5

Analysis of information processing systems

• Function (‘computational level’)
• Computation (‘algorithmic level’)
• Mechanism (‘implementational level’)

Marr, D. (1982). Vision. San Francisco: W. H. Freeman.

Page 14: PSY105 Neural Networks 5/5

Marrian analysis of classical conditioning

• Function: learn to predict events based on past experience

• Computation: Stimuli evoke ‘eligibility traces’; a Hebb Rule governs changes in weights [plus the additional assumptions that are always needed when you turn an idea into a computational recipe]

• Mechanism: At least one response neuron, one unconditioned stimulus neuron and one neuron for each conditioned stimulus

Page 15: PSY105 Neural Networks 5/5

Kim, J. J., & Thompson, R. F. (1997). Cerebellar circuits and synaptic mechanisms involved in classical eyeblink conditioning. Trends in Neurosciences, 20(4), 177-181.

Page 16: PSY105 Neural Networks 5/5

Marrian analysis: a simple example

• Function

• Computation

• Mechanism

Page 17: PSY105 Neural Networks 5/5

Theory ↔ Experiments

Synthesis ↔ Analysis

Page 18: PSY105 Neural Networks 5/5

Our classical conditioning networks

[Figure: network diagram with stimulus neurons (CS1, CS2, UCS) projecting to response neurons, and an S-S link between stimulus units]

Page 19: PSY105 Neural Networks 5/5

Internal representation of the conditioned stimulus

Page 20: PSY105 Neural Networks 5/5

The lab: using Marrian analysis to make predictions

Page 21: PSY105 Neural Networks 5/5

Function

• What is the purpose of learning for an animal?
  – Does our model behave in a sensible (‘adaptive’) way when it follows our rule?
  – Is the rule sufficient to explain animal learning?

• Test: think of a way you would want the model/robot to behave, and test whether it does

Page 22: PSY105 Neural Networks 5/5

Computation

• Intensity of CS stimulus
• Duration of CS stimulus
• Intensity of UCS stimulus
• Duration of UCS stimulus
• Separation in time of CS and UCS
• The order in which the CS and UCS occur
  – (cf. Rescorla-Wagner discrete time model)
• The learning rate
• The rate of decay of the trace
• The frequency of pairing
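Each of these is a quantity you can manipulate directly in simulation. A sketch of a single-parameter sweep, in the same illustrative style as the earlier code (names and values are assumptions, not lab code):

    def learned_weight(gap, learning_rate, trace_decay, n_trials):
        """Weight after n_trials CS-UCS pairings separated by `gap` time steps."""
        weight = 0.0
        for _ in range(n_trials):
            trace_cs = trace_ucs = 0.0
            for t in range(gap + 5):
                trace_cs = max(1.0 if t == 0 else 0.0, trace_cs * trace_decay)
                trace_ucs = max(1.0 if t == gap else 0.0, trace_ucs * trace_decay)
                weight += learning_rate * trace_cs * trace_ucs
        return weight

    # Vary the trace decay rate while holding everything else fixed
    for decay in (0.5, 0.7, 0.9):
        print(decay, round(learned_weight(gap=2, learning_rate=0.1,
                                          trace_decay=decay, n_trials=5), 3))

Faster-decaying traces bridge the CS-UCS gap less well, so the learned weight falls; the same loop works for any of the parameters listed above.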

Page 23: PSY105 Neural Networks 5/5

Mechanism

[Figure: the same network diagram as before, with stimulus neurons (CS1, CS2, UCS), response neurons, and the S-S link]

Page 24: PSY105 Neural Networks 5/5

• What is your prediction? What will you do to the rule or the environment?

• How will you know if it has been confirmed or falsified?

Page 25: PSY105 Neural Networks 5/5
