Network of Neurons

Computational Neuroscience 03, Lecture 6


Page 1: Network of Neurons

Network of Neurons

Computational Neuroscience 03

Lecture 6

Page 2: Network of Neurons

Connecting neurons in networks

Last week we showed how to model synapses in HH models and integrate-and-fire models:

$$\tau_m \frac{dV}{dt} = E_L - V - r_m \bar{g}_s P_{\mathrm{rel}} P_s (V - E_s) + R_m I_e$$

We can add these together to form networks of neurons.

Page 3: Network of Neurons

Use cable theory:

$$R_L = \frac{r_L \, \Delta x}{\pi a^2}$$

and multicompartmental modelling to model the propagation of signals between neurons:

$$c_m \frac{dV_\mu}{dt} = -i_m + \frac{I_e}{A} + g_{\mu,\mu+1}(V_{\mu+1} - V_\mu) + g_{\mu,\mu-1}(V_{\mu-1} - V_\mu)$$

Page 4: Network of Neurons

However, this soon leads to very complex models that are very computationally intensive.

Massive amounts of numerical integration are needed (which can lead to an accumulation of truncation errors).

We need to model neuronal dynamics on the millisecond scale, while network dynamics can run over timescales several orders of magnitude longer.

Need to make a simplification …

Page 5: Network of Neurons

Firing Rate Models

Since the rate of spiking indicates synaptic activity, use the firing rate as the information carried in the network.

However, APs are all-or-nothing and spike timing is stochastic: with identical input to the identical neuron, spike patterns are similar, but not identical.

Page 6: Network of Neurons

A single spike time is meaningless. To extract useful information, we have to average to obtain the firing rate r for a group of neurons in a local circuit, where each neuron codes the same information, over a time window.

[Figure: spikes from a local circuit, counted over a time window of 1 sec, give the firing rate r in Hz]

Page 7: Network of Neurons

So we can have a network of these local groups.

[Figure: groups with firing rates r_1 … r_n feeding a unit via synaptic strengths w_1 … w_n]

$$v = F\Big(\sum_j w_j r_j\Big)$$

Hence we have the firing rate of a group of neurons.

Page 8: Network of Neurons

Advantages

Much simpler modelling, e.g. no need for millisecond time scales.

Can do analytic calculations of some aspects of network dynamics.

Spiking models have many free parameters, which can be difficult to set (cf Steve Dunn).

Since an AP model responds deterministically to injected current, spike sequences can only be predicted accurately if all inputs are known. This is unlikely.

Although cortical neurons have many connections, the probability of 2 randomly chosen neurons being connected is low. We either need many neurons to replicate network connectivity, or need to average over a more densely connected group. But how do we average spikes? Typically an 'average' spike implies that all neurons in a unit spike synchronously, i.e. large-scale synchronisation unseen in the (healthy) brain.

Page 9: Network of Neurons

Disadvantages

Can't deal with issues of spike timing or spike correlations.

Restricted to cases where neuronal firing is uncorrelated, with little synchronous firing (synchrony can arise e.g. where the presynaptic inputs to a large fraction of neurons are correlated), and where precise patterns of spike timing are unimportant. If so, spiking and rate models produce similar results.

However, both styles of model are clearly needed.

Page 10: Network of Neurons

The model

1. Work out how the total synaptic input depends on the firing rates of the presynaptic afferents.

2. Model how the firing rate of the postsynaptic neuron depends on this input.

Generally, 1 is determined by injecting current into the soma of neurons and measuring the responses. Therefore, define the total synaptic input to be the total current in the soma due to presynaptic APs, denoted I_s.

Then work out the postsynaptic rate v from I_s using:

$$v = F(I_s)$$

F is the activation function. Sometimes the sigmoid is used (useful if derivatives are needed in analysis). Often a threshold-linear function is used, F = [I_s − t]_+ (F = 0 for I_s < t, linear above). For t = 0 this is known as half-wave rectification.
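As a minimal sketch of this step (Python/NumPy; the weights, rates and threshold below are made-up illustration values), here are the two activation functions named above and the resulting rate v = F(I_s):

```python
import numpy as np

def sigmoid(I, gain=1.0, I_half=0.0):
    """Smooth activation function (useful when derivatives are needed)."""
    return 1.0 / (1.0 + np.exp(-gain * (I - I_half)))

def threshold_linear(I, t=0.0):
    """F(I) = [I - t]_+ ; t = 0 gives half-wave rectification."""
    return np.maximum(I - t, 0.0)

# Total somatic synaptic current from presynaptic rates: I_s = w . r
w = np.array([0.5, -0.1, 1.0])   # hypothetical synaptic strengths
r = np.array([10.0, 40.0, 5.0])  # hypothetical presynaptic rates (Hz)
I_s = w @ r                      # = 6.0
print(threshold_linear(I_s, t=2.0))        # postsynaptic rate v = 4.0 Hz
print(sigmoid(I_s, gain=0.5, I_half=5.0))  # sigmoid alternative, ~0.62
```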

Page 11: Network of Neurons

Firing rate models with current dynamics

Although I_s is determined by injection of constant current, we can assume that the same response holds when I_s is time dependent, i.e.

$$v = F(I_s(t))$$

Thus the dynamics come from the synaptic input. The presynaptic input is effectively filtered by the dynamics of current propagation from synapse to soma. Therefore use:

$$\tau_s \frac{dI_s}{dt} = -I_s + \sum_{i=1}^{N} w_i r_i = -I_s + \mathbf{w} \cdot \mathbf{r}$$

Time constant τ_s: if the neuron is electrotonically compact, this is roughly the same as the decay time of the synaptic conductance, so it is typically short (milliseconds).

Page 12: Network of Neurons

Effect of τ_s

Visualise the effect of τ_s as follows. Imagine I starts at some value I_0, and slice time into discrete pieces Δt. At the n'th time step we have:

$$I(n\Delta t) = I_n = I_{n-1} + \Delta t \frac{dI}{dt}$$

Setting w·r = 0 gives:

$$I_1 = I_0 + \Delta t \left(-\frac{I_0}{\tau_s}\right) = I_0\left(1 - \frac{\Delta t}{\tau_s}\right)$$

$$I_2 = I_1\left(1 - \frac{\Delta t}{\tau_s}\right) = I_0\left(1 - \frac{\Delta t}{\tau_s}\right)^2$$

$$I_n = I_0\left(1 - \frac{\Delta t}{\tau_s}\right)^n$$

Exponential decay.
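A quick numerical check of this iteration (Python; τ_s, Δt and I_0 are arbitrary example values): the Euler product (1 − Δt/τ_s)^n stays close to the exact exponential when Δt << τ_s.

```python
import numpy as np

tau_s, dt = 5.0, 0.1        # ms; dt must be << tau_s for accurate Euler steps
I0, n_steps = 1.0, 100

I = I0
for _ in range(n_steps):
    I += dt * (-I / tau_s)  # Euler step with w.r = 0

t = n_steps * dt
print(I)                                 # iterated value, ~0.133
print(I0 * (1 - dt / tau_s) ** n_steps)  # identical closed form
print(I0 * np.exp(-t / tau_s))           # exact solution, ~0.135
```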

Page 13: Network of Neurons

Alternatively, if w·r is not 0:

$$I_n = \left(1 - \frac{\Delta t}{\tau_s}\right) I_{n-1} + \frac{\Delta t}{\tau_s}\, \mathbf{w} \cdot \mathbf{r}_{n-1}$$

$$= \left(1 - \frac{\Delta t}{\tau_s}\right)^2 I_{n-2} + \left(1 - \frac{\Delta t}{\tau_s}\right) \frac{\Delta t}{\tau_s}\, \mathbf{w} \cdot \mathbf{r}_{n-2} + \frac{\Delta t}{\tau_s}\, \mathbf{w} \cdot \mathbf{r}_{n-1}$$

$$= \left(1 - \frac{\Delta t}{\tau_s}\right)^n I_0 + \frac{\Delta t}{\tau_s} \sum_{k=1}^{n} \left(1 - \frac{\Delta t}{\tau_s}\right)^{n-k} \mathbf{w} \cdot \mathbf{r}_{k-1}$$

I.e. I_n retains some memory of the activity at the previous time step (which itself retained some memory of the time step before, and so on). It is a sort of time average.

How much is retained, or for how long we average, depends on τ_s, since it governs how quickly things change: if it is 0, nothing is retained; if it is large, a lot is retained.

Page 14: Network of Neurons

[Figure: responses of I_s for τ_s = 0.1, 1, and 4]

The time constant delays the response to the input; the response also depends on the starting position.
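These figures can be reproduced qualitatively with a short simulation (Python; the input w·r here is an invented mix of a slow and a fast sinusoid): a small τ_s tracks the input, while a large τ_s attenuates and delays the fast component.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 20.0, dt)
wr = np.sin(2 * np.pi * t / 4.0) + 0.5 * np.sin(2 * np.pi * t / 0.5)

for tau_s in (0.1, 1.0, 4.0):
    I = np.zeros_like(t)
    for n in range(1, len(t)):
        I[n] = I[n - 1] + dt * (-I[n - 1] + wr[n - 1]) / tau_s
    # peak response shrinks as tau_s grows: stronger low-pass filtering
    print(tau_s, round(I.max(), 3))
```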

Pages 15–18: Network of Neurons

[Figures: the synaptic current I_s filters the input; the degree of smoothing grows with the size of the time constant (τ_s = 0.1, 1, 4)]

Page 19: Network of Neurons

Alternatively, since the postsynaptic rate is caused by changes in membrane potential, we can add in the effects of the membrane capacitance/resistance. This also effectively acts as a low-pass filter, giving:

$$\tau_r \frac{dv}{dt} = -v + F(I_s(t))$$

If τ_r << τ_s then v = F(I_s(t)) is reached pretty quickly, so the second model reduces to the first. Alternatively, if τ_s << τ_r (more usual) we get:

$$\tau_r \frac{dv}{dt} = -v + F(\mathbf{w} \cdot \mathbf{r})$$

Cf. the leaky integrator and continuous-time recurrent nets.
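A sketch of this leaky-integrator dynamics (Python; threshold-linear F, with invented weights and constant input rates): v relaxes to the fixed point F(w·r) with time constant τ_r.

```python
import numpy as np

def F(I, t=0.0):
    return np.maximum(I - t, 0.0)  # threshold-linear activation

tau_r, dt = 10.0, 0.1              # ms
w = np.array([1.0, 0.5])           # hypothetical weights
r = np.array([20.0, 10.0])         # constant presynaptic rates (Hz)

v = 0.0
for _ in range(1000):              # simulate 100 ms
    v += dt * (-v + F(w @ r)) / tau_r

print(v, F(w @ r))                 # v has relaxed to the fixed point, 25 Hz
```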

Page 20: Network of Neurons

Models with only one set of dynamics work well for above-threshold inputs, where the thresholding is irrelevant; but when the signal sits below threshold for a while these dynamics become important, and both levels are needed.

Page 21: Network of Neurons

Feedforward and Recurrent networks

For a network, replace the weight vector by a matrix; also often replace the feedforward input with a vector:

$$\tau_r \frac{d\mathbf{v}}{dt} = -\mathbf{v} + F(W\mathbf{r} + M\mathbf{v}) = -\mathbf{v} + F(\mathbf{h} + M\mathbf{v})$$

Dale's law states that a neuron can't both inhibit and excite other neurons, so the weights in each column of the matrices must have the same sign, i.e. M_{aa'} (the weight from a' to a) must be positive for all a, or negative for all a.
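A small helper (Python; the matrix values are invented) that checks this sign constraint, using the convention above that column a' of M holds the outgoing weights of presynaptic neuron a':

```python
import numpy as np

def obeys_dales_law(M):
    """Each column (one presynaptic neuron's outgoing weights) must have
    a single sign: the neuron is purely excitatory or purely inhibitory."""
    return all(np.all(col >= 0) or np.all(col <= 0) for col in M.T)

M_ok = np.array([[0.5, -0.3],
                 [0.2, -0.1]])    # column 1 excitatory, column 2 inhibitory
M_bad = np.array([[0.5, -0.3],
                  [-0.2, -0.1]])  # column 1 mixes signs: violates Dale's law
print(obeys_dales_law(M_ok), obeys_dales_law(M_bad))  # True False
```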

Page 22: Network of Neurons

This means that, except for special cases, M cannot be symmetric: if a' inhibits a, then unless a also inhibits a', M_{aa'} has a different sign to M_{a'a}.

However, analysis of such systems is much easier when M is symmetric. This corresponds to making the inhibitory dynamics instantaneous.

These symmetric systems are studied for their analytical properties, but systems with separate excitatory and inhibitory populations have much richer dynamics, exhibiting e.g. oscillatory behaviour:

$$\tau_E \frac{dv_E}{dt} = -v_E + F_E(h_E + M_{EE} v_E + M_{EI} v_I)$$

$$\tau_I \frac{dv_I}{dt} = -v_I + F_I(h_I + M_{IE} v_E + M_{II} v_I)$$

Page 23: Network of Neurons

Continuous model

Often we identify each neuron in a network by a parameter describing an aspect of its selectivity, e.g. for neurons in the primary visual cortex we can use their preferred orientation θ (the angle of line to which they respond most strongly).

Then look at firing rates as a function of this parameter: v(θ).

In large networks there will be a large range of parameters. Assume that the density of each is uniform and equal to ρ, and that coverage is dense. Replacing the weight matrices by functions W(θ, θ') and M(θ, θ'), which describe the weights from a presynaptic neuron with preferred angle θ' to a postsynaptic neuron with preferred angle θ, we get:

$$\tau_r \frac{dv(\theta)}{dt} = -v(\theta) + F\!\left(\rho \int d\theta' \left[ W(\theta,\theta')\, r(\theta') + M(\theta,\theta')\, v(\theta') \right] \right)$$
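A discretised sketch of this continuous model (Python; the cosine-shaped kernels and input profile are invented, and the density and dθ' factors of the integral are absorbed into the 1/N scaling of the kernels):

```python
import numpy as np

N = 64                          # neurons covering the angles densely
theta = np.linspace(-np.pi / 2, np.pi / 2, N, endpoint=False)

# Invented cosine kernels W(theta, theta') and M(theta, theta')
d = theta[:, None] - theta[None, :]
W = (1.0 / N) * np.cos(2 * d)
M = (1.8 / N) * np.cos(2 * d)   # dominant recurrent mode has gain 0.9 < 1

def F(I):
    return np.maximum(I, 0.0)   # half-wave rectification

r_in = np.exp(-theta**2 / 0.2)  # feedforward rates peaked at theta = 0
tau_r, dt = 10.0, 0.5
v = np.zeros(N)
for _ in range(600):
    v += dt * (-v + F(W @ r_in + M @ v)) / tau_r  # integral -> matrix product

print(theta[np.argmax(v)])      # steady response stays peaked near 0
```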

Page 24: Network of Neurons

Pure feedforward nets can do many things; e.g. they can be shown to perform coordinate transformations (such as from retinal to body-centred coordinates for reaching).

To do this they must exhibit gaze-dependent gain modulation: the peak firing rate is not shifted by a change in gaze location, but is increased.

Page 25: Network of Neurons

Recurrent networks can also do this, but have much more complex dynamics than feedforward nets and are more difficult to analyse.

Much analysis focuses on the eigenvectors of the matrix M.

One can show, for instance, that networks exhibit selective amplification if there is one dominant eigenvector (cf PCA); see the sketch after the next slide.

Page 26: Network of Neurons

Or, if one eigenvalue is exactly equal to 1 and the others are < 1, we get integration of inputs and therefore persistent activity, since activity does not stop when the input stops.

While synaptic modification rules can be used to establish such precise tuning, it is not clear how this is done in biological systems.
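A linear sketch of both effects (Python; M and h are invented, with a single dominant eigenvector e1): input components along e1 are amplified by 1/(1 − λ1), and when λ1 = 1 the network integrates instead.

```python
import numpy as np

N = 5
e1 = np.ones(N) / np.sqrt(N)              # dominant eigenvector of M
M = 0.9 * np.outer(e1, e1)                # symmetric; eigenvalues {0.9, 0,...}

h = np.array([1.0, 0.2, 0.1, 0.3, 0.5])   # invented feedforward input
v_ss = np.linalg.solve(np.eye(N) - M, h)  # steady state of dv/dt = -v + h + Mv

# selective amplification: the e1-component of h is boosted by 1/(1-0.9) = 10
print((v_ss @ e1) / (h @ e1))             # ~10.0
# with the eigenvalue exactly 1, (I - M) is singular along e1: there is no
# fixed point there; the network integrates h and activity persists
```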

Page 27: Network of Neurons

We can also see that recurrent networks exhibit stereotypical patterns of activity, largely determined by the recurrent interactions; these can be independent of the feedforward input, so we can get sustained activity.

[Figure: input vs output of a recurrent network]

Therefore recurrent connections can act as a form of memory.

Page 28: Network of Neurons

Such memory is called working or short-term memory (seconds to hours).

To establish long-term memories, the idea is that memory is encoded in the synaptic weights. Weights are set when the memory is stored. When a feedforward input arrives that is similar to (or an incomplete version of) the one that created the memory, persistent activity signals memory recall.

Associative memory: recurrent weights are set so that the network has several fixed points, identical to the patterns of activity representing the stored memories. Each fixed point has a basin of attraction, representing the set of inputs which will result in the net ending up at that fixed point. When presented with an input, the network effectively pattern-matches the input to the stored patterns; a minimal sketch follows below.

We can thus examine the capacity of networks to remember patterns by analysing the stability properties of the matrix encoded by the synaptic weights.
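A minimal associative-memory sketch in this spirit (Python; a Hopfield-style network with invented random ±1 patterns, using the standard outer-product weight rule as one concrete way to set such fixed points):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))  # stored memory patterns

M = patterns.T @ patterns / N                # outer-product (Hebbian) weights
np.fill_diagonal(M, 0)

v = patterns[0].copy()                       # cue: pattern 0, 10 units flipped
v[:10] *= -1
for _ in range(20):
    v = np.where(M @ v >= 0, 1, -1)          # attractor dynamics

print(np.mean(v == patterns[0]))             # ~1.0: cue fell into the right basin
```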

Page 29: Network of Neurons

The interplay of excitatory and inhibitory connections can be shown to give rise to oscillations in networks.

Full network analysis is now problematic, so use homogeneous excitatory and inhibitory populations of neurons (effectively 2 neuron groups) and examine a phase-plane analysis.

One can show that the non-linearity of the activation function allows for stable limit cycles; a simulation sketch follows below.
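A phase-plane-style simulation sketch (Python; the connection strengths, inputs and time constants are hand-picked values that make the fixed point an unstable spiral, which here requires τ_I > 4τ_E): the rates settle onto a stable limit cycle rather than a fixed point.

```python
import numpy as np

# Two homogeneous populations: excitatory rate vE, inhibitory rate vI
M_EE, M_EI, M_IE, M_II = 1.25, -1.0, 1.0, 0.0
h_E, h_I = 10.0, -10.0         # effective inputs (Hz)
tau_E, tau_I = 10.0, 50.0      # ms; instability needs tau_I > 4 * tau_E here
dt = 0.1

def F(I):
    return np.maximum(I, 0.0)  # half-wave rectification

vE, vI, vE_trace = 20.0, 10.0, []
for _ in range(10000):         # simulate 1 second
    dvE = (-vE + F(M_EE * vE + M_EI * vI + h_E)) / tau_E
    dvI = (-vI + F(M_IE * vE + M_II * vI + h_I)) / tau_I
    vE, vI = vE + dt * dvE, vI + dt * dvI
    vE_trace.append(vE)

late = np.array(vE_trace[5000:])
print(late.min(), late.max())  # min < max: sustained oscillation, no fixed point
```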

We can also look at stochastic networks where the input current is interpreted as a probability of firing: Boltzmann machines. These require a statistical analysis of network properties.