
  • Theoretical neuroscience

    PHYSICS 567

    Nicolas Brunel ([email protected])

  • Practical details

    • Problem sets will be issued every Thursday and are due the following Thursday in class.

    • Grading will be based on attendance and participation (10%), homework (40%), and a final exam (50%)

    • Office hours: Friday 3:30pm-5pm, Bryan 101G

  • Introduction

    • Brains, scales, structure, dynamics

    • What is computational/theoretical neuroscience?

    • Outline of the course

  • Brain networks: from C. elegans to humans

    Animal       # neurons   # synapses
    C. elegans   302         7,500
    Fruit fly    2.5 × 10^5  10^7
    Honey bee    10^6        10^9
    Mouse        10^8        2 × 10^11
    Cat          10^9        2 × 10^12
    Human        10^11       2 × 10^14

  • Spatial scales

    ∼ 10 cm          Whole brain
    ∼ 1 cm           Brain structure/cortical area
    100 µm - 1 mm    Local network
    10 µm - 1 mm     Neuron
    100 nm - 1 µm    Sub-cellular compartment
    ∼ 1-10 nm        Molecule

  • Experimental tools

  • The whole brain level: structures

  • Cerebral cortex: network of areas

  • Area level

    • Areas are interconnected networks of local networks (‘columns’)

  • Local network level

    • Size ∼ 1 cubic mm

    • Total number of cells ∼ 100,000

    • Types of cells:

    – pyramidal cells - excitatory (80%)

    – interneurons - inhibitory (20%)

    • Total number of synapses ∼ 10^9 (10,000 per neuron)

    • Cells potentially connect to all other cell types (E→ E, E→ I, I→ E, I→ I)

    • Connection probability ∼ 10%
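The numbers above can be turned into a toy model of local-network connectivity. A minimal sketch, assuming a scaled-down network of 1,000 cells (the network size, the seed, and the use of NumPy are illustrative choices, not from the slides):

```python
import numpy as np

# Toy random network with the proportions quoted above, scaled down:
# 80% excitatory cells, 20% inhibitory cells, 10% connection probability.
rng = np.random.default_rng(0)
N = 1000
N_E = int(0.8 * N)       # pyramidal (excitatory) cells
N_I = N - N_E            # interneurons (inhibitory)
p = 0.1                  # connection probability

# conn[i, j] = True if neuron j projects to neuron i; every ordered
# pair of types (E->E, E->I, I->E, I->I) is possible.
conn = rng.random((N, N)) < p
np.fill_diagonal(conn, False)   # no self-connections

in_degree = conn.sum(axis=1)    # ~ p * (N - 1) inputs per neuron
print(N_E, N_I, in_degree.mean())
```

At the full cortical scale (10^5 neurons, 10% connectivity) the same construction yields the ∼10,000 synapses per neuron quoted above.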

  • Neuron level

    • Neuron = a complex tree-like structure with many compartments (e.g. dendritic spines)

  • Subcellular compartment level

    • Subcellular compartments (e.g. dendritic spines) contain a huge diversity of molecules (in particular protein kinases and phosphatases) whose interactions define complex networks

  • Temporal scales

    Days-Years Long-term memory

    Seconds-Minutes Short-term (working) memory

    100ms - 1s Behavioral time scales/Reaction times

    ∼ 10ms Single neuron/synaptic time scales

    ∼ 1ms Action potential duration; local propagation delays

    ≪ 1ms Channel opening/closing

  • Submillisecond

    • Molecular time scales (channel opening/closing; diffusion of neurotransmitter in the synaptic cleft; etc.)

  • Millisecond

    • Width of action potentials; axonal delays in local networks

  • Tens of ms

    • Synaptic decay time constants; membrane time constant of neurons; axonal delays for long-range connections

  • Hundreds of ms

    • Behavioral time scales (e.g. motor response to a stimulus)

  • Seconds-minutes

    • Short-term (working) memory

  • Days-years

    • Long-term memory

  • The art of model building

    [Diagram: a cycle — Experimental data → Model building → Mathematical analysis → Experimental predictions → back to experimental data]

    • Modern neuroscience generates increasingly large amounts of quantitative data, and needs quantitative models to make sense of this data.

    • Art of modeling: needs a proper balance between too little detail and too much. "Make things as simple as possible, but not simpler." - A. Einstein

    • Model building incorporates biological facts and physical principles

    • Useful models make experimentally testable predictions → stimulate new experiments.

  • Questions in theoretical/computational neuroscience

    What? Describe in a mathematically compact form a set of experimental observations.

    How? Understand how a neural system produces a given behavior.

    Why? Understand why a neural system performs the way it does, using e.g. tools from information theory.

  • Mathematical tools

    • Single neuron/synapse models, rate models: systems of coupled differential equations. Tools of dynamical systems (linear stability analysis, bifurcation theory)

    • Networks: Graph theory, linear algebra. Large networks: tools of statistical physics

    • Noise (ubiquitous at all levels of the nervous system): statistics, probability theory, stochastic processes.

    • Coding: Information theory
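As a small illustration of the dynamical-systems toolbox, linear stability of a fixed point reduces to checking the eigenvalues of the Jacobian there. A sketch for a toy linear rate network (the coupling strength g, the network size, and the Gaussian random coupling are illustrative assumptions, not from the slides):

```python
import numpy as np

# Linear stability of the fixed point m* = 0 of the toy linear rate
# network dm/dt = -m + W m. The Jacobian at the fixed point is
# J = -I + W, and m* is stable iff every eigenvalue of J has negative
# real part, i.e. iff Re(lambda) < 1 for every eigenvalue lambda of W.
rng = np.random.default_rng(1)
N = 200
g = 0.5                                  # coupling strength < 1: stable regime
W = g * rng.standard_normal((N, N)) / np.sqrt(N)

J = -np.eye(N) + W
stable = bool(np.linalg.eigvals(J).real.max() < 0)
print(stable)
```

For Gaussian random coupling of this form the spectral radius of W is close to g, so g < 1 gives stability and g > 1 destabilizes the fixed point — the kind of bifurcation argument used repeatedly in the course.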

  • Outline

    Week 1 Jan 15, 17 Introduction; Membranes, channels

    Week 2 Jan 22, 24 Neurons: Hodgkin-Huxley type models

    Week 3 Jan 29, 31 Neurons: Leaky integrate-and-fire type models

    Week 4 Feb 5, 7 Synapses and synaptic plasticity

    Week 5 Feb 12, 14 Networks

    Week 6 Feb 19, 21 Network Dynamics: Rate models

    Week 7 Feb 26, 28 Network Dynamics: Networks of spiking neurons

    Week 8 Mar 5, 7 Coding: Single neurons

    Week 9 Mar 19, 21 Coding: Networks

    Week 10 Mar 26, 28 Unsupervised learning

    Week 11 Apr 2, 4 Supervised and reinforcement learning

    Week 12 Apr 9, 11 Learning and memory

    Week 13 Apr 16, 18 Computing

  • Single neuron models

    • Hodgkin-Huxley neuron

      C dV/dt = −I_L(V) − ∑_a I_a(V, m_a, h_a) + I_syn(t)

      I_L(V) = g_L (V − V_L)

      I_a(V, m_a, h_a) = g_a (V − V_a) m_a^{x_a} h_a^{y_a}

      τ_{m_a}(V) dm_a/dt = −m_a + m_a^∞(V)

      τ_{h_a}(V) dh_a/dt = −h_a + h_a^∞(V)

    • Integrate-and-fire neuron

      C dV/dt = −g_L (V − V_L) + I_syn(t)

      Fixed threshold V_t, reset V_r, refractory period.

    • How do single neurons work?

    • What are the mechanisms of action potential generation?

    • How do neurons transform synaptic inputs into a train of action potentials?
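The integrate-and-fire model is simple enough to simulate directly. A minimal forward-Euler sketch (all parameter values are illustrative choices, not from the slides):

```python
# Forward-Euler simulation of the leaky integrate-and-fire neuron:
# C dV/dt = -g_L (V - V_L) + I_syn, with fixed threshold V_t, reset V_r,
# and an absolute refractory period. Parameter values are illustrative.
C, g_L = 1.0, 0.1                      # capacitance, leak conductance
V_L, V_t, V_r = -70.0, -50.0, -60.0    # leak, threshold, reset (mV)
t_ref = 2.0                            # refractory period (ms)
I_syn = 2.5                            # constant suprathreshold input
dt, T = 0.1, 1000.0                    # time step, duration (ms)

V = V_L
last_spike = float("-inf")
spikes = []
for step in range(int(T / dt)):
    t = step * dt
    if t - last_spike < t_ref:
        continue                       # voltage held at reset
    V += dt / C * (-g_L * (V - V_L) + I_syn)
    if V >= V_t:                       # threshold crossing: emit a spike
        spikes.append(t)
        V = V_r
        last_spike = t

rate = len(spikes) / (T / 1e3)         # mean firing rate (Hz)
print(rate)
```

With these numbers the membrane time constant is C/g_L = 10 ms and the steady-state voltage V_L + I_syn/g_L = −45 mV lies above threshold, so the neuron fires periodically.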

  • Single synapse models

    • ‘Conductance-based’

      I_{i,syn}(t) = ∑_{a=E,I} (V − V_a) ∑_{j,k} g_{a,i,j} s_a(t − t_j^k)

      τ_a ṡ_a = . . .

    • ‘Current-based’

      I_{i,syn}(t) = ∑_{a=E,I} ∑_{j,k} J_{a,i,j} s_a(t − t_j^k)

    Popular choices of s:

    – Delayed difference of exponentials

    – Delayed delta function

    • How do synapses work?

    • What are the mechanisms of synaptic plasticity on short time scales?
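The current-based description can be made concrete with a single-exponential kernel for s. A sketch (the efficacy J, the time constant, and the spike times are illustrative choices):

```python
# Current-based synaptic input with a single-exponential kernel:
# tau_s ds/dt = -s between spikes, with s -> s + 1 at each presynaptic
# spike time t_k, so I_syn(t) = J * s(t) is a sum of decaying jumps
# J exp(-(t - t_k)/tau_s). All parameter values are illustrative.
tau_s = 5.0                       # synaptic decay time constant (ms)
J = 0.3                           # synaptic efficacy
spike_times = [10.0, 12.0, 40.0]  # presynaptic spikes (ms)
dt, T = 0.1, 100.0

s = 0.0
I = []
for step in range(int(T / dt)):
    t = step * dt
    s += -dt / tau_s * s                            # exponential decay
    if any(abs(t - tk) < dt / 2 for tk in spike_times):
        s += 1.0                                    # jump at each spike
    I.append(J * s)

print(max(I))
```

The two spikes 2 ms apart sum almost linearly (the first has decayed only slightly when the second arrives), so the peak current is close to 2J; the isolated spike at 40 ms produces a peak of only J.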

  • Synaptic plasticity

    Synaptic efficacy can be modified in various ways:

    • Spike timing (STDP experiments)

    • Firing rate (BCM, Sjostrom et al, etc)

    • Post-synaptic V (pairing, etc)

    • Can we capture this experimental data using simplified ‘plasticity rules’?

    • What are the mechanisms of induction of synaptic plasticity?
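One standard simplified plasticity rule of the kind alluded to above is pairwise STDP, in which the weight change depends exponentially on the pre/post spike-time lag. A sketch with textbook-style illustrative parameters (not values from the slides):

```python
import numpy as np

# Minimal pairwise STDP rule: a presynaptic spike shortly before a
# postsynaptic spike potentiates the synapse (LTP); the reverse order
# depresses it (LTD). Amplitudes and time constants are illustrative.
A_plus, A_minus = 0.01, 0.012    # LTP/LTD amplitudes
tau_plus = tau_minus = 20.0      # STDP time constants (ms)

def dw(delta_t):
    """Weight change for delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:              # pre before post -> potentiation
        return A_plus * np.exp(-delta_t / tau_plus)
    else:                        # post before pre -> depression
        return -A_minus * np.exp(delta_t / tau_minus)

print(round(dw(10.0), 4), round(dw(-10.0), 4))  # → 0.0061 -0.0073
```

The slight asymmetry A_minus > A_plus is a common choice that makes uncorrelated pre/post firing depressing on average.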

  • Networks: architecture

    [Figure: network architecture at large scale and at short (local) scale]

    • How can we model networks at various scales?

  • Networks: dynamics

    • Rate model (neural mass model): describe the activity of a whole population of neurons by a single ‘average firing rate’ variable m(x, t);

    • Networks of spiking neurons: describe the activity of a population of N neurons by O(N) coupled differential equations, coupled through the network connectivity matrix.
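The rate-model description can be sketched in a few lines: integrate τ dm/dt = −m + φ(W m + I_ext) for a small set of populations. The coupling matrix, external input, time constant, and sigmoidal gain function below are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Euler integration of a two-population rate model
#   tau dm/dt = -m + phi(W m + I_ext),
# where phi is a sigmoidal population gain function.
def phi(x):
    return 1.0 / (1.0 + np.exp(-x))   # sigmoidal f-I curve

tau = 10.0                            # population time constant (ms)
W = np.array([[0.5, -1.0],
              [1.2,  0.0]])           # illustrative coupling matrix
I_ext = np.array([1.0, 0.5])          # external input
dt, T = 0.1, 500.0

m = np.zeros(2)
for _ in range(int(T / dt)):
    m += dt / tau * (-m + phi(W @ m + I_ext))

print(m.round(3))                     # steady-state population rates
```

After 50 time constants the network has relaxed to a fixed point satisfying m* = φ(W m* + I_ext), the object whose stability the dynamical-systems tools above are designed to analyze.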

  • Coding

    Tools from information theory:

    • Shannon mutual information

    • Fisher information

    • What carries information in single-neuron spike trains / population activity?

    • How much information is contained in single-neuron / population responses?

    • What is the optimal way of transmitting information about a given input, given its statistics and other constraints? Are neural systems optimal/close to optimal at transmitting information?
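Shannon mutual information is straightforward to compute for a discrete stimulus/response table: I(S;R) = ∑_{s,r} p(s,r) log₂ [p(s,r) / (p(s) p(r))]. A sketch (the joint distributions below are illustrative binary channels, not data from the course):

```python
import numpy as np

# Shannon mutual information of a discrete joint distribution p(s, r),
# with stimuli indexing rows and responses indexing columns.
def mutual_information(p_joint):
    p_joint = np.asarray(p_joint, dtype=float)
    p_s = p_joint.sum(axis=1, keepdims=True)   # stimulus marginal
    p_r = p_joint.sum(axis=0, keepdims=True)   # response marginal
    nz = p_joint > 0                           # skip zero-probability cells
    return float(np.sum(p_joint[nz] * np.log2(p_joint[nz] / (p_s @ p_r)[nz])))

# Noiseless binary channel (response equals stimulus): 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0

# Independent stimulus and response: 0 bits.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The same estimator applied to measured spike-count distributions is one standard route to the "how much information" questions above.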

  • Learning and memory

    • How do external stimuli trigger changes of neuronal activity in neural circuits?

    • How do changes of activity trigger in turn changes in synaptic connectivity?

    • How do changes in synaptic connectivity in turn change the dynamics of neural circuits?

    • How much information can a single neuron/a population of neurons store?

    • What kinds of learning algorithms/rules allow optimal storage to be reached?
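One classical answer to the storage questions above is Hebbian pattern storage in a Hopfield-style binary network, whose capacity scales as ∼0.14 N patterns. A sketch (the network size, number of patterns, and seed are illustrative choices):

```python
import numpy as np

# Hebbian storage of P random binary patterns xi^mu in a Hopfield-style
# network: W = (1/N) sum_mu xi^mu (xi^mu)^T, recall by s -> sign(W s).
# Well below capacity (P << 0.14 N), each stored pattern is a fixed
# point of the dynamics. Sizes are illustrative.
rng = np.random.default_rng(2)
N, P = 500, 10                       # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))

W = (xi.T @ xi) / N                  # Hebbian weight matrix
np.fill_diagonal(W, 0.0)             # no self-coupling

recalled = np.sign(W @ xi[0])        # one synchronous update from pattern 0
overlap = (recalled * xi[0]).mean()  # overlap 1.0 = perfect recall
print(overlap)
```

With P/N = 0.02, the crosstalk from the other stored patterns is far too weak to flip any neuron, so pattern 0 is recovered essentially perfectly; pushing P toward 0.14 N makes recall degrade abruptly.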

  • Useful books

    • Tuckwell, “Introduction to Theoretical Neurobiology”, Vols. I & II (Cambridge U. Press, 1988)

    • Amit, “Modeling brain function: The world of attractor neural networks” (Cambridge U. Press, 1989)

    • Hertz, Krogh, and Palmer, “Introduction to the Theory of Neural Computation” (Addison-Wesley, 1991 - now from Perseus Book Group and Westview Press)

    • Rieke, Warland, de Ruyter van Steveninck, and Bialek, “Spikes: Exploring the Neural Code” (MIT Press, 1997)

    • Koch, “Biophysics of Computation: Information Processing in Single Neurons” (Oxford U. Press, 1999)

    • Dayan and Abbott, “Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems” (MIT Press, 2001)

    • Izhikevich, “Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting” (MIT Press, 2007)

    • Ermentrout and Terman, “Mathematical Foundations of Neuroscience” (Springer, 2010)

    • Gerstner, Kistler, Naud and Paninski, “Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition” (Cambridge U. Press, 2014)