
MEG/EEG Inverse problem and solutionsIn a Bayesian Framework

EEG/MEG SPM course, Bruxelles, 2011

Jérémie MattoutLyon Neuroscience Research Centre


With many thanks to Karl Friston, Christophe Phillips,Rik Henson, Jean Daunizeau

Talk’s Overview

• SPM rationale
- generative models
- probabilistic framework
- twofold inference: parameters & models

• EEG/MEG inverse problem and SPM solution(s)
- probabilistic generative models
- parameter inference and model comparison

Model: "measure, standard"; a representation or object that describes the functioning of a physical system or concept.

A model enables you to:
- simulate data
- estimate (non-observable) parameters
- predict future observations
- test hypotheses / compare models

Stimulations

Physiological Observations

Behavioural Observations

A word about generative models


MEG Observations (Y)

Auditory-Visual Stimulations (u)

Sources/Network (θ): Y = f(θ, u)

Model m: f, θ, u


Probabilistic / Bayesian framework

Probability of an event:
- is represented by real numbers
- conforms to intuition
- is consistent


• normalization: Σa p(a) = 1

• marginalization: p(a) = Σb p(a, b)

• conditioning (Bayes rule): p(a, b) = p(a|b) p(b) = p(b|a) p(a)
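As a sanity check, the three rules can be demonstrated on a small discrete joint distribution (the numbers below are arbitrary, chosen only for illustration):

```python
import numpy as np

# Arbitrary joint distribution p(a, b) over two binary variables,
# chosen only to illustrate the three rules above.
p_ab = np.array([[0.3, 0.1],
                 [0.2, 0.4]])  # rows index a, columns index b

# Normalization: the probabilities of all events sum to one.
assert np.isclose(p_ab.sum(), 1.0)

# Marginalization: p(a) = sum_b p(a, b)
p_a = p_ab.sum(axis=1)  # -> array([0.4, 0.6])

# Conditioning (Bayes rule): p(b|a) = p(a, b) / p(a)
p_b_given_a = p_ab / p_a[:, None]  # each row is a proper distribution over b
```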

Probabilistic modelling

MEG Observations (Y)

Auditory-Visual Stimulations (u)

Sources/Network (θ): Y = f(θ, u)

Model m: f, θ, u

Probabilistic modelling enables:
- formalizing our knowledge mathematically in a model m
- accounting for uncertainty
- making inference on both model parameters and models themselves

Bayes rule:

p(θ|Y, m) = p(Y|θ, m) p(θ|m) / p(Y|m)

Posterior = Likelihood × Prior / Marginal (or Evidence)

A toy example

MEG Observations (Y)

Y = Lθ + ε

Model m:

- one dipolar source with known position and orientation
- unknown amplitude θ

L: source gain vector
θ: source amplitude
ε: measurement noise

f is linear, with Gaussian distributions:

Likelihood: p(Y|θ, m) = N(Lθ, Cε)
Prior: p(θ|m) = N(0, Cθ)

A toy example

MEG Observations (Y)

Model m: Gaussian likelihood & prior, as above

Bayes rule gives the posterior:

p(θ|Y) = p(Y|θ) p(θ) / p(Y) ∝ p(Y|θ) p(θ)
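For this linear-Gaussian toy model the posterior is available in closed form. A minimal numerical sketch — the gain vector, variances, and true amplitude below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one dipole with known gain vector L over 5 sensors.
L = rng.normal(size=(5, 1))   # source gain vector (arbitrary here)
theta_true = 2.0              # amplitude used to simulate the data
sigma2 = 0.1                  # noise variance (assumed known)
tau2 = 10.0                   # prior variance on the amplitude

Y = L * theta_true + rng.normal(scale=np.sqrt(sigma2), size=(5, 1))

# Conjugate Gaussian posterior p(theta|Y) = N(mu, v):
v = float(1.0 / (L.T @ L / sigma2 + 1.0 / tau2))  # posterior variance
mu = float(v * (L.T @ Y) / sigma2)                # posterior mean
```

The posterior mean shrinks the least-squares estimate toward the prior mean (zero here), by an amount set by the ratio of noise variance to prior variance.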

Occam’s razor or principle of parsimony

Hypothesis testing: model comparison

p(θ|Y, m) = p(Y|θ, m) p(θ|m) / p(Y|m)

Evidence: p(Y|m) = ∫ p(Y|θ, m) p(θ|m) dθ

"Complexity should not be assumed without necessity"

[Figure: model evidence p(y|m) plotted over the space of all data sets y — a simpler model concentrates its evidence on fewer data sets.]

Bayes factor

[Figure: the evidences p(Y|H0) and p(Y|H1) over the space of all data sets Y.]

• define the null and the alternative hypothesis H (or model m) in terms of priors, e.g.:

H0: p(θ|H0) = 1 if θ = 0, and 0 otherwise
H1: p(θ|H1) = N(0, Σ)

• invert both generative models (obtain both model evidences)

• apply the decision rule, i.e.: if P(H1|Y) / P(H0|Y) ≥ 1, then reject H0

Hypothesis testing: model comparison
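A numerical sketch of this comparison for the linear-Gaussian case, where both model evidences are zero-mean Gaussians in Y. All numbers (gain vector, variances, true amplitude) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Compare H0 (theta = 0) against H1 (theta ~ N(0, tau2))
# for the linear model Y = L*theta + noise.
L = rng.normal(size=(8, 1))
sigma2, tau2 = 0.5, 4.0
theta_true = 1.5
Y = L * theta_true + rng.normal(scale=np.sqrt(sigma2), size=(8, 1))

def log_gauss(Y, Sigma):
    """Log density of Y under N(0, Sigma)."""
    n = len(Y)
    _, logdet = np.linalg.slogdet(Sigma)
    return float(-0.5 * (n * np.log(2 * np.pi) + logdet
                         + Y.T @ np.linalg.solve(Sigma, Y)))

# Model evidences: under each hypothesis, Y is zero-mean Gaussian.
log_ev_H0 = log_gauss(Y, sigma2 * np.eye(8))                   # p(Y|H0)
log_ev_H1 = log_gauss(Y, tau2 * L @ L.T + sigma2 * np.eye(8))  # p(Y|H1)

bayes_factor = np.exp(log_ev_H1 - log_ev_H0)  # > 1 favours H1
```

Because the data were simulated with a non-zero amplitude, the evidence for H1 should dominate here.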

Probabilistic framing

EEG/MEG inverse problem

forward computation → Likelihood & Prior: p(Y|θ, m) and p(θ|m)

inverse computation → Posterior & Evidence: p(θ|Y, m) and p(Y|m)

Distributed/Imaging model

EEG/MEG inverse problem

Likelihood

p(Y|J, λ, m) = N(LJ, Cε)

Parameters: (J, λ)
Hypothesis m: distributed (linear) model, gain matrix L, Gaussian distributions

Prior

p(J|m) = N(0, Cj)

Sensor level: noise covariance Cε (# sensors × # sensors), e.g. Cε = σ²I
Source level: prior covariance Cj (# sources × # sources), e.g.
- IID, Cj ∝ I (Minimum Norm)
- maximum smoothness (LORETA-like)
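Under these Gaussian assumptions the posterior mean of the sources has a closed form. A toy sketch of the IID (minimum-norm) case, with an invented random gain matrix standing in for a real lead field:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 10, 50

L = rng.normal(size=(n_sensors, n_sources))  # stand-in gain matrix
J_true = np.zeros((n_sources, 1))
J_true[5] = 1.0                              # one active source

C_e = 0.01 * np.eye(n_sensors)               # sensor noise covariance
C_j = np.eye(n_sources)                      # IID prior -> minimum norm

Y = L @ J_true + 0.1 * rng.normal(size=(n_sensors, 1))

# Posterior mean (the minimum-norm solution):
# J_hat = C_j L' (L C_j L' + C_e)^{-1} Y
J_hat = C_j @ L.T @ np.linalg.solve(L @ C_j @ L.T + C_e, Y)
```

With 10 sensors and 50 sources the problem is underdetermined, so the estimate is a spatially blurred version of the true source pattern — which is exactly why the choice of prior covariance matters.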

Incorporating Multiple Constraints

EEG/MEG inverse problem

Likelihood

p(Y|J, λ, m) = N(LJ, Cε)

Parameters: (J, λ, ...)
Hypothesis m: hierarchical model, operator L + covariance components Qk

Prior

p(J|m) = N(0, Cj), at the source (or sensor) level, with

Cj = Σk λk Qk

e.g. Multiple Sparse Priors (MSP), with log-normal hyperpriors on the hyperparameters: log λ ~ N(α, β)

Expectation Maximization (EM) / Restricted Maximum Likelihood (ReML) / Free-Energy optimization / Parametric Empirical Bayes (PEB)

Estimation procedure

Iterative scheme:

E-step: q̂(J) = argmax_q F(q, λ, m)
M-step: λ̂ = argmax_λ F(q̂, λ, m)

Free energy:

F = log p(Y|λ, m) − KL[q(J) ‖ p(J|Y, λ, m)]
  = ⟨log p(Y|J, λ, m)⟩_q − KL[q(J) ‖ p(J|m)]
  = accuracy − complexity
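A minimal EM sketch for estimating the two variance hyperparameters of a linear-Gaussian model — a toy stand-in for the ReML/free-energy schemes named above, not the SPM implementation (all dimensions and variances below are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear-Gaussian model Y = L J + E with unknown variance hyperparameters.
n, p = 20, 5
L = rng.normal(size=(n, p))
J_true = rng.normal(size=(p, 1))
Y = L @ J_true + rng.normal(scale=0.3, size=(n, 1))

s2, t2 = 1.0, 1.0  # initial noise and prior variances (hyperparameters)
for _ in range(100):
    # E-step: Gaussian posterior q(J) = N(mu, Sigma) at current hyperparameters
    Sigma = np.linalg.inv(L.T @ L / s2 + np.eye(p) / t2)
    mu = Sigma @ L.T @ Y / s2
    # M-step: update hyperparameters from expected sufficient statistics
    s2 = float((np.sum((Y - L @ mu) ** 2) + np.trace(L @ Sigma @ L.T)) / n)
    t2 = float((np.sum(mu ** 2) + np.trace(Sigma)) / p)
```

Each iteration increases the free energy F, and at convergence s2 and t2 approach the variances that generated the data.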

Model comparison based on the Free-energy

Estimation procedure

At convergence: ln p(Y|M) ≈ F(M) = accuracy(M) − complexity(M)

[Figure: free energy Fi compared across models Mi, i = 1, 2, 3.]

At the end of the day

Somesthesic data

- Pharmacoresistant epilepsy (surgery planning):
• symptoms
• PET + structural MRI
• SEEG

Could MEG replace, or at least complement and guide, SEEG?

Romain Bouet, Julien Jung, François Maugière

[Figure: 30 s seizure recording.]

120 patients: MEG proved highly informative in 85 of them.

Example: MEG - Epilepsy

Patient 1: model comparison

MEG (best model)

SEEG

Example: MEG - Epilepsy

Romain Bouet, Julien Jung, François Maugière

Patient 2 : estimated dynamics

time

SEEG: occipital lesion

Romain Bouet, Julien Jung, François Maugière

Example: MEG - Epilepsy

Conclusion

The SPM probabilistic inverse modelling approach makes it possible to:

• Estimate both parameters and hyperparameters from the data

• Incorporate multiple priors of different nature

• Estimate a full posterior distribution over model parameters

• Estimate an approximation to the log-evidence (the free-energy) which enables model comparison based on the same data

• Encompass multimodal fusion and group analysis gracefully

• Note that SPM also includes a flexible and convenient meshing tool, as well as beamforming solutions and a Bayesian ECD approach…

Thank you for your attention

Graphical representation

EEG/MEG inverse problem

Hierarchical linear model:

Y = LJ + E,  with E ~ N(0, Cε) and J ~ N(0, Cj)
Cε = Σi λi(ε) Qi(ε),  Cj = Σi λi(j) Qi(j)

[Graphical model: Y is data; J and the hyperparameters λ are variable; L and the components Qi are fixed.]

Fusion of different modalities

Y_MEG = L_MEG J + E_MEG,  Y_EEG = L_EEG J + E_EEG

Both modalities share the source prior J ~ N(0, Cj), with Cj = Σi λi(j) Qi(j), while each has its own noise covariance: Cε1 = Σi λi(ε1) Qi(ε1) for MEG, and Cε2 = Σi λi(ε2) Qi(ε2) for EEG.

Incorporating fMRI priors

Hypothesis testing: inference on parameters
Frequentist vs. Bayesian approach

classical inference (SPM):

• estimate the parameters (obtain a test statistic t = t(y))
• define the null, e.g. H0: θ = 0
• apply the decision rule, i.e.: if P(t > t*|H0) ≤ α, then reject H0

Bayesian inference (PPM):

• invert the model (obtain the posterior pdf p(θ|y))
• define the null, e.g. H0: θ > 0
• apply the decision rule, i.e.: if P(H0|y) ≥ α, then accept H0
