Measuring Functional Integration: Connectivity Analyses
Measuring Functional Integration: Connectivity Analyses
Roadmap to connectivity
Functional architecture of the brain?
Functional segregation:
Univariate analyses of regionally specific effects

Functional integration:
Multivariate analyses of regional interactions
Functional Connectivity:
"The temporal correlation between spatially remote neurophysiological events"
An operational/observational definition
Many possible reasons and mechanisms!

Effective Connectivity:
"The influence one neuronal system exerts upon others"
A mechanistic/model-based definition
Context and mechanism of specific connections?
Overview
Functional connectivity:
– SVD/PCA
– Eigenimage analysis
– Problems of eigenimage analysis
– Possible solutions: partial least squares, ManCova + Canonical Variates Analysis

Effective connectivity:
– Basic concepts – linear vs nonlinear models
– Regression-based models: PPI, SEM
– Modulatory influences at neuronal vs BOLD level

Limitations of the presented methods and outlook
Singular Value Decomposition

Aim:
– To extract the structure inherent in the covariance of a series of repeated measurements (e.g., several scans in multiple voxels)
– SVD is identical to Principal Component Analysis!
– Neuroimaging: which spatio-temporal patterns of activity explain most of the (co)variance in a timeseries?

Procedure: decomposition of a matrix Y (n x m) into:
– V (m x m): "Eigenimages" – SPACE: expression of m patterns in m voxels
– U (n x n): "Eigenvariates" – TIME: expression of n patterns in n scans
– S (n x m): "Singular Values" – IMPACT: variance the patterns account for (their squares are the "Eigenvalues"). Only components with Eigenvalues > 1 need to be considered!

The decomposition (and possible reconstruction) of Y:
[U, S, V] = SVD(Y)
Y = U S V^T
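The decomposition above can be sketched in a few lines of numpy (a toy illustration, not SPM code; the data matrix and its dimensions are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 128, 40                     # n scans, m voxels (toy dimensions)
Y = rng.standard_normal((n, m))    # hypothetical mean-corrected timeseries

# Full SVD: Y = U S V^T with U (n x n) eigenvariates, V (m x m) eigenimages
U, s, Vt = np.linalg.svd(Y, full_matrices=True)
S = np.zeros((n, m))
np.fill_diagonal(S, s)             # singular values on the diagonal

assert np.allclose(Y, U @ S @ Vt)  # exact reconstruction

# "Eigenvalues" are the squared singular values; variance accounted for:
explained = s**2 / np.sum(s**2)
```

`explained` gives the proportion of (co)variance each pattern accounts for, which is the quantity used to decide how many components to keep.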
Eigenimages

A time-series of 1D images: 128 scans of 40 "voxels"

Eigenvariates: expression of the first 3 eigenimages

Eigenvalues and spatial modes
Y = U S V^T = s1 U1 V1^T + s2 U2 V2^T + … (p < n!)

U: "Eigenvariates" – expression of p patterns in n scans
S: "Singular Values" or "Eigenvalues" (squared) – variance the p patterns account for
V: "Eigenimages" or "Spatial Modes" – expression of p patterns in m voxels
Data reduction: components explain less and less variance. Only components with Eigenvalue > 1 need to be included!
SVD: Data Reconstruction

[Figure: the data matrix Y (time x voxels) is approximated by a sum of rank-1 terms: Y ≈ s1 U1 V1^T + s2 U2 V2^T + …]
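The rank-by-rank reconstruction sketched in the figure can be checked numerically. A minimal numpy sketch with simulated low-rank data (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 128, 40
# Hypothetical data: one strong temporal pattern plus noise
pattern = np.sin(2 * np.pi * np.arange(n) / 16)
Y = np.outer(pattern, rng.standard_normal(m)) + 0.1 * rng.standard_normal((n, m))

U, s, Vt = np.linalg.svd(Y, full_matrices=False)

def reconstruct(p):
    """Rank-p approximation: s1*U1*V1^T + ... + sp*Up*Vp^T."""
    return (U[:, :p] * s[:p]) @ Vt[:p, :]

err1 = np.linalg.norm(Y - reconstruct(1))
err5 = np.linalg.norm(Y - reconstruct(5))
# Each additional component can only reduce the residual
```

With all min(n, m) components the reconstruction is exact; truncating after the dominant components is the data reduction described above.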
The time-series "reconstructed"
An example: PET of word generation

Word generation and repetition

PET data adjusted for effects of interest and smoothed with FWHM 16

Two "modes" extracted (Eigenvalue > 1); the first mode accounts for 64% of the variance

Spatial loadings (eigenimages): establish the anatomical interpretation (Broca, ACC, …)

Temporal loadings (eigenvariates): establish the functional interpretation (covariation with the experimental cycle)
Problems of Eigenimage analysis I

Data-driven method:
– Covariation of patterns with experimental conditions is not always dominant
– Functional interpretation is not always possible
Partial Least Squares

Data-driven method:
– Covariation of patterns with experimental conditions is not always dominant
– Functional interpretation is not always possible

Partial least squares:
– Apply SVD to the covariance of two independent matrices with a similar temporal structure:
  M_fmri: timeseries, n scans x m voxels
  M_design: design matrix, n scans x p conditions

  [U, S, V] = svd(M_fmri^T M_design)
  M_fmri^T M_design = U S V^T
  U^T M_fmri^T M_design V = S

– PLS identifies pairs of eigenimages that show maximum covariance
– Also applicable to timeseries from different regions/hemispheres!
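The PLS step above can be sketched with numpy (a toy illustration; the design and data matrices are simulated, not real fMRI):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, p = 128, 40, 2                       # scans, voxels, conditions
M_design = rng.standard_normal((n, p))     # hypothetical design matrix
# Hypothetical fMRI data partly driven by the design
M_fmri = M_design @ rng.standard_normal((p, m)) + 0.5 * rng.standard_normal((n, m))

# PLS: SVD of the (voxels x conditions) cross-covariance matrix
C = M_fmri.T @ M_design
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Paired patterns: U[:, k] (voxel side) and Vt[k, :] (condition side);
# projecting each matrix onto its pattern yields maximally covarying scores
proj_fmri = M_fmri @ U[:, 0]
proj_design = M_design @ Vt[0, :]
```

The singular values on the diagonal of U^T C V quantify the covariance captured by each pattern pair.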
Problems of Eigenimage analysis II

Data-driven method:
– Covariation of patterns with experimental conditions is not always dominant
– Functional interpretation is not always possible

No statistical inference:
– When is a pattern significantly expressed in relation to noise?
ManCova / Canonical Variates

No statistical inference:
– When is a pattern significantly expressed in relation to noise?

ManCova / Canonical Variates Analysis:
– A multivariate combination of SPM and eigenimage analysis
– Considers the expression of eigenimages, Y = US, as data (m x p)
– Multivariate inference about interactions of voxels (variables), not about one voxel
ManCova / Canonical Variates

No statistical inference:
– When is a pattern significantly expressed in relation to noise?

ManCova / Canonical Variates Analysis – procedure:
1. Take the expression of eigenimages, Y = US, as data (m x p)
2. ManCova is used to test effects of interest on Y (smoothness in Y):
   SS_t = SS_treat, SS_r = SS_res_model_with_treat, SS_o = SS_res_model_without_treat; λ = S_r / S_o
3. CVA returns a set of canonical variates (CVs): linear combinations of eigenimages that explain most of the variance due to effects of interest in relation to error. They are determined to maximise S_t / S_r via the generalised eigenimage solution (S_t / S_r) c = c e
   c: p x c matrix of canonical variates; e: c x c matrix of their eigenvalues. CVs can be projected into voxel space by C = Vc ("canonical images")
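The generalised eigenimage solution (S_t / S_r) c = c e can be sketched with numpy; the effect and residual SSP matrices here are random positive-definite stand-ins, not real ManCova output:

```python
import numpy as np

rng = np.random.default_rng(3)
p = 5                                  # number of retained components
# Hypothetical effect (treatment) and residual SSP matrices (positive definite)
A = rng.standard_normal((p, p)); St = A @ A.T + np.eye(p)
B = rng.standard_normal((p, p)); Sr = B @ B.T + np.eye(p)

# Generalised eigenproblem  St c = Sr c e,
# solved here as the eigendecomposition of inv(Sr) @ St
evals, c = np.linalg.eig(np.linalg.inv(Sr) @ St)
order = np.argsort(evals.real)[::-1]
evals, c = evals.real[order], c.real[:, order]

def ratio(w):
    """Effect-to-error variance ratio of a candidate variate."""
    return float(w @ St @ w) / float(w @ Sr @ w)

# The first canonical variate maximises the ratio; its value is the eigenvalue
```

Each column of `c` is a canonical variate; projecting it back with C = Vc gives the corresponding canonical image.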
Eigenimages vs Canonical Images

Eigenimage: captures the pattern that explains most overall variance
Canonical image: captures the pattern that explains most variance in relation to error
Problems of Eigenimage analysis: The End!

Data-driven method:
– Covariation of patterns with experimental conditions is not always dominant
– Functional interpretation is not always possible

No statistical model:
– When is a pattern truly expressed in relation to noise?

Patterns need to be orthogonal:
– Biologically implausible (interactions among systems and nonlinearities)

"Only" functional connectivity:
– What drives the pattern? Uni- or polysynaptic connections? Common input from ascending reticular systems? Cortico-thalamo-cortical connections? Or even nuisance effects?
Overview

Functional connectivity:
– SVD/PCA
– Eigenimage analysis
– Problems of eigenimage analysis
– Possible solutions: partial least squares, ManCova + Canonical Variates Analysis

Effective connectivity:
– Basic concepts – linear vs nonlinear models
– Regression-based models: PPI, SEM
– Modulatory influences at neuronal vs BOLD level

Limitations of the presented methods and a possible solution
Effective connectivity

The influence that one neural system exerts over another
– How is this affected by experimental manipulations?

Considers the brain as a physically interconnected system

Requires:
– an anatomical model of which regions are connected, and
– a mathematical model of how the different regions interact
A mathematical model – but which one? Example: a linear time-invariant system

[Figure: two regions x1 and x2, driven by inputs u1 and u2 via weights c11 and c22; intrinsic connections a11, a12, a21, a22]

dx/dt = Ax + Cu

A – intrinsic connectivity
C – inputs

Linear behaviour – inputs cannot influence the intrinsic connection strengths

e.g. state of region x1: dx1/dt = a11 x1 + a21 x2 + c11 u1
A mathematical model – a better one? Bilinear effects to approximate non-linear behaviour

dx/dt = Ax + Bxu + Cu

Bilinear term – the product of two variables (regional activity and input)

A – intrinsic connectivity
B – induced connectivity
C – driving inputs

State of region x1: dx1/dt = a11 x1 + a21 x2 + b21^2 u2 x2 + c11 u1

[Figure: the same two-region system, with b21^2 marking the modulation of the a21 connection by input u2]
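The bilinear state equation can be simulated directly with forward Euler integration. The coupling values below are hypothetical, chosen only to illustrate the equations: switching u2 on strengthens the x2 -> x1 influence and shifts the steady state.

```python
import numpy as np

# Forward-Euler simulation of  dx/dt = A x + u2 (B2 x) + C u
# (all coupling values are hypothetical)
A = np.array([[-1.0, 0.4],
              [0.4, -1.0]])        # intrinsic connectivity (decay + coupling)
B2 = np.array([[0.0, 0.6],
               [0.0, 0.0]])        # u2 modulates the x2 -> x1 connection (b21)
C = np.array([[1.0, 0.0],
              [0.0, 0.0]])         # u1 drives region x1 (weight c11)

dt, T = 0.01, 1000
x = np.zeros(2)
trace = []
for k in range(T):
    u = np.array([1.0, 1.0 if k >= T // 2 else 0.0])  # u2 switches on halfway
    dx = A @ x + u[1] * (B2 @ x) + C @ u
    x = x + dt * dx
    trace.append(x.copy())
trace = np.array(trace)
# After u2 comes on, the x2 -> x1 influence is stronger and x1 settles higher
```

With u2 = 0 the system is purely linear (the B term vanishes), which recovers the model on the previous slide.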
Linear regression models of connectivity: PPI study of attention to motion

4 experimental conditions:
F – fixation point only
A – motion stimuli with attention (detect changes)
N – motion stimuli without attention
S – static display

[Figure: attention modulating the V1 -> V5 connection]

Hypothesis: attentional modulation of V1 -> V5 connections
Linear regression models of connectivity: PPI study of attention to motion

H0: beta_PPI = 0

This corresponds to a test for a difference in regression slopes between the attention and no-attention conditions

The bilinear term enters as a regressor in the design matrix
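A minimal PPI regression sketch with simulated time series (the V1/V5 data, block structure, and coefficient values are all hypothetical, not the actual study data):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 360
v1 = rng.standard_normal(n)               # hypothetical V1 time series
attention = np.repeat([0.0, 1.0], n // 2) # psychological variable (N vs A blocks)
# Simulated V5: the V1 slope increases under attention (true beta_PPI = 0.5)
v5 = 0.4 * v1 + 0.5 * attention * v1 + 0.2 * attention + 0.1 * rng.standard_normal(n)

# PPI design: constant, physiological (V1), psychological (attention),
# and their product - the psychophysiological interaction term
X = np.column_stack([np.ones(n), v1, attention, v1 * attention])
beta, *_ = np.linalg.lstsq(X, v5, rcond=None)
beta_ppi = float(beta[3])
# Testing H0: beta_PPI = 0 asks whether the V1 -> V5 slope differs between conditions
```

The interaction regressor is exactly the bilinear term of the previous slides, entered alongside the main effects it is built from.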
Linear regression models of connectivity: Structural equation modelling (SEM)

[Figure: three regions y1, y2, y3 with paths b12, b13, b32 and residuals z1, z2, z3]

[y1 y2 y3] = [y1 y2 y3] B + [z1 z2 z3], with

B = | 0  b12  b13 |
    | 0   0    0  |
    | 0  b32   0  |

y – time series
b – path coefficients
z – residuals (independent)

Minimises the difference between the observed and the implied covariance structure
Limits on the number of connections (only paths of interest)
No designed input – but modulatory effects can enter by including bilinear terms as in PPI
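A toy sketch of the path model above with simulated data. Full SEM fits the path coefficients by matching the implied covariance to the observed one; for this simple recursive model with independent residuals the estimates coincide with equation-wise least squares, which keeps the sketch short (all coefficient values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000
b12, b13, b32 = 0.7, 0.5, 0.4       # hypothetical true path coefficients
# Simulate the recursive model implied by B: y1 -> y3, y1 -> y2, y3 -> y2
y1 = rng.standard_normal(n)                               # = z1
y3 = b13 * y1 + 0.3 * rng.standard_normal(n)              # + z3
y2 = b12 * y1 + b32 * y3 + 0.3 * rng.standard_normal(n)   # + z2

# Equation-wise least squares (a simplification of the full covariance fit):
# y2 = b12*y1 + b32*y3 + z2  and  y3 = b13*y1 + z3
(b12_hat, b32_hat), *_ = np.linalg.lstsq(np.column_stack([y1, y3]), y2, rcond=None)
b13_hat = float(y1 @ y3) / float(y1 @ y1)
```

Only the paths of interest (the free entries of B) are estimated; all other connections stay fixed at zero.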
Linear regression models of connectivity: Inference in SEM – comparing nested models

H0: b35 = 0

Different models are compared that either include or exclude a specific connection of interest

Goodness of fit is compared between the full and the reduced model using a χ² statistic

Example from the attention-to-motion study: modulatory influence of PFC on V5 -> PPC connections
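The nested-model comparison can be illustrated with a likelihood-ratio statistic as a stand-in for the SEM χ² fit comparison (simulated data, hypothetical coefficients; a real SEM comparison would use the covariance-based fit function):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
# Simulated data in which the tested path (y3 -> y2) really exists
y1 = rng.standard_normal(n)
y3 = 0.5 * y1 + 0.3 * rng.standard_normal(n)
y2 = 0.7 * y1 + 0.4 * y3 + 0.3 * rng.standard_normal(n)

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

rss_full = rss(np.column_stack([y1, y3]), y2)   # model including the path
rss_red = rss(y1[:, None], y2)                  # path constrained to zero

# Likelihood-ratio statistic; under H0 asymptotically chi2 with df = 1
lr = n * np.log(rss_red / rss_full)
significant = lr > 3.84      # chi2(1) critical value at alpha = 0.05
```

The degrees of freedom equal the number of constrained paths; here one connection is dropped, so df = 1.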
Modulatory interactions at the BOLD versus the neuronal level

The HRF acts as a low-pass filter – especially important in high-frequency (event-related) designs

Conclusion: either use blocked designs or apply hemodynamic deconvolution to the BOLD time series – incorporated in SPM2

Gitelman et al. 2003
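The order of operations matters: multiplying two signals at the neuronal level and then convolving with the HRF is not the same as multiplying the already-convolved (BOLD-level) signals. A toy demonstration with a made-up gamma-shaped kernel (not SPM's canonical HRF):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
t = np.arange(20)
hrf = t**3 * np.exp(-t)            # made-up gamma-like low-pass kernel
hrf = hrf / hrf.sum()

x1 = rng.standard_normal(n)        # hypothetical neuronal signal
u = np.repeat([0.0, 1.0], n // 2)  # psychological factor (off/on blocks)

# Correct: form the interaction at the neuronal level, then convolve
ppi_neuronal = np.convolve(x1 * u, hrf)[:n]
# Naive: convolve first, then multiply at the BOLD level
ppi_bold = np.convolve(x1, hrf)[:n] * u

discrepancy = float(np.linalg.norm(ppi_neuronal - ppi_bold))
# discrepancy > 0: the two regressors differ, which is why deconvolving the
# BOLD time series before forming the interaction (Gitelman et al. 2003) matters
```

The discrepancy is largest around condition transitions, i.e. exactly in the high-frequency part of the design that the HRF filters most.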
Outlook – DCM

Developed to account for different shortcomings of the presented methods

State-space model – experimentally designed effects drive the system or have modulatory effects

Should allow testing more complex models than SEM

Incorporates a forward model of neuronal -> BOLD activity

More next week!