Connectomics: Parcellations and Network Analysis Methods

DESCRIPTION
A short tutorial on methods for functional-connectome analysis: learning regions, extracting functional signals, inferring the network structure, and comparing it across subjects.

TRANSCRIPT
Connectomics: Parcellation & Network Analysis Methods
Gaël Varoquaux, INRIA, Parietal – Neurospin
Learning objectives
– Choosing regions for connectivity analysis
– Extraction of the network structure
– Inter-subject comparison of network structures
[Varoquaux & Craddock, NeuroImage 2013]
Declaration of Relevant Financial Interests or Relationships
Speaker Name: Gaël Varoquaux
I have no relevant financial interest or relationship to disclose with regard to the subject matter of this presentation.
ISMRM 20th Annual Meeting & Exhibition
“Adapting MR in a Changing World”
Functional connectivity and connectomics
Fluctuations in functional-imaging signals capture brain interactions
Many pathologies are expressed by modified brain interactions
Need quantitative tools to develop biomarkers
Connectome based on regions, to reduce the number of connections studied
G Varoquaux 3
Connectomics: Problem setting and vocabulary
Infer and compare connections between a set of regions
Graph: a set of nodes and connections. Weighted or not. Directed or not. Can be represented by an adjacency matrix.
Connectomics: an outline
1 Functional parcellations
2 Signal extraction
3 Connectivity graphs
4 Comparing connectomes
1 Functional parcellations
Defining regions for connectomics
1 Need for functional parcellations
Anatomical atlases do not resolve functional structures
Harvard–Oxford; AAL
1 Clustering
Group together voxels with similar time courses
Considerations: – Spatial constraints – Number of regions – Running time
1 Clustering
Normalized cuts: downloadable atlas; with many parcels, becomes a regular paving
Ward clustering: good with many parcels; very fast; Python implementation: http://nisl.github.io
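The Ward approach above can be sketched with scikit-learn (rather than the nisl package linked on the slide). The toy "fMRI" data, grid size, and cluster count here are all illustrative:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.image import grid_to_graph

# Toy data: a 10x10 grid of voxels, 50 time points each,
# with two spatial blocks sharing distinct time courses.
rng = np.random.RandomState(0)
signals = np.zeros((10, 10, 50))
signals[:, :5] = rng.randn(50)           # left block: one shared time course
signals[:, 5:] = rng.randn(50)           # right block: another time course
signals += 0.1 * rng.randn(10, 10, 50)   # voxel-level noise

# Spatial constraint: only neighbouring voxels may be merged,
# which keeps the resulting parcels spatially contiguous.
connectivity = grid_to_graph(10, 10)

ward = AgglomerativeClustering(n_clusters=2, linkage="ward",
                               connectivity=connectivity)
labels = ward.fit_predict(signals.reshape(100, 50))
print(labels.reshape(10, 10))
```

With the connectivity constraint, the two recovered parcels coincide with the two spatial blocks.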
1 Linear decomposition models
Cognitive networks are present at rest
Time courses: Language, Audio, Visual, Dorsal Att., Motor, Salience, Ventral Att., Parietal
We observe a mixture
Need to unmix the networks
1 Linear decomposition models
Independent Component Analysis: extracts networks; downloadable atlas [Smith 2009]
Sparse dictionary learning: networks outlined cleanly; bleeding edge; atlas on request
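The unmixing idea can be illustrated with scikit-learn's FastICA on synthetic data; the two "network" time courses and the mixing matrix below are illustrative stand-ins for resting-state sources:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic "network" time courses (a sinusoid and a square wave),
# mixed into 10 observed signals plus noise:
rng = np.random.RandomState(0)
t = np.linspace(0, 8 * np.pi, 2000)
sources = np.c_[np.sin(t), np.sign(np.sin(3 * t))]
observed = sources @ rng.randn(2, 10) + 0.05 * rng.randn(2000, 10)

# ICA unmixes the observed signals into independent components
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)   # shape (2000, 2)

# Up to sign and order, each component matches one of the sources
corr = np.corrcoef(recovered.T, sources.T)[:2, 2:]
print(np.abs(corr).round(2))
```

Each row of the absolute cross-correlation matrix has one entry near 1: the sources are recovered up to permutation and sign, which is the usual ICA ambiguity.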
2 Signal extraction
Enforce specificity to neural signal
2 Choice of regions
Too many regions gives a harder statistical problem ⇒ ∼30 ROIs for group-difference analysis
Nearly-overlapping regions will mix signals
Avoid too-small regions ⇒ ∼10 mm radius
Capture the different functional networks
Automatic parcellations do not solve everything
2 Time-series extraction
Extract the ROI-average signal: a weighted mean, with weights given by grey-matter probability
Regress out confounds:
– movement parameters
– CSF and white-matter signals
– CompCor: data-driven noise identification [Behzadi 2007]
– global mean... an overhyped discussion (see later)
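Confound removal is an ordinary least-squares regression followed by taking residuals. A minimal numpy sketch, with illustrative "motion parameter" confounds:

```python
import numpy as np

def regress_out(signals, confounds):
    """Remove the least-squares fit of the confounds from each signal."""
    design = np.column_stack([confounds, np.ones(len(confounds))])  # + intercept
    beta, *_ = np.linalg.lstsq(design, signals, rcond=None)
    return signals - design @ beta

rng = np.random.RandomState(0)
motion = rng.randn(100, 6)                           # e.g. 6 movement parameters
roi = rng.randn(100, 3) + motion @ rng.randn(6, 3)   # ROI signals + motion artefact
clean = regress_out(roi, motion)

# After regression, the cleaned signals are orthogonal to the confounds
print(np.abs(motion.T @ clean).max())
```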
3 Connectivity graphs
From correlations to connections
Functional connectivity: correlation-based statistics
3 Correlation, covariance
For x and y centered:
covariance: cov(x, y) = (1/n) Σ_i x_i y_i
correlation: cor(x, y) = cov(x, y) / (std(x) std(y))
Correlation is normalized: cor(x, y) ∈ [−1, 1]; it quantifies the linear dependence between x and y
Correlation matrix: functional-connectivity graphs [Bullmore 1996, Achard 2006, ...]
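The formulas above translate directly to numpy; the random time series stand in for region signals:

```python
import numpy as np

rng = np.random.RandomState(0)
ts = rng.randn(200, 5)            # 200 time points, 5 regions
ts -= ts.mean(axis=0)             # center each region's signal

cov = ts.T @ ts / len(ts)                   # cov(x, y) = (1/n) sum_i x_i y_i
std = np.sqrt(np.diag(cov))
cor = cov / np.outer(std, std)              # cor = cov / (std(x) std(y))

print(np.allclose(cor, np.corrcoef(ts.T)))  # matches numpy's built-in: True
```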
3 Partial correlation
Remove the effect of z by regressing it out: x/z = residuals of the regression of x on z
In a set of p signals:
partial correlation: cor(x_i/Z, x_j/Z), Z = {x_k, k ≠ i, j}
partial variance: var(x_i/Z), Z = {x_k, k ≠ i}
Partial correlation matrix [Marrelec 2006, Fransson 2008, ...]
3 Inverse covariance
K = matrix inverse of the covariance matrix
On the diagonal: partial variance
Off diagonal: scaled partial correlation
K_ij = −cor(x_i/Z, x_j/Z) / (std(x_i/Z) std(x_j/Z))
Inverse covariance matrix [Smith 2011, Varoquaux NIPS 2010, ...]
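A numpy sketch of this relation, on a toy chain of regions where region 0 drives region 1, which drives region 2 (so the 0–2 link is purely indirect):

```python
import numpy as np

rng = np.random.RandomState(0)
ts = rng.randn(500, 4)
ts[:, 1] += ts[:, 0]       # region 1 driven by region 0
ts[:, 2] += ts[:, 1]       # region 2 driven by region 1 (only indirectly by 0)

K = np.linalg.inv(np.cov(ts.T))          # precision (inverse covariance) matrix
d = np.sqrt(np.diag(K))
partial_cor = -K / np.outer(d, d)        # off-diagonal: partial correlations
np.fill_diagonal(partial_cor, 1.0)

# The 0-2 correlation is large (indirect path through 1),
# but the 0-2 *partial* correlation is near zero:
print(np.corrcoef(ts.T)[0, 2].round(2), partial_cor[0, 2].round(2))
```

This is the sense in which partial correlations isolate direct effects: the indirect 0–2 dependence disappears once region 1 is conditioned on.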
3 Summary: observations and indirect effects
[Figure: two 5-node graphs]
Observations: correlation; covariance: scaled by variance
Direct connections: partial correlation; inverse covariance: scaled by partial variance
Global signal regression: matters less for partial correlations, but is unspecific and can make the covariance matrix ill-conditioned
3 Inverse covariance and graphical models
Gaussian graphical models: zeros in the inverse covariance give conditional independence
(Σ⁻¹)_ij = 0 ⇔ x_i, x_j independent conditionally on {x_k, k ≠ i, j}
Robust to the Gaussian assumption
3 Partial correlation matrix estimation
p nodes, n observations (e.g. fMRI volumes)
If not n ≫ p², ambiguities (multicollinearity)
[Figure: several 3-node graphs compatible with the same observations]
Thresholding partial correlations does not recover the ground-truth independence structure
3 Inverse covariance matrix estimation
Sparse inverse covariance estimators: independence between nodes makes estimation of the partial correlations easier
[Figure: independence structure + connectivity values → joint estimation]
Group-sparse inverse covariance: learn different connectomes with the same independence structure [Varoquaux, NIPS 2010]
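For a single subject, sparse inverse covariance estimation is available as the graphical lasso in scikit-learn (the group-sparse, multi-subject variant of [Varoquaux NIPS 2010] is not in scikit-learn); the chain-graph ground truth below is illustrative:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.RandomState(0)
p = 10
# Sparse ground-truth precision: a chain graph
K_true = 2 * np.eye(p) - np.eye(p, k=1) - np.eye(p, k=-1)
x = rng.multivariate_normal(np.zeros(p), np.linalg.inv(K_true), size=200)

# The graphical lasso puts an l1 penalty on the precision matrix,
# driving many of its entries to zero
model = GraphicalLassoCV().fit(x)
K_hat = model.precision_

# Near-zero entries in the estimated precision suggest
# conditional independence between the corresponding nodes
print((np.abs(K_hat) > 1e-4).astype(int))
```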
4 Comparing connectomes
Detecting and localizing differences
Edge-level tests; network-level tests
4 Pair-wise tests on correlations
Correlations ∈ [−1, 1] ⇒ cannot apply Gaussian statistics, e.g. T tests
Z-transform: Z = arctanh(cor) = (1/2) ln((1 + cor)/(1 − cor))
Z(cor) is normally distributed: for n observations, the estimated Z(cor) ∼ N(Z(cor), 1/√n)
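A sketch of the Z-transform in a two-group comparison, with simulated "subjects" (group sizes, scan length, and true correlations are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.RandomState(0)

def sample_cor(true_cor, n):
    """One subject's sample correlation for a given population correlation."""
    cov = [[1.0, true_cor], [true_cor, 1.0]]
    x = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return np.corrcoef(x.T)[0, 1]

# Two groups of "subjects" whose true correlation differs on this edge
group_a = np.arctanh([sample_cor(0.3, 100) for _ in range(40)])
group_b = np.arctanh([sample_cor(0.6, 100) for _ in range(40)])

# After the Z-transform the values are approximately Gaussian,
# so a standard two-sample t-test applies
t, pval = stats.ttest_ind(group_a, group_b)
print(t, pval)
```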
4 Indirect effects: to partial or not to partial?
[Figure: correlation matrices (26 regions) for three controls and one patient with a large lesion; partial correlation matrices for the same subjects]
Spread-out variability in the correlation matrices
Noise in the partial correlations
Strong dependence between coefficients [Varoquaux MICCAI 2010]
4 Indirect effects versus noise: a trade-off
[Figure: correlation matrices, partial correlation matrices, and tangent-space residuals (26 regions) for three controls and one patient with a large lesion]
Tangent-space residuals [Varoquaux MICCAI 2010]
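A minimal sketch of the tangent-space residual, following the whitened matrix-logarithm construction of [Varoquaux MICCAI 2010], using scipy's matrix functions; the 2×2 covariances are illustrative, and in practice the reference point is a group mean:

```python
import numpy as np
from scipy.linalg import inv, logm, sqrtm

def tangent_residual(cov, cov_ref):
    """Matrix logarithm of `cov` whitened by the reference covariance."""
    w = inv(sqrtm(cov_ref))                 # whitening by the reference point
    return np.real_if_close(logm(w @ cov @ w))

ref = np.array([[2.0, 0.5], [0.5, 1.0]])
other = np.array([[1.5, 0.2], [0.2, 1.2]])

# The residual vanishes at the reference point itself:
print(np.abs(tangent_residual(ref, ref)).max())
print(tangent_residual(other, ref).round(3))
```

The residual lives in a vector space (symmetric matrices), where subject-to-subject variability is better behaved than on raw correlation or partial-correlation matrices.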
[Figure: edge-wise difference matrices (26 regions)]
Edge-level tests: localization is hard (non-local effects)
Multiple testing kills performance
[Figure: difference matrices with block structure (26 regions)]
Network-level tests: nodes cluster together to form networks
4 Network-level metrics
Network-wide activity: quantify the amount of signal in Σ_network
Determinant |Σ_network| = generalized variance
Network integration: log |Σ_A|
Cross-talk between networks A and B: mutual information = log |Σ_A| + log |Σ_B| − log |Σ_AB|
[Marrelec 2008, Varoquaux NIPS 2010]
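These determinant-based metrics are a few lines of numpy via the log-determinant (the 1/2 factor of the Gaussian mutual information is omitted, as above); the toy networks and the cross-talk pattern are illustrative:

```python
import numpy as np

rng = np.random.RandomState(0)
x = rng.randn(500, 5)
x[:, 3] += x[:, 0]        # cross-talk: a node of B follows a node of A
sigma = np.cov(x.T)

A, B = slice(0, 3), slice(3, 5)

def logdet(m):
    """Log-determinant, i.e. log of the generalized variance."""
    return np.linalg.slogdet(m)[1]

# Mutual information between networks A and B
mi = logdet(sigma[A, A]) + logdet(sigma[B, B]) - logdet(sigma)
print(round(mi, 2))
```

With no cross-talk the two block determinants would multiply to the full determinant and the mutual information would be near zero; the injected dependence makes it clearly positive.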
4 Pitfalls when comparing connectomes
Missing nodes
Very correlated nodes: e.g. nearly-overlapping regions
Hub nodes give noisier partial correlations
Practical connectomics: take-home messages
Choose functionally-relevant regions
Regress confounds out of the signals
Use partial correlations to isolate direct effects
Networks are interesting units for comparison
http://gael-varoquaux.info [NeuroImage 2013]
References (not exhaustive)
[Achard 2006] A resilient, low-frequency, small-world human brain functional network with highly connected association cortical hubs, J Neurosci
[Behzadi 2007] A component based noise correction method (CompCor) for BOLD and perfusion based fMRI, NeuroImage
[Bullmore 2009] Complex brain networks: graph theoretical analysis of structural and functional systems, Nat Rev Neurosci
[Craddock 2011] A Whole Brain fMRI Atlas Generated via Spatially Constrained Spectral Clustering, Hum Brain Mapp
[Fransson 2008] The precuneus/posterior cingulate cortex plays a pivotal role in the default mode network: Evidence from a partial correlation network analysis, NeuroImage
[Marrelec 2006] Partial correlation for functional brain interactivity investigation in functional MRI, NeuroImage
[Marrelec 2008] Regions, systems, and the brain: hierarchical measures of functional integration in fMRI, Med Im Analys
[Smith 2009] Correspondence of the brain's functional architecture during activation and rest, PNAS
[Smith 2010] Network Modelling Methods for fMRI, NeuroImage
[Varoquaux MICCAI 2010] Detection of brain functional-connectivity difference in post-stroke patients using group-level covariance modeling, Med Imag Proc Comp Aided Intervention
[Varoquaux NIPS 2010] Brain covariance selection: better individual functional connectivity models using population prior, Neural Inf Proc Sys
[Varoquaux 2011] Multi-subject dictionary learning to segment an atlas of brain spontaneous activity, IPMI
[Varoquaux 2012] Markov models for fMRI correlation structure: is brain functional connectivity small world, or decomposable into networks?, J Physio Paris
[Varoquaux 2013] Learning and comparing functional connectomes across subjects, NeuroImage