
Parameterization-Invariant Shape Statistics and Probabilistic Classification of Anatomical Surfaces

Sebastian Kurtek1, Eric Klassen2, Zhaohua Ding3, Malcolm J. Avison3, Anuj Srivastava1

1 Department of Statistics, Florida State University, Tallahassee, FL, 2 Department of Mathematics, Florida State University, Tallahassee, FL,

3 Institute of Imaging Science, Vanderbilt University, Nashville, TN

Abstract. We consider the task of computing shape statistics and classification of 3D anatomical structures (as continuous, parameterized surfaces) under a Riemannian framework. This task requires a Riemannian metric that allows: (1) re-parameterizations of surfaces by isometries, and (2) efficient computations of geodesic paths between surfaces. These tools allow for computing Karcher means and covariances (using tangent PCA) for shape classes, and a probabilistic classification of surfaces into disease and control classes. In a separate paper [13], we introduced a mathematical representation of surfaces, called q-maps, and we used the L² metric on the space of q-maps to induce a Riemannian metric on the space of parameterized surfaces. We also developed a path-straightening algorithm for computing geodesic paths [14]. This process requires optimal re-parameterizations (deformations of grids) of surfaces and achieves a superior alignment of geometric features across surfaces. The resulting means and covariances are better representatives of the original data and lead to parsimonious shape models. These two moments specify a normal probability model on shape classes, which is then used for classifying test shapes. Through improved random sampling and higher classification performance, we demonstrate the success of this model over some past methods. In addition to toy objects, we use the Detroit Fetal Alcohol and Drug Exposure Cohort data to study brain structures and present classification results for the Attention Deficit Hyperactivity Disorder cases and controls in this study. We find that using the mean and covariance structure of the given data, we are able to attain an 88% classification rate, an improvement over a previously reported result of 82% on the same data.

Keywords: Riemannian framework, parameterization invariance, shape statistics and models, classification, anatomical structures, ADHD

1 Introduction

Shape is an important feature of anatomical objects and can be immensely useful in characterizing objects for monitoring a subject's health. Studying shapes of 3D anatomical structures in the brain is of particular interest because many diseases can potentially be linked to alterations of these shapes. In this paper we focus on shape analysis of parametrized surfaces of anatomical objects, using a Riemannian framework that allows comparison, matching, deformation, averaging, modeling, and classification of observed shapes.


There have been many different representations of surfaces. Several groups have proposed methods for studying the shapes of surfaces by embedding them in volumes and deforming these volumes [9], [11]. While these methods are both prominent and pioneering in medical image analysis, they are typically computationally expensive. An alternative approach is based on manually-generated landmarks under the Kendall shape theory [7] and active shape models [5]. Others study 3D shape variabilities using level sets [15], curvature flows [10], or point cloud matching via the iterative closest point algorithm [1]. Also, there has been remarkable success in the use of medial representations for shape analysis, especially in medical image analysis; see e.g. [2], [8].

However, the most natural representation for studying shapes of 3D objects seems to be their boundary. In the case of parameterized surfaces, there is the additional issue of handling the parameterization variability. Some papers, e.g. those using SPHARM [3] or SPHARM-PDM [18], tackle this problem by choosing a fixed parameterization. A large set of papers in the literature treats parameterization (or registration) as a pre-processing step [20], [4], [6]. In other words, they take a set of surfaces and use some energy function, such as the entropy or the minimum description length, to register points across surfaces. Once the surfaces are registered, they are compared using standard procedures. There are several fundamental problems with this approach. First, because the registration procedure is based on ensembles, the distance between any two shapes depends on the other shapes in the ensemble. Second, the registration and comparison steps are typically disjoint and performed under different metrics. This lacks the formalism needed to define proper distances and leads to sub-optimal registrations.

To the best of our knowledge, there are very few techniques in the literature on a Riemannian shape analysis of parameterized surfaces that can provide geodesic paths and be invariant to re-parameterization. To elaborate on this approach, let f1 and f2 denote two surfaces; f1 and f2 are elements of an appropriate space F, which is made precise later, and let ⟨⟨·, ·⟩⟩ be the chosen Riemannian metric on F. Then, under certain conditions, the geodesic distance between the shapes of f1 and f2 is given by:

min_{γ, O} ( min_{F : [0,1] → F, F(0) = f1, F(1) = O(f2 ∘ γ)} ∫_0^1 ⟨⟨F_t(t), F_t(t)⟩⟩^{1/2} dt ) .   (1)

(This assumes that translation and scaling variability has already been removed.) Here F(t) is a parameterized path in F, and the quantity L[F] = ∫_0^1 ⟨⟨F_t(t), F_t(t)⟩⟩^{1/2} dt denotes the length of F. The minimization inside the brackets, thus, denotes the problem of finding a geodesic path (locally the shortest path) between the surfaces f1 and O(f2 ∘ γ), where O and γ stand for an arbitrary rotation and re-parameterization of f2, respectively. The minimization outside the brackets seeks the optimal rotation and re-parameterization of the second surface so as to best match it with the first surface. In simple words, the outside optimization solves the registration problem while the inside optimization solves for an optimal deformation (geodesic) and a formal distance (geodesic distance) between shapes. Thus, an important strength of this approach is that the registration and comparison are solved jointly rather than sequentially. Another strength is that this framework can be easily extended to different types of surfaces.


The rest of this paper is organized as follows. In Section 2, we present the framework and some examples of computing geodesics between toy surfaces, simple 3D objects, and anatomical surfaces in the brain. In Section 3, we give a methodology for calculating shape statistics of surfaces such as the Karcher mean and the covariance. Finally, in Section 4, we report ADHD classification results using different techniques.

2 Novel Riemannian Framework

We will assume that the surfaces of interest are closed, i.e. have no boundaries, and do not have holes. We will represent a surface by its embedding f : S² → R³. Let the set of all such parameterized surfaces be F = {f : S² → R³ | ∫_{S²} ‖f(s)‖² ds < ∞ and f is smooth}, where ds is the standard Lebesgue measure on S². We choose the natural Riemannian structure in the tangent space T_f(F): for any two elements m1, m2 ∈ T_f(F), define an inner product ⟨m1, m2⟩ = ∫_{S²} ⟨m1(s), m2(s)⟩ ds, where the inner product inside the integral is the standard Euclidean product. The resulting L² distance between any two points f1, f2 ∈ F is (∫_{S²} ‖f1(s) − f2(s)‖² ds)^{1/2}. One can represent surfaces as elements of F as stated here and use the L² distance to compare shapes of surfaces. Although this framework is commonly used, it is not suitable for analyzing shapes of surfaces as it is not invariant to re-parameterizations. Let Γ be the set of all diffeomorphisms of S². This set will act as the re-parametrization group for surfaces, with its action given by composition: for γ ∈ Γ and f ∈ F, the re-parameterized surface is given by f ∘ γ. It is easy to see that ‖f1 ∘ γ − f2 ∘ γ‖ ≠ ‖f1 − f2‖ in general, and that is a problem in using L² distances on F for shape analysis.

To define a new Riemannian metric, we first present a representation of surfaces first introduced in [12], [13]:

Definition 1. Define the mapping Q : F → L² as Q(f)(s) = √(‖a(s)‖) f(s), where ‖a(s)‖ = ‖f_x(s) × f_y(s)‖ is the area multiplication factor of f at s = (x, y) ∈ S².

Here ‖·‖ denotes the standard 2-norm of a vector in R³. The factor ‖a(s)‖ is the ratio of infinitesimal areas of the surface at f(s) and the domain at s. For any f ∈ F, we will refer to q(s) ≡ Q(f)(s) as the q-map of f. Since F is a set of smooth surfaces, the set of all q-maps is a subset of L². The action of Γ on L², the space of q-maps, is given by (q, γ) = √(J_γ) (q ∘ γ), where J_γ is the Jacobian determinant of γ. An important fact about the map Q is that if we re-parameterize a surface by γ and then obtain its q-map (Definition 1), or if we obtain its q-map (Definition 1) and then act by γ, the result is the same.
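To make Definition 1 concrete, the following is a minimal numerical sketch (not from the paper) of computing the q-map of a surface sampled on a rectangular grid over one coordinate chart of S²; the function name, the grid spacings, and the use of finite differences are our own choices, and pole effects and the spherical area element are ignored.

```python
import numpy as np

def q_map(f, dx=1.0, dy=1.0):
    """Approximate Q(f)(s) = sqrt(||a(s)||) f(s) on a sampled surface.

    f : array of shape (nx, ny, 3), surface values on a grid over one chart of S^2.
    """
    # finite-difference partial derivatives f_x and f_y
    fx = np.gradient(f, dx, axis=0)
    fy = np.gradient(f, dy, axis=1)
    # area multiplication factor ||a(s)|| = ||f_x(s) x f_y(s)||
    norm_a = np.linalg.norm(np.cross(fx, fy), axis=-1, keepdims=True)
    return np.sqrt(norm_a) * f
```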

We choose the natural L² metric on the space of q-maps. That is, for any two elements w1, w2 ∈ T_q(L²), define an inner product: ⟨w1, w2⟩ = ∫_{S²} ⟨w1(s), w2(s)⟩ ds. The Riemannian metric that we will use on F is the pullback of the L² metric from the space of q-maps. For this purpose, we first derive the differential of Q at f, denoted by Q_{*,f}. This is a linear mapping between the tangent spaces T_f(F) and L². For a tangent vector v ∈ T_f(F), the mapping Q_{*,f} is given by Q_{*,f}(v) = (1/(2‖a‖^{3/2})) (a · a_v) f + √(‖a‖) v. We use this differential of Q to define a Riemannian metric on F as follows.

Definition 2. For any f ∈ F and any v1, v2 ∈ T_f(F), define the inner product ⟨⟨v1, v2⟩⟩_f ≡ ⟨Q_{*,f}(v1), Q_{*,f}(v2)⟩, where the inner product on the right side is the standard inner product in L².


Substituting for Q_*, we obtain an expression for ⟨⟨v1, v2⟩⟩_f:

⟨ (1/(4‖a‖³)) (a · a_{v1}) f, (a · a_{v2}) f ⟩ + ⟨ (1/(2‖a‖)) [(a · a_{v2}) v1 + (a · a_{v1}) v2], f ⟩ + ⟨ ‖a‖ v1, v2 ⟩ .

An important property of this Riemannian metric is that the action of Γ on F is by isometries. With this induced metric, F becomes a Riemannian manifold and we want to compute geodesic distances between two points, say f1 and f2, in F.
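A direct numerical transcription of Definition 2 is sketched below (our own illustration, not the paper's implementation): the directional derivative a_v of a = f_x × f_y along v is a_v = v_x × f_y + f_x × v_y, and the L² inner product is approximated by a weighted sum over the grid, with ds holding precomputed per-point area weights.

```python
import numpy as np

def differential_Q(f, v, dx=1.0, dy=1.0):
    """Q_{*,f}(v) = (a . a_v) / (2 ||a||^{3/2}) f + sqrt(||a||) v, pointwise on the grid."""
    fx, fy = np.gradient(f, dx, axis=0), np.gradient(f, dy, axis=1)
    vx, vy = np.gradient(v, dx, axis=0), np.gradient(v, dy, axis=1)
    a = np.cross(fx, fy)                       # a(s) = f_x x f_y
    a_v = np.cross(vx, fy) + np.cross(fx, vy)  # derivative of a in the direction v
    norm_a = np.linalg.norm(a, axis=-1, keepdims=True)
    a_dot_av = np.sum(a * a_v, axis=-1, keepdims=True)
    return a_dot_av / (2.0 * norm_a**1.5) * f + np.sqrt(norm_a) * v

def pullback_inner(f, v1, v2, ds):
    """<<v1, v2>>_f = <Q_{*,f}(v1), Q_{*,f}(v2)>, with ds the grid area weights."""
    w1, w2 = differential_Q(f, v1), differential_Q(f, v2)
    return np.sum(np.sum(w1 * w2, axis=-1) * ds)
```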

2.1 Pre-Shape and Shape Space

Shape analysis of surfaces can be made invariant to certain global transformations by normalizing. The translation of surfaces is easily taken care of by centering: f_centered(s) = f(s) − (∫_{S²} f(s) ‖a(s)‖ ds) / (∫_{S²} ‖a(s)‖ ds). Scaling can be removed by re-scaling all surfaces to have unit area, f_scaled(s) = f(s) / √(∫_{S²} ‖a(s)‖ ds). With a slight abuse of notation, we define the space of normalized surfaces as F; F forms the pre-shape space in our analysis. The remaining groups – rotation and re-parameterization – are dealt with differently, by removing them algebraically from the representation space. The rotation group SO(3) acts on F, SO(3) × F → F, according to (O, f) = Of, and the re-parameterization group Γ acts on F, F × Γ → F, with (f, γ) = f ∘ γ. Since the actions of SO(3) and Γ on F commute, we can define an action of the product group on F. The orbit of a surface f is given by [f] = {O(f ∘ γ) | O ∈ SO(3), γ ∈ Γ}, and the set of closures of all orbits is defined to be S.
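The two normalizations above are straightforward to apply to a sampled surface; a hedged sketch (our own, reusing the finite-difference area factor from the earlier snippets) follows, where ds again holds the per-point area weights of the parameter grid.

```python
import numpy as np

def normalize_surface(f, ds, dx=1.0, dy=1.0):
    """Center a sampled surface and rescale it to unit area."""
    fx, fy = np.gradient(f, dx, axis=0), np.gradient(f, dy, axis=1)
    norm_a = np.linalg.norm(np.cross(fx, fy), axis=-1)      # ||a(s)||
    total_area = np.sum(norm_a * ds)
    # area-weighted centroid, removed to fix translation
    centroid = np.sum(f * (norm_a * ds)[..., None], axis=(0, 1)) / total_area
    # rescale so the centered surface has unit area
    return (f - centroid) / np.sqrt(total_area)
```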

The next step is to define geodesic paths in F and S. We start with the case of F; the geodesic distance between any two points f1, f2 ∈ F is given by

d_F(f1, f2) = min_{F : [0,1] → F, F(0) = f1, F(1) = f2} ∫_0^1 ⟨⟨F_t(t), F_t(t)⟩⟩^{1/2} dt .

We will use a path-straightening approach for solving this problem. Once we have an algorithm for finding geodesics in F, we can obtain geodesics and geodesic lengths in S by solving an additional minimization problem over SO(3) × Γ, as stated in Eqn. 1.

2.2 Geodesics in the Pre-Shape Space F

Here we address the problem of finding geodesics between surfaces f1 and f2 in F using a path-straightening approach [14]. A similar approach for geodesics on shape spaces of closed curves was used in [17]. The basic idea is to connect f1 and f2 by any initial path, e.g. a straight line under the L² metric, and then iteratively "straighten" it until it becomes a geodesic. This update is performed using the gradient of an energy function as described next.

Let F : [0, 1] → F denote a path in F. The energy of the path F under the induced metric is defined to be:

E[F] = ∫_0^1 ∫_x ∫_y [ (1/(4‖A‖³)) (A·A_t)² (F·F) + (1/‖A‖) (A·A_t) (F_t·F) + ‖A‖ (F_t·F_t) ] dy dx dt .


In this expression we have suppressed the argument t for all of the quantities. Also, we use A(t) to denote a(F(t)). It is well known that a critical point of E is a geodesic path in F. To find a critical point, we use the gradient ∇E_F which, in turn, is approximated using directional derivatives ∇E_F(G), where G ∈ G is a perturbation of the path F. The expression for ∇E_F(G) is available analytically. Here G denotes the set of all possible perturbations of F. We start with an initial path F and iteratively update it using ∇E_F until we arrive at the critical point F*, which is the desired geodesic.
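The sketch below (our own illustration) shows the two ingredients of this procedure that are easy to state concretely: an initial path obtained by straight-line interpolation under the L² metric, and a discrete approximation of the path energy E[F] using the pullback metric (pullback_inner from the earlier sketch). The gradient ∇E_F over path perturbations, which drives the actual path-straightening updates, is not reproduced here.

```python
import numpy as np

def initial_path(f1, f2, T=7):
    """Straight-line path from f1 to f2 under the L^2 metric, sampled at T points."""
    taus = np.linspace(0.0, 1.0, T)
    return np.stack([(1.0 - t) * f1 + t * f2 for t in taus])   # shape (T, nx, ny, 3)

def path_energy(F, ds):
    """Discrete approximation of E[F] = int_0^1 <<F_t(t), F_t(t)>>_{F(t)} dt."""
    T = F.shape[0]
    dt = 1.0 / (T - 1)
    Ft = np.gradient(F, dt, axis=0)              # path velocity F_t
    return sum(pullback_inner(F[k], Ft[k], Ft[k], ds) for k in range(T)) * dt
```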

2.3 Geodesics in Shape Space S

Now, we consider the problem of finding geodesics between surfaces in S. This requires solving an additional optimization over the product group SO(3) × Γ, which is carried out by iterating between the following two steps.

1. Rotation: We can use a gradient approach for this optimization, but instead we will use an efficient albeit approximate technique based on Procrustes analysis (a numerical sketch is given after this list). For a fixed γ ∈ Γ, the minimization over SO(3) is performed as follows. Compute the 3×3 matrix C = ∫_{S²} f1(s) f2(s)ᵀ ds. Then, using the singular value decomposition C = UΣVᵀ, we can define the optimal rotation as O* = UVᵀ (if the determinant of C is negative, the last column of Vᵀ changes sign).

2. Re-Parameterization: In order to solve the optimization problem over Γ in Eqn. 1, we will use a gradient approach. Although this approach has the obvious limitation of converging to a local solution, it is general enough to be applicable to general cost functions. Additionally, we have tried to circumvent the issue of local solutions by taking multiple initializations. This is similar to the gradient approach taken in [12], [13]; the difference lies in the cost function used for optimization. In the earlier papers, we addressed a problem of the type min_{γ ∈ Γ} ‖q1 − (q2, γ)‖², where q1 and q2 are the q-maps of f1 and f2; here we minimize a cost function of the type d_F(f1, f2 ∘ γ)², but the approach remains the same.
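For step 1, the Procrustes rotation has a closed form; the sketch below is our own numerical transcription, where the array shapes and the sign-fix convention (flipping the last row of Vᵀ when det(C) < 0) are our assumptions.

```python
import numpy as np

def optimal_rotation(f1, f2, ds):
    """O* aligning f2 to f1 for a fixed parameterization (Procrustes step)."""
    # C = int_{S^2} f1(s) f2(s)^T ds, accumulated over the grid with area weights ds
    C = np.einsum('xyi,xyj,xy->ij', f1, f2, ds)
    U, _, Vt = np.linalg.svd(C)
    if np.linalg.det(C) < 0:
        Vt[-1, :] *= -1.0          # sign fix so that the result lies in SO(3)
    return U @ Vt
```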

Computational Cost: We used the Matlab environment on an Intel Core 2 Duo CPU (2.66 GHz, Windows XP). When we sample the surfaces with 2500 points and use 1400 basis elements for the path-straightening optimization, the average computational cost for computing a geodesic in F (10 iterations) is 95 s; in S it is 490 s.

Fig. 1. Geodesic comparison for shapes of surfaces with dual peaks. Pre-shape: E(F*) = 0.0163; shape: E(F*) = 0.0046.


Fig. 2. Geodesics for shapes of a hydrant and a heart (top) and a torso and a bottle (bottom).

Fig. 3. Geodesics for left putamen and pallidus (top), and left caudate and right putamen (bottom). Left sides show rendered surfaces while the right sides show polygon meshes.

In Figure 1, we display geodesics in F and S for toy surfaces with peaks at different locations. Along the geodesic in the shape space, the peaks smoothly move to the locations of the peaks on f2. On the other hand, along the geodesic in the pre-shape space, the peaks on f1 are contracted while the peaks on f2 are created. The difference in these geodesics is due to improved matching in S. It is important to note that the decrease in the geodesic path energy from F to S is significant. In Figure 2, we present geodesics in S for the shapes of some simple 3D objects. In each example, the geodesic path is smooth and natural due to improved feature matching. In Figure 3, we present geodesics in the shape space between anatomical structures in the brain (left putamen and left pallidus, left caudate and right putamen). In the first example, f1 and f2 are quite similar and thus the evolution along the geodesic path is more subtle than in the previous examples. In the second example, the deformation is clearly visible.

3 Shape Statistics of Surfaces

In this section we will present tools for computing two important shape statistics – the Karcher mean and the covariance – for a set of surfaces.

3.1 Karcher Mean of Surfaces

To our knowledge there is no natural embedding of the shape space S inside a Hilbert space. Thus, we cannot pursue the idea of computing extrinsic statistics for surfaces


and will use intrinsic statistics. For obtaining an intrinsic sample mean of a set of surfaces, we use the Karcher mean. For the given surfaces {f1, f2, ..., fn} ∈ F, the sample Karcher mean is given by: f̄ = argmin_{[f] ∈ S} Σ_{i=1}^n d([f], [fi])². Here d denotes the length of the geodesic in the shape space between [f] and [fi]. A gradient-based approach for finding the Karcher mean is given in [7] and is not repeated here. We use that algorithm for finding the Karcher mean shape of a set of surfaces.
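A schematic of the gradient-based iteration is sketched below; it is our own outline, not the algorithm of [7]. The helpers log_map and exp_map are hypothetical stand-ins for the shooting-vector (geodesic) computation in the shape space and the exponential map at the current mean, and the step size and iteration count are arbitrary.

```python
def karcher_mean(surfaces, log_map, exp_map, step=0.5, n_iter=20):
    """Iteratively move the mean estimate along the average shooting vector."""
    mu = surfaces[0]                              # initialize with any sample surface
    for _ in range(n_iter):
        # average shooting vector from mu to the aligned orbit of each surface;
        # this is (up to sign and scale) the gradient of the Frechet variance
        v_bar = sum(log_map(mu, f) for f in surfaces) / len(surfaces)
        mu = exp_map(mu, step * v_bar)            # take a step toward the data
    return mu
```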

Next, we present some examples of Karcher mean shapes using toy objects and anatomical surfaces. For comparison, we also display the Euclidean average f̃ = (1/n) Σ_{i=1}^n fi, i.e. the average computed without any rotational or re-parameterizational alignment. For each example we show the decrease in the gradient of the cost function during the computation of the Karcher mean. In the top part of Figure 4, we present means for ten unimodal surfaces with random peak placements on a sphere. The f̃ surface has ten very small peaks at the locations of the peaks in the sample. On the other hand, the mean in S has one peak, which is of the same size as all of the peaks in the sample. In this simple example one can clearly see the effect of feature preservation due to rotational and re-parameterizational alignment. In the middle part of Figure 4, we present mean shapes of nine surfaces with dual peaks. We note that f̃ has one peak aligned (at the location of the common peak in the sample) and one very wide and small peak. The mean in S has two peaks due to a crisp alignment of peaks and thus is a much better representative of the sample. In the bottom part of Figure 4, we present results of mean computations for five left putamen surfaces with different parameterizations (different coordinate systems on the same surface). f̃ does not have the shape of a left putamen. On the other hand, the Karcher mean surface f̄ has the original correct shape. The optimization over the re-parameterization group plays a very important role in this example.

3.2 Estimation of the Karcher Covariance

Once the sample Karcher mean has been computed, the evaluation of the Karcher covariance is performed as follows. This task is difficult because: (1) although F is a vector space, we are using a non-standard metric ⟨⟨·, ·⟩⟩, and (2) the shape space of interest is actually a quotient space of F. To compute the covariance, we first find the shooting vectors from the mean f̄ to the shape orbits of each of the given surfaces [fi]. That is, let νi be the initial velocity of the geodesic F*_i with F*_i(0) = f̄ and F*_i(1) = O*_i(fi ∘ γ*_i), i = 1, 2, ..., n. We then perform principal component analysis by applying the Gram-Schmidt procedure (under the chosen metric ⟨⟨·, ·⟩⟩) to generate an orthonormal basis {B_j | j = 1, ..., k}, k ≤ n, of the observed {νi} in the vector space T_{f̄}(F). We project each of the vectors νi onto this orthonormal basis using νi ≈ Σ_{j=1}^k c_{i,j} B_j, where c_{i,j} = ⟨⟨νi, B_j⟩⟩. Now each original surface can simply be represented using the coefficient vector c_i = {c_{i,j}}. Then, the covariance matrix can be computed in the coefficient space using K = (1/(n−1)) Σ_{i=1}^n c_i c_iᵀ ∈ R^{k×k}. We can use the SVD of K to determine the principal directions of variation in the given data. For example, if u ∈ R^k corresponds to a principal singular vector of K, then the corresponding tangent vector in T_{f̄}(S) is given by Σ_{j=1}^k u_j B_j. Hence, the concatenation of mappings c ↦ Σ_{j=1}^k c_j B_j ↦ f = exp_{f̄}(Σ_{j=1}^k c_j B_j) provides a transformation of a coefficient vector c into a shape f.
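The coefficient and covariance construction above translates directly into code; the following is our own sketch, where inner is a callable evaluating ⟨⟨·, ·⟩⟩ at the Karcher mean (e.g. built from the pullback_inner sketch), and the tolerance used to drop near-dependent vectors is arbitrary.

```python
import numpy as np

def tangent_pca(nus, inner):
    """Orthonormal basis (Gram-Schmidt under <<.,.>>), coefficients, and covariance K."""
    B = []
    for nu in nus:
        w = nu.copy()
        for b in B:
            w = w - inner(w, b) * b          # remove components along earlier basis vectors
        norm = np.sqrt(inner(w, w))
        if norm > 1e-10:
            B.append(w / norm)
    # coefficients c_{i,j} = <<nu_i, B_j>> and the sample covariance in coefficient space
    c = np.array([[inner(nu, b) for b in B] for nu in nus])
    K = c.T @ c / (len(nus) - 1)
    return B, c, K
```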


Fig. 4. Mean computation for a sample of surfaces with one peak (top), dual peaks (middle), and left putamen surfaces with random grid placement (bottom). Each row shows the sample, the pre-shape mean and the shape mean (two views each), and the decrease of the energy gradient during the mean computation.

We can impose a shape model on a class by imposing a Gaussian model on its coefficients, c ∼ N(0, K), where K is the sample covariance matrix of that class.
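Random shapes from this model can be generated by sampling coefficients, forming the corresponding tangent vector, and mapping it back to a surface; the sketch below is our own, with exp_map again a hypothetical exponential map at the mean.

```python
import numpy as np

def sample_shape(mean_f, B, K, exp_map, rng=None):
    """Draw one random shape from the Gaussian model c ~ N(0, K)."""
    rng = np.random.default_rng() if rng is None else rng
    c = rng.multivariate_normal(np.zeros(K.shape[0]), K)     # random coefficients
    v = sum(cj * bj for cj, bj in zip(c, B))                 # tangent vector at the mean
    return exp_map(mean_f, v)                                # shoot a geodesic to get a surface
```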

We used the data and means presented in the middle example of Figure 4 to calculate the principal directions of variation in the sample and to draw some random samples from the Gaussian model mentioned above for both the pre-shape and shape spaces. The results are displayed in Figure 5. Some of the samples drawn from the model in the pre-shape space are invalid as they have three rather than two peaks. On the other hand, in the shape space, all of the samples have two sharp peaks. This is due to improved feature matching in S. In the shape space, the principal directions of variation simply reflect the placement of the peaks on each surface in the data. Next, we computed the principal directions of variation and some random samples based on a Gaussian model for the data in the top example of Figure 4. We show the results in the top portion of Figure 6. In the last example, presented in the bottom portion of Figure 6, we used 15 left pallidus structures. We note that our model is compact and all of the random samples are valid structures.


Fig. 5. Principal directions of variation (µ − 2σ → µ → µ + 2σ) and random samples from a Gaussian model, in the shape space (top) and the pre-shape space (bottom).

4 ADHD Classification

In this section we present disease classification results for persons with Attention Deficit Hyperactivity Disorder (ADHD) and controls. The clinical data sets used in this work were T1-weighted brain magnetic resonance images of young adults between 18 and 21 years of age. These subjects were recruited from the Detroit Fetal Alcohol and Drug Exposure Cohort. Among the 34 subjects studied, 19 were diagnosed with ADHD and the remaining 15 were controls (non-ADHD). We consider six different anatomical structures for classification (L. and R. Pallidus, L. and R. Putamen, L. and R. Parietal Lobe). We chose these structures based on previous literature suggesting that they are major players in ADHD development [12], [16], [19].

We perform disease classification using several different distance functions. In the third row of Table 1, we report the leave-one-out nearest neighbor (LOO NN) classification rate using all pairwise single-structure distances (called d1) between q-maps [12]. It is very expensive to compute the entire distance matrix, and thus we would like to be able to perform classification using statistical summaries such as the mean and covariance. In addition, the covariance structure of the sample may provide useful information for disease detection. Thus, for each of the six anatomical structures, we first compute the mean structure for the disease and control groups using the framework introduced in this paper. In Figure 7, we display the means for all six anatomical structures for ADHD cases and controls. It is not easy to discern differences between the two means for each structure by looking at them. First, we classify each subject as a case or control based on the geodesic distance to the disease and control means (called d2). We also form a covariance for each of the samples (six structures, two disease groups) in a leave-one-out manner using the orthonormal basis coefficients described in the previous section.


Fig. 6. Principal directions of variation (µ − 2σ → µ → µ + 2σ) and random samples in S from a Gaussian distribution for surfaces with one peak (top) and left pallidus surfaces (bottom).

Fig. 7. Means for six anatomical structures in the ADHD study: disease mean and healthy mean for the L. Pallidus, L. Parietal Lobe, L. Putamen (top) and R. Pallidus, R. Parietal Lobe, R. Putamen (bottom).

Given this information, we can compute a covariance-adjusted distance to the mean for each subject using the underlying covariance structure as d3 ≡ (1/2) cᵀ K⁻¹ c + ‖ν⊥‖/ε, where ν⊥ = ν − Σ_{j=1}^k c_j B_j and ε = min(diag(K)). Once again we perform nearest neighbor classification under d3. For comparison purposes, we also report classification rates using the ICP (iterative closest point) algorithm and SPHARM-PDM under the leave-one-out nearest neighbor classifier. SPHARM-PDM failed to provide a parameterization for the parietal lobe data and thus those classification rates are not reported. The results are presented in Table 1. The classification rates using d1 are better than the ones using distances to the means only. Nonetheless, the decrease in classification rates using the mean only (d2) is not large in general, and in the case of the right pallidus the rate actually increases. The best performance is attained when we use the mean and the covariance structure of the given data. In fact, using this measure allows us to improve the classification rates for most of the six structures, with the left pallidus performing best at 88.2%. We also see a great improvement in the right putamen classification rate. In Figure 8, we show the benefits of using the covariance structure in disease classification.


Using the mean-only distances (d2), the test shape (in red) is closer to the control group, but after including the covariance structure in the distance calculation (d3), this test shape is closer to the disease group. In addition, all of our distance measures outperform standard surface comparison techniques such as ICP or SPHARM-PDM.
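The covariance-adjusted distance d3 is simple to evaluate once the class basis B, covariance K, and the shooting vector ν from the class mean to the test shape are available; the sketch below is our own transcription, with inner the metric at the mean as before, and with the norm of ν⊥ taken under that same metric (an assumption).

```python
import numpy as np

def covariance_adjusted_distance(nu, B, K, inner):
    """d3 = 0.5 * c^T K^{-1} c + ||nu_perp|| / eps, with eps = min(diag(K))."""
    c = np.array([inner(nu, b) for b in B])                  # coefficients of nu in the class basis
    nu_perp = nu - sum(cj * bj for cj, bj in zip(c, B))      # residual outside the basis
    eps = np.min(np.diag(K))
    return 0.5 * c @ np.linalg.solve(K, c) + np.sqrt(inner(nu_perp, nu_perp)) / eps
```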

Fig. 8. Role of covariance in classification.

Table 1. ADHD classification rates (%) using five different measures.

Distance type                       | Pallidus       | Parietal Lobe  | Putamen
                                    | Left    Right  | Left    Right  | Left    Right
LOO NN distance, ICP                | 67.6    55.9   | 61.8    67.6   | 61.8    47.1
LOO NN distance, SPHARM-PDM         | 44.1    52.9   | -       -      | 50.0    55.9
LOO NN shape distance, d1           | 76.5    61.8   | 70.6    67.6   | 82.4    67.6
Distance to mean, d2                | 70.6    64.7   | 67.6    67.6   | 70.6    64.7
Covariance-adjusted distance, d3    | 88.2    67.6   | 67.6    70.6   | 82.4    82.4

5 Conclusion

We have presented a Riemannian framework for parameterization-invariant statistical analysis of surfaces. Most previous methods view registration as a pre-processing step, while one of our main contributions is to involve the registration directly in our framework. We have used a novel Riemannian metric for comparison of surfaces and computation of the first two statistical moments, the Karcher mean and Karcher covariance. We utilize this metric to demonstrate the computation of geodesics between real 3D objects, toy surfaces, and anatomical shapes. In addition, we showcase the benefits of our method by comparing statistical models developed within our framework to those that do not involve parameterization invariance. Finally, we use the computed statistical summaries for disease classification in a previously explored ADHD study, where we are able to significantly improve the rates by involving both mean and covariance information. We find that the left pallidus structure provides the highest disease classification rate of 88.2%.

Acknowledgements: This research was supported by AFOSR FA9550-06-1-0324, ONR N00014-09-1-0664, NSF DMS-0915003 (AS) and NIH/NIDA R21-DA021034 (MJA).


References

1. A. Almhdie, C. Leger, M. Deriche, and R. Ledee. 3D registration using a new implementation of the ICP algorithm based on a comprehensive lookup matrix: Application to medical imaging. Pattern Recognition Letters, 28(12):1523–1533, 2007.

2. S. Bouix, J. C. Pruessner, D. L. Collins, and K. Siddiqi. Hippocampal shape analysis using medial surfaces. NeuroImage, 25:1077–1089, 2001.

3. C. Brechbuhler, G. Gerig, and O. Kubler. Parameterization of closed surfaces for 3D shape description. Computer Vision and Image Understanding, 61(2):154–170, 1995.

4. J. Cates, M. Meyer, P. T. Fletcher, and R. Whitaker. Entropy-based particle systems for shape correspondence. In MICCAI Mathematical Foundations of Computational Anatomy, pages 90–99, 2006.

5. T. F. Cootes, C. J. Taylor, D. H. Cooper, and J. Graham. Active shape models – their training and application. Computer Vision and Image Understanding, 61(1):38–59, 1995.

6. R. H. Davies, C. J. Twining, T. F. Cootes, and C. J. Taylor. Building 3-D statistical shape models by direct optimization. IEEE Transactions on Medical Imaging, 29(4):961–981, 2010.

7. I. L. Dryden and K. V. Mardia. Statistical Shape Analysis. John Wiley & Sons, 1998.

8. K. Gorczowski, M. Styner, J. Y. Jeong, J. S. Marron, J. Piven, H. C. Hazlett, S. M. Pizer, and G. Gerig. Multi-object analysis of volume, pose, and shape using statistical discrimination. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(4):652–666, 2010.

9. U. Grenander and M. I. Miller. Computational anatomy: An emerging discipline. Quarterly of Applied Mathematics, LVI(4):617–694, 1998.

10. X. Gu, S. Wang, J. Kim, Y. Zeng, Y. Wang, H. Qin, and D. Samaras. Ricci flow for 3D shape analysis. In IEEE International Conference on Computer Vision, 2007.

11. S. C. Joshi, M. I. Miller, and U. Grenander. On the geometry and shape of brain sub-manifolds. Pattern Recognition and Artificial Intelligence, 11:1317–1343, 1997.

12. S. Kurtek, E. Klassen, Z. Ding, S. W. Jacobson, J. L. Jacobson, M. J. Avison, and A. Srivastava. Parameterization-invariant shape comparisons of anatomical surfaces. IEEE Transactions on Medical Imaging, 30(3):849–858, 2011.

13. S. Kurtek, E. Klassen, Z. Ding, and A. Srivastava. A novel Riemannian framework for shape analysis of 3D objects. In IEEE Computer Vision and Pattern Recognition, pages 1625–1632, 2010.

14. S. Kurtek, E. Klassen, J. C. Gore, Z. Ding, and A. Srivastava. Elastic geodesic paths in shape space of parametrized surfaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, in review, 2010.

15. R. Malladi, J. A. Sethian, and B. C. Vemuri. A fast level set based algorithm for topology-independent shape modeling. Journal of Mathematical Imaging and Vision, 6:269–290, 1996.

16. A. Qiu, D. Crocetti, M. Adler, E. M. Mahone, M. B. Denckla, M. I. Miller, and S. H. Mostofsky. Basal ganglia volume and shape in children with attention deficit hyperactivity disorder. American Journal of Psychiatry, 166(1):74–82, 2009.

17. A. Srivastava, E. Klassen, S. H. Joshi, and I. H. Jermyn. Shape analysis of elastic curves in Euclidean spaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, PrePrints, 2010.

18. M. Styner, I. Oguz, S. Xu, C. Brechbuhler, D. Pantazis, J. Levitt, M. E. Shenton, and G. Gerig. Framework for the statistical shape analysis of brain structures using SPHARM-PDM. In MICCAI Open Science Workshop, 2006.

19. M. H. Teicher, C. M. Anderson, A. Polcari, C. A. Glod, L. C. Maas, and P. F. Renshaw. Functional deficits in basal ganglia of children with attention-deficit/hyperactivity disorder shown with functional magnetic resonance imaging relaxometry. Nature Medicine, 6(4):470–473, 2000.

20. O. van Kaick, H. Zhang, G. Hamarneh, and D. Cohen-Or. A survey on shape correspondence. Eurographics State-of-the-Art Report, 2010.