Prostate boundary segmentation from 3D ultrasound images

Ning Hu, Dónal B. Downey, Aaron Fenster, and Hanif M. Ladak
Citation: Medical Physics 30, 1648 (2003); doi: 10.1118/1.1586267
View online: http://dx.doi.org/10.1118/1.1586267
Published by the American Association of Physicists in Medicine



Ning Hu
Imaging Research Laboratories, The John P. Robarts Research Institute, London, Ontario N6H 5C1, Canada and Department of Electrical and Computer Engineering, University of Western Ontario, London, Ontario N6H 5C1, Canada

Dónal B. Downey
Imaging Research Laboratories, The John P. Robarts Research Institute, London, Ontario N6H 5C1, Canada, Department of Medical Biophysics, University of Western Ontario, London, Ontario N6H 5C1, Canada, and Department of Radiology, London Health Sciences Centre, London, Ontario N6H 5C1, Canada

Aaron Fenster
Imaging Research Laboratories, The John P. Robarts Research Institute, London, Ontario N6H 5C1, Canada, Department of Radiology, London Health Sciences Centre, London, Ontario N6H 5C1, Canada, and Department of Medical Biophysics, University of Western Ontario, London, Ontario N6H 5C1, Canada

Hanif M. Ladak^a)
Imaging Research Laboratories, The John P. Robarts Research Institute, London, Ontario N6H 5C1, Canada and Department of Medical Biophysics and Department of Electrical and Computer Engineering, University of Western Ontario, London, Ontario N6H 5C1, Canada

(Received 8 July 2002; accepted for publication 5 May 2003; published 20 June 2003)

Segmenting, or outlining, the prostate boundary is an important task in the management of patients with prostate cancer. In this paper, an algorithm is described for semiautomatic segmentation of the prostate from 3D ultrasound images. The algorithm uses model-based initialization and mesh refinement using an efficient deformable model. Initialization requires the user to select only six points, from which the outline of the prostate is estimated using shape information. The estimated outline is then automatically deformed to better fit the prostate boundary. An editing tool allows the user to edit the boundary in problematic regions and then deform the model again to improve the final results. The algorithm requires less than 1 min on a Pentium III 400 MHz PC. The accuracy of the algorithm was assessed by comparing the algorithm results, obtained from both local and global analysis, to the manual segmentations on six prostates. The local difference was mapped on the surface of the algorithm boundary to produce a visual representation. Global error analysis showed that the average difference between manual and algorithm boundaries was −0.20 ± 0.28 mm, the average absolute difference was 1.19 ± 0.14 mm, the average maximum difference was 7.01 ± 1.04 mm, and the average volume difference was 7.16% ± 3.45%. Variability in manual and algorithm segmentation was also assessed: visual representations of local variability were generated by mapping variability on the segmentation mesh. The mean variability in manual segmentation was 0.98 mm and in algorithm segmentation was 0.63 mm, and the differences at about 51.5% of the points comprising the average algorithm boundary were not significant (P < 0.01) relative to the manual average boundary. © 2003 American Association of Physicists in Medicine. [DOI: 10.1118/1.1586267]

Key words: segmentation, three-dimensional ultrasound image, initialization, deformation, mesh

I. INTRODUCTION

Prostate cancer is the most commonly diagnosed malignancy in men over the age of 50, and is found at autopsy in 30% of men at the age of 50, 40% at age 60, and almost 90% at age 90.1 Worldwide, it is the second leading cause of death due to cancer in men, accounting for between 2.1% and 15.2% of all cancer deaths.2 When prostate cancer is diagnosed at an early stage, it is curable,3 and even at later stages treatment can be effective. Therefore, early diagnosis and precise staging of prostate cancer are of primary importance.

Accurate and reproducible segmentation, or outlining, of the prostate boundary from 3D ultrasound images is an important first step when using images to support activities such as diagnosis and monitoring of prostate cancer as well as treatment planning and delivery. Manual contouring of sequential cross-sectional prostate images is time-consuming and tedious, and hence a number of algorithms have been developed to segment the prostate boundary either automatically or semiautomatically. Most currently available algorithms focus on segmentation of 2D images. Prater and Richard4 used neural networks to segment the prostate; however, this approach requires extensive training sets, is slow, and makes the addition of user-specified information difficult.

Med. Phys. 30 (7), July 2003; 0094-2405/2003/30(7)/1648/12/$20.00; © 2003 Am. Assoc. Phys. Med.


Later, Richard et al.5 used the Laplacian-of-Gaussian edge operator followed by an edge-selection algorithm, which requires the user to select several initial points to form a closed curve. Their method correctly identified most of the boundary in a 2D ultrasound image of the prostate. An edge-based approach using nonlinear Laplace filtering was also reported by Aarnink et al.6,7 Pathak et al.8 used edge guidance for deformable contour fitting in a 2D ultrasound image and statistically demonstrated a reduction in the variability in prostate segmentation.

Richard et al.9 segmented the prostate boundary by using a texture-based segmentation method, based on four texture energy measures associated with each pixel in the image. An automated clustering procedure was used to label each pixel in the image with the label of its most probable class. Although good results were obtained for 2D ultrasound images of the prostate, the algorithm was computationally intensive, requiring about 16 min to segment the boundary of a prostate in 2D with a 90 MHz SUN SPARCstation.

Liu et al.10 presented an algorithm based on their radial bas-relief (RBR) edge detector. First the edges in the image were highlighted by RBR, and then binary processing and area labeling were used to segment the boundary. Their results showed that RBR performed well with a good-quality image, and the result was marginally satisfactory for poor-quality images. The RBR was able to extract a skeletonized image from an ultrasound image automatically. However, there were many spurious branches that created too much ambiguity to define the actual prostate boundary. Kwoh et al.11 extended the RBR technique by fitting a Fourier boundary representation to the detected edges, resulting in a smooth boundary. Careful tuning of algorithm parameters is required when using this approach.

Knoll et al.12 proposed a technique for elastic deformation of closed planar curves restricted to particular object shapes, using localized multiscale contour parametrization based on the wavelet transform. The algorithm extracted only important edges at multiple resolutions and ignored other information caused by noise or insignificant structures. This step was followed by a template matching procedure to obtain an initial guess of the contour. This wavelet-based method constrained the shape of the contour to predefined models during deformation. They reported that this method provided a stable and accurate fully automatic segmentation of 2D objects in ultrasound and CT images. Shape information has also been used in the form of point distribution models to segment the prostate.13

In our previous work,14 we reported on the development of an algorithm to fit the prostate boundary in a 2D image using the discrete dynamic contour with model-based initialization. A cubic interpolation function was used to estimate the initial 2D contour from four user-selected points, which was then deformed automatically to better fit the prostate boundary. Unlike other 2D segmentation techniques, in our approach the user was able to edit the contour to improve its match to the prostate boundary.

However, diagnosis and therapy planning of prostate cancer typically require the volume of the prostate and its shape in 3D. Constructing a 3D prostate boundary from a sequence of 2D contours when using manual outlining or the 2D techniques is tedious and time-consuming.

Recently, Ghanei et al.15 have used a deformable model to segment the prostate from 3D ultrasound images. Their approach required the user to initialize the model by outlining the prostate in 40%–70% of the 2D slices of each prostate, using six to eight vertices for each 2D contour, after which an initial 3D surface was generated. The running time of the algorithm was about 30 s on a SUN Ultra 20, but it was not clear whether the running time included outlining time. They compared algorithm and manual segmentation results by computing the ratio of the common pixels that were marked as prostate by both methods. The results showed an accuracy of nearly 89% and a three- to sixfold reduction in time compared to a totally manual outlining. No editing of the boundary was possible to improve the results.

In this paper, our objectives are to describe an alternative semiautomatic 3D prostate segmentation approach and compare it to manual slice-by-slice outlining. Our 3D algorithm uses model-based initialization and a deformable model for boundary refinement. The algorithm requires the user to select only six points on the prostate boundary to initialize a 3D model, and then a combination of internal and external forces drives the model to the prostate boundary. Our approach also allows the user to edit the boundary in problematic regions and then deform the model again to improve the final result. We have used our algorithm on six clinical cases, and the accuracy and variability have been assessed by comparing our semiautomatic algorithm results, obtained from both local and global analysis, to manually segmented boundaries. In the following sections, the segmentation technique is described in detail and the accuracy and variability of the technique are examined.

II. METHODS

A. Segmentation algorithm

The algorithm is based on a deformable model that is represented by a 3D mesh of triangles.16,17 The operation of the algorithm involves three major steps: (1) initialization of the 3D mesh; (2) automatic deformation to localize the prostate boundary; and (3) interactive editing of the deformed mesh.

1. Initialization

The user selects six control points (x_i^c, y_i^c, z_i^c), i = 1, 2, ..., 6, on the "extremities" of the prostate in order to initialize the mesh. Typically, we used the following control points: four points in an approximate central transverse 2D slice (two points at the lateral extremes and two points at the top and bottom of the prostate on the central axis), one point at the prostate apex and one point at the base. A typical 3D ultrasound image with user-selected control points is shown in Fig. 1(a). An ellipsoid18 is estimated from the control points, and is described by


r(h, v) = [x, y, z]^T = [ a cos(h) cos(v) + x_0,
                          b cos(h) sin(v) + y_0,
                          c sin(h) + z_0 ]^T,
    −π/2 ≤ h ≤ π/2,  −π ≤ v ≤ π,          (1)

where (x_0, y_0, z_0) is the center of the ellipsoid and a, b, and c are the lengths of the semimajor axes in the x, y, and z directions, respectively. This assumes that the length, width, and height of the prostate are approximately oriented along the x, y, and z axes of the 3D image. The value of x_0 is taken to be the average of the x coordinates of the two control points with extreme x values. Similarly, y_0 and z_0 are taken to be the averages of the two control points with extreme y and z values, respectively. The parameters a, b, and c are computed as half the distance between the two control points with extreme x, y, and z values, respectively. The vector r(h, v) gives the position of a point on the ellipsoid, where v is the angle between the x axis and the projection of vector r in the x–y plane, and h is the angle between r and its projection on the x–y plane. The vector originates from the center of the ellipsoid (x_0, y_0, z_0), and sweeps out the surface of the ellipsoid as the two independent parameters h and v change in the given intervals. The ellipsoid's surface is then represented by a mesh of triangles connecting these points.19

The ellipsoid estimated in this manner usually does not pass exactly through the six control points, nor does it follow the prostate boundary well. In order to obtain a better fit to act as the initial mesh for the deformation step, the ellipsoid is warped using the thin-plate spline transformation, which is described in the Appendix. The transformation maps the six ends of the semimajor axes of the ellipsoid into the corresponding control points. Figure 1(b) shows the initial mesh corresponding to the 3D prostate image shown in Fig. 1(a).

2. Deformation

a. Dynamics. Our 3D deformable model is an extension of the 2D model described by Lobregt and Viergever.20 The position of vertex i on the mesh at time t + Δt is calculated from its position at time t using the following finite-difference equations:

p_i(t + Δt) = p_i(t) + v_i(t) Δt,          (2a)
v_i(t + Δt) = v_i(t) + a_i(t) Δt,          (2b)
a_i(t + Δt) = (1/m_i) f_i(t + Δt),         (2c)

where p_i = (x_i, y_i, z_i)^T is the vertex's position and v_i and a_i are its velocity and acceleration, respectively; m_i is its mass and Δt is the time step. For simplicity, the mass of each vertex and the time step are taken to be unity. The initial position of each vertex is specified by the user-defined initial mesh, and the initial velocity and acceleration are set to zero. f_i is a weighted combination of internal (f_i^int), external (f_i^ext), and damping (f_i^d) forces applied to each vertex i of the mesh:20

f_i = w_i^int f_i^int + w_i^ext f_i^ext + f_i^d.          (3)

The position, velocity, and acceleration of each vertex are then iteratively updated using Eqs. (2) and (3). Iterations continue until the mesh reaches equilibrium, which occurs when the velocity and acceleration of each vertex become approximately zero, i.e., when ||v_i|| ≤ ε_1 and ||a_i|| ≤ ε_2, where ε_1 and ε_2 are two small positive constants close to zero. Both ε_1 and ε_2 were set to 0.01. In cases where equilibrium cannot be reached, deformation continues until a user-specified maximum number of iterations is reached. The maximum number was set to 40 in this study. For all six prostates segmented in this work (see Sec. II B), equilibrium could not be reached after 40 iterations. The number of iterations to reach equilibrium depends critically on the choice of ε_1 and ε_2: more iterations are required when smaller values are used; however, the resulting mesh configuration satisfies the requirements for equilibrium more closely. We found that increasing the maximum to 200 iterations did lead to convergence within 150 iterations in all cases, at the expense of increased computation time and diminishing gains in accuracy. For instance, the difference in volume enclosed by the surface at 40 iterations and after convergence was only 0.4% on average, and the maximum difference in the position of points on the mesh was only 0.2%. The results after 40 iterations are adequate, and we chose to terminate the deformation procedure after 40 iterations.

FIG. 1. Operation of the 3D deformable model segmentation algorithm. (a) 3D ultrasound image with five of the six user-selected control points shown as white spheres. (b) Initial mesh. (c) Final deformed mesh.

b. Forces. External forces drive vertices toward edges in the ultrasound image and are defined in terms of a 3D potential energy function derived from the image:21

E(x, y, z) = ||∇(G_σ * I(x, y, z))||,          (4)

where E represents the energy associated with a pixel in the image having coordinates (x, y, z), G_σ is a Gaussian smoothing filter with a characteristic width of σ, and I is the image. The '*' operator represents convolution, '∇' is the gradient operator, and ||·|| is the magnitude operator. Image-based forces can be computed from the potential energy function using

f^ext(x, y, z) = ∇E(x, y, z) / max ||∇E(x, y, z)||.          (5)

The energy has a local maximum at an edge, and the force computed from the energy serves to localize this edge. The denominator in Eq. (5) serves to normalize the external forces to the range [0, 1], which is the same as the range of internal forces. The external force at each vertex with coordinates (x_i, y_i, z_i) is sampled from the image-based force field using bilinear interpolation:

f_i^ext = f^ext(x_i, y_i, z_i).          (6)

The external forces f_i^ext are vector quantities, having both magnitude and direction. Only the component of the force normal to the surface is applied at the vertex, since the tangential component can potentially cause vertices to bunch up during deformation:

f_i^ext = (f_i^ext · r_i) r_i,          (7)

where '·' denotes the vector dot product, and r_i is the normal vector at vertex i.

The internal force is the resultant surface tension at vertex i. It keeps the model smooth in the presence of noise and is simulated by treating each edge of the model as a spring. Surface tension at vertex i is defined as the vector sum of each normalized edge vector connected to the vertex:

f_i^int = −(1/M) Σ_{j=1}^{M} e_ij / ||e_ij||,          (8)

where e_ij = p_i − p_j is a vector representing the edge connecting vertex i with coordinate p_i to an adjacent vertex j with coordinate p_j, and M is the number of edges connected to vertex i, as shown in Fig. 2. Again, only the normal component is applied:

f_i^int = (f_i^int · r_i) r_i.          (9)

To prevent a vertex from oscillating during deformation, a damping force is applied to vertex i that is proportional to the velocity v_i at the vertex:

f_i^d = w_i^d v_i,          (10)

where w_i^d is a negative weighting factor.

For simplicity, identical weighting factors were assigned to each vertex for all segmentations. The values for w_i^img, w_i^int, and w_i^d were set to 1.0, 0.2, and −0.5, respectively. The values were selected empirically; however, as a general rule, a larger value for w_i^img compared to w_i^int favors deformation toward edges rather than smoothing due to internal forces. For noisier images, w_i^int should be increased relative to w_i^img. The choice of w_i^d appears to be less critical, and a small negative value was found to provide good stability.

Figure 1(c) shows the resulting mesh after the initial mesh has been deformed to fit the 3D image of the prostate.

3. Editing

In some cases, the initialization procedure may place some mesh vertices far from the actual prostate boundary. These vertices may not be driven toward the actual boundary by the deformation process, because image-based forces only act over short distances from edges. Thus, we have included an editing procedure that allows the user to edit the mesh by dragging a point on the mesh to its desired location. The mesh can be re-deformed after editing. In order to keep the mesh smooth when a point is dragged, points within a sphere of a user-defined radius centered at the displaced vertex are automatically deformed using the thin-plate spline transformation. The extent of smoothing is determined by the radius of the sphere, with a larger radius affecting more neighboring vertices. If necessary, the displaced points on the edited mesh can be clamped, i.e., prevented from moving during deformation, and the entire mesh can be re-deformed to obtain a better fit to the actual prostate boundary.
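The editing idea, propagating a dragged vertex's displacement to its neighbors within a sphere, can be sketched as follows. Note that the paper warps the neighborhood with a thin-plate spline; as a simpler stand-in, this hypothetical sketch uses a smooth cosine falloff that decays to zero at the sphere boundary, which illustrates the radius-controlled smoothing but is not the authors' method.

```python
import numpy as np

def edit_mesh(vertices, picked, displacement, radius):
    """Drag vertex `picked` by `displacement`, smoothly carrying along
    all vertices within `radius` of it (cosine falloff, weight 1 at the
    picked vertex, 0 at the sphere boundary)."""
    vertices = np.asarray(vertices, dtype=float).copy()
    d = np.linalg.norm(vertices - vertices[picked], axis=1)
    inside = d < radius
    weight = 0.5 * (1.0 + np.cos(np.pi * d[inside] / radius))
    vertices[inside] += weight[:, None] * np.asarray(displacement, dtype=float)
    return vertices
```

A larger radius carries along more neighbors, mirroring the paper's observation that the sphere radius controls the extent of smoothing.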

FIG. 2. Schematic diagram showing the mesh geometry used to calculate the surface tension at vertex i with coordinates p_i on the mesh. Bold lines indicate all the edges attached to vertex i. This geometry is used to calculate the internal force at vertex i (f_i^int), which is given by the normal component of the vector sum of each normalized edge vector connected to the vertex.


Figure 3 shows an example where the segmented surface had to be edited. For clarity, only 2D cross sections through the image and the mesh are shown, although both are three dimensional. Figure 3(a) shows that most of the initial model closely follows the actual prostate boundary except in areas indicated by the arrows. Figure 3(b) shows that after deformation, the model follows the actual prostate boundary well except in the two indicated areas, where it was attracted to other edges. Figure 3(c) shows the mesh after the user edited the two points indicated by the arrows. After re-deformation, the contour in Fig. 3(d) fits the prostate boundary well.

B. Prostate images

Six prostate images of patients who were candidates for brachytherapy were acquired using a 3D transrectal ultrasound (TRUS)22 system developed in our laboratory. The system consists of three elements: a conventional ultrasound machine with a biplane transrectal ultrasound transducer (Aloka 2000); a microcomputer with a frame-grabber; and a motor-driven assembly to hold and rotate the transducer. The transducer probe was mounted in a probe-holder assembly and rotated around its long axis by a computer-controlled motor. One hundred 2D B-mode images were collected as the transducer was rotated at constant speed through 180° in 8 s. A 3D image was then reconstructed from this series of 2D images and was ready for viewing about 1 s after the acquisition was completed.23–25

C. Evaluation of segmented boundaries

1. Manual segmentation

In order to evaluate the performance of the algorithm, the surfaces segmented using the algorithm were compared to surfaces manually outlined by a technician. The technician was trained by a radiologist and has several years of experience in analyzing ultrasound images of the prostate; the technician is representative of users in a radiological oncology department. Manual segmentation was performed using a multiplane reformatting image display tool.23 The 3D prostate images were resliced into a set of transverse parallel 2D slices along the length of the prostate, at 2 mm intervals at mid-gland and at 1 mm intervals near the ends of the prostate, where the prostate shape changes more rapidly from one slice to the next. The number of 2D slices for the prostates used in our segmentation studies ranged from 18 to 33, as tabulated in Table I. The table also shows the total number of points in each manual prostate segmentation; the total number of points in the algorithm-segmented meshes was 994. The prostate boundary was outlined in each slice, resulting in a stack of 2D contours; the contours were then tessellated into a 3D meshed surface using the NUAGES software package.26 Manual outlining of the prostates from the 2D

FIG. 3. A cross section of the 3D prostate image and segmented contour to demonstrate the editing process. (a) Initial outline; (b) after the first deformation; (c) two vertices (indicated by arrows) are edited to the actual boundary of the prostate; (d) after the second deformation.

TABLE I. The number of 2D slices and the number of points in the manually outlined prostate boundaries.

Prostate    Number of slices    Number of points
1           33                  813
2           18                  399
3           22                  507
4           32                  714
5           20                  388
6           18                  377


slices required about 1–1.5 h for each prostate. Figure 4(a) shows an example of the stack of slices, and Fig. 4(b) shows the mesh produced by manual segmentation for the same prostate as shown in Fig. 1.

2. Accuracy

a. Local measures. The signed distance d_j between each point j (j = 1, 2, ..., N) on the algorithm-segmented surface and the corresponding point on the manually outlined surface was used as a local measure of the algorithm's performance. As shown in Fig. 5, to compute d_j, the centroid of the algorithm-segmented surface was first calculated. Radial lines were then drawn from the centroid projecting outward to each point on the algorithm-segmented surface, and d_j was computed as the difference from a point on the algorithm-segmented mesh to the corresponding point on the manual outline. The corresponding point was defined as the intersection point of the radial line with the manual outline. d_j is a signed value: it is positive when the point on the algorithm mesh is outside the manual outline and negative otherwise. The absolute difference |d_j| of each point was color-mapped on the surface of the algorithm-segmented mesh to provide a visual representation showing where the larger differences between algorithm and manual segmentations occur.
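Once the ray from the centroid through an algorithm vertex has been intersected with the manual surface, the signed distance reduces to a difference of radial distances. A minimal sketch (the function name is an assumption; how the ray-surface intersection `manual_hit` is found depends on the mesh representation and is not shown here):

```python
import numpy as np

def signed_distance(centroid, alg_point, manual_hit):
    """Signed local difference d_j.

    `alg_point` is point j on the algorithm-segmented surface and
    `manual_hit` is the intersection of the ray from `centroid`
    through `alg_point` with the manual surface.  d_j > 0 when the
    algorithm point lies outside the manual surface.
    """
    centroid = np.asarray(centroid, dtype=float)
    r_alg = np.linalg.norm(np.asarray(alg_point, dtype=float) - centroid)
    r_man = np.linalg.norm(np.asarray(manual_hit, dtype=float) - centroid)
    return r_alg - r_man
```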

b. Global measures. The global performance of the 3D segmentation algorithm was evaluated by computing the mean difference (MD), giving a measure of the segmentation bias; the mean absolute difference (MAD), giving a measure of the segmentation error; and the maximum difference (MAXD),14 giving a measure of the maximum error between algorithm and manual segmentations for each prostate k (k = 1, 2, ..., 6):

MD_k = (1/N) Σ_{j=1}^{N} d_j,          (11a)

MAD_k = (1/N) Σ_{j=1}^{N} |d_j|,          (11b)

MAXD_k = max_j |d_j|.          (11c)

The global performance of the algorithm was also assessed by calculating the percent difference, ΔV_k%, in prostate volume computed from the manually segmented surface (V_m,k) and that computed from the algorithm-segmented surface (V_a,k) for prostate image k (k = 1, 2, ..., 6):

ΔV_k% = (V_m,k − V_a,k) / V_m,k × 100.          (12)

Based on the measures of each prostate, the average MD, MAD, MAXD, and ΔV% of the six prostates were calculated. These values are useful for algorithm development and optimization.27

3. Variability

Since the algorithm requires the user to initialize the process by manually selecting six points, variable final boundaries may occur. Thus, we also studied the variability in the algorithm segmentation and compared it to manual segmentation variability. One of the six prostate images was segmented ten times by the same technician using both the manual and semiautomatic segmentation techniques. The variability in the manual and the semiautomatic segmentation was calculated using the following three steps: (1) determining an average segmentation mesh for each set of ten meshes; (2) calculating the difference of each individual mesh to the average mesh; and (3) computing the standard deviation of the mesh distribution locally about the surface of the average boundary.

After obtaining the average boundaries and local variability of both manual and algorithm boundaries, a t-test was used to determine whether the difference of corresponding points between manual and algorithm segmentations was significant.

a. Generating the average mesh. In order to generate an average mesh from a set of M meshes x_1, x_2, ..., x_M (M = 10 in this study), the first mesh x_1 was taken as an initial mesh. For each vertex p_1i (i = 1, ..., N, where N is the total number of vertices in the mesh) of x_1, the corresponding points p_2i, p_3i, ..., p_Mi in the meshes x_2, x_3, ..., x_M were found by using the same method explained in Sec. II B 2. Note that the double-subscript notation (e.g., p_ji) in this section and the next two refers to vertex i of mesh j. A vertex p̄_i on the average mesh x̄ was determined by computing the average of all corresponding points p_1i, p_2i, p_3i, ..., p_Mi:

p̄_i = (1/M) Σ_{j=1}^{M} p_ji.          (13)

FIG. 4. An example of a manually segmented prostate superimposed on a sagittal section of the prostate. (a) The stack of 2D contours resulting from manually outlining the prostate in sequential 2D slices. (b) The 3D mesh generated from the stack of 2D contours.

FIG. 5. Comparing two boundaries by determining the difference d_j from the corresponding points. First the centroid of the algorithm boundary is determined, and then a ray is projected from the centroid to point j on the algorithm boundary. The corresponding point on the manual boundary was defined as the point where the ray intersects the manual boundary. The difference d_j is the signed distance from the point on the algorithm boundary to the corresponding point on the manual boundary.

b. Difference to the average mesh. The difference d_ji of the corresponding point in each mesh x_1, x_2, ..., x_M to p̄_i in the average mesh x̄ was calculated; then the average difference at p̄_i in the average mesh x̄ was found using

d̄_i = (1/M) Σ_{j=1}^{M} d_ji.          (14)

c. Standard deviation. For vertex p̄_i on the average mesh x̄, the standard deviation of the distribution of differences was calculated using

σ_i = sqrt[ (1/(M − 1)) Σ_{j=1}^{M} (d_ji − d̄_i)^2 ].          (15)

The standard deviation σ_i at each vertex on the average mesh was mapped on the average mesh surface to produce a visual display of local variability.
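Given the matrix of per-mesh, per-vertex differences, Eqs. (14)-(15) are a mean and a sample standard deviation over the mesh axis; a sketch (the function name and the (M, N) array layout are assumptions):

```python
import numpy as np

def variability(distances):
    """Per-vertex mean and standard deviation, Eqs. (14)-(15).

    `distances` is an (M, N) array: distances[j, i] is d_ji, the
    difference of mesh j to the average mesh at vertex i.
    """
    d = np.asarray(distances, dtype=float)
    mean = d.mean(axis=0)              # Eq. (14)
    std = d.std(axis=0, ddof=1)        # Eq. (15): 1/(M-1) normalization
    return mean, std
```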

III. RESULTS

A. Accuracy

The quality of the fit of the algorithm- and manually segmented meshes to the actual boundary of the prostate can be seen in Fig. 6. Figure 6(a) shows the algorithm mesh in the 3D ultrasound image, with (b) transverse, (c) coronal, and (d) sagittal cutting planes to show 2D cross-sectional images. Figure 6(b) shows a 2D transverse mid-gland slice of the 3D image with corresponding cross sections through the algorithm- and manually segmented meshes superimposed. Figure 6(c) shows a coronal section and Fig. 6(d) shows a sagittal section with the corresponding cross sections of the meshes. The manual and algorithm contours in Fig. 6 are similar to each other and follow the prostate boundary well in regions where the contrast is high and the prostate is clearly separated from other tissues. In regions of low signal, the actual boundary is difficult to discern, and the manual and algorithm-segmented outlines differ from each other, such as in regions near the bladder (indicated by the white arrow) and the seminal vesicle (indicated by the black arrow), as shown in Fig. 6(d). The complete semiautomatic segmentation procedure took less than 1 min on a Pentium III 400 MHz personal computer, with about 20 s taken by the deformation algorithm.

Figures 7(a) and 7(b) show two views of the algorithm segmented surface for the prostate shown in Fig. 1, with the absolute difference values |d_j| between the algorithm and manually segmented meshes mapped on the surface. The surface is shaded so that black regions correspond to |d_j| = 0 and white represents the maximum difference, which in this case was

FIG. 6. Cross sections through the image in Fig. 1 showing the algorithm segmentation (solid line) and manual segmentation (dotted line). (a) 3D ultrasound image of the prostate with transverse, coronal, and sagittal cutting planes indicated by b, c, and d, respectively, to show 2D cross-sectional images. (b) Transverse cross section of the image and the boundaries corresponding to the plane shown in (a). (c) Coronal cross section of the image and the boundaries. (d) Sagittal cross section of the image and the boundaries.

1654 Hu et al. : Prostate boundary segmentation 1654

Medical Physics, Vol. 30, No. 7, July 2003


6.59 mm. Figure 7(a) shows a view of the prostate perpendicular to the transverse plane in the direction from the base of the prostate to the apex, and Fig. 7(b) shows a view perpendicular to the sagittal plane in the direction from the patient right to left.

Table II lists the volumes calculated from the manually and algorithm segmented boundaries, and ΔV_k% determined by comparing the volumes of each prostate. ΔV_k% is positive for all prostates, indicating that manually segmented boundaries are larger than algorithm segmentations. Table II also shows the global accuracy measures MD, MAD, and MAXD of the prostates. The average and standard deviation values of the global accuracy measures MD, MAD, MAXD, and ΔV% were computed by averaging the individual values for all six prostates. The average MD was close to zero (−0.20 ± 0.28 mm), but does vary over the surface of the prostate, becoming negative in some regions and positive in others. The average MAD was 1.19 ± 0.14 mm, the average MAXD was 7.01 ± 1.04 mm, and the average value of ΔV%

was 7.16 ± 3.45. Note that the standard error of the mean (SEM) for volume measurements is less than that for currently used approaches for prostate volume determination, suggesting that the analysis of six prostates is sufficient. For instance, Tong et al.²⁸ report an intra-operator SEM of 3.6 cm³ using manual planimetry of 3D TRUS images, and 12.2 cm³ for the height-width-length method.
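As a rough sketch of how the global measures can be computed from per-vertex signed differences d_j between the algorithm and manual surfaces, the following is a reconstruction from the reported values; the signed-distance convention and the ΔV% formula are assumptions, not quotations of the methods section.

```python
import numpy as np

def global_accuracy(d, v_manual, v_algorithm):
    """MD, MAD, MAXD from signed per-vertex differences d_j (mm), plus
    the percent volume difference between manual and algorithm volumes
    (assumed definitions for illustration)."""
    d = np.asarray(d, dtype=float)
    md = d.mean()              # mean (signed) difference: can be negative
    mad = np.abs(d).mean()     # mean absolute difference
    maxd = np.abs(d).max()     # maximum absolute difference
    dv_pct = 100.0 * (v_manual - v_algorithm) / v_manual
    return md, mad, maxd, dv_pct
```

With this convention, a negative MD together with a positive ΔV% corresponds to the algorithm surface lying, on average, inside the manual one.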

B. Variability

Figures 7(c) and 7(d) show the local variability in the manually segmented boundaries, while Figs. 7(e) and 7(f) show that of the algorithm segmented boundaries. Figures 7(c) and 7(e) show the same view direction of the prostate as in Fig. 7(a), and Figs. 7(d) and 7(f) the same viewing direction as in Fig. 7(b). In these figures, the standard deviation of the local boundary distribution is mapped on the average boundary, with black regions indicating zero standard deviation and white regions representing the maximum standard deviation, which was 3.0 mm for manual segmentation and 3.39 mm for algorithm segmentation. The average standard deviation over the whole prostate surface was 0.98 ± 0.01 mm for manual segmentations, whereas it was 0.63 ± 0.02 mm for the algorithm.

Figure 8 shows the result of the t-test comparing the local standard deviation in manual segmentations to that in algorithm segmentations. The gray levels on the average algorithm segmented boundary indicate the significance of the differences between manual and semiautomatic segmentations: black regions correspond to differences that were found to be significant (P < 0.01), while white regions correspond to regions where the differences were not significant. These results showed that for 51.5% of the points comprising the algorithm average boundary, the differences from the manual average boundary were not significant.
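A per-vertex comparison of this kind can be sketched as below; Welch's two-sample t statistic is used here as a plausible stand-in, since the exact variant of the t-test is not specified in this section.

```python
import numpy as np

def welch_t(x, y):
    """Welch's two-sample t statistic comparing, at one vertex, the
    repeated-segmentation differences x (manual) with y (algorithm).
    A vertex would be shaded black in a display like Fig. 8 when the
    statistic is significant at the chosen level."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # standard error from the two (possibly unequal) sample variances
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    return (x.mean() - y.mean()) / se
```

The statistic is then compared against the critical value for the desired significance level at each vertex independently.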

FIG. 7. (a) and (b) The prostate boundary with the absolute difference between manual and algorithm segmented surfaces, |d_j|, mapped on the surface. Black indicates zero error and white the maximum error of 6.59 mm, as shown on the scale. (a) A view of the prostate perpendicular to the transverse plane in the direction from the base to the apex, and (b) a view perpendicular to the sagittal plane in the direction from the patient right to left. (c) and (d) The local standard deviation mapped on the average manually segmented prostate boundary. (e) and (f) The local standard deviation mapped on the average algorithm segmented prostate boundary. Black regions indicate zero standard deviation and white regions represent the maximum standard deviation, which was 3.0 mm for manual and 3.4 mm for algorithm segmentation. (c) and (e) The same viewing direction of the prostate as in (a); (d) and (f) the same viewing direction as in (b).

TABLE II. Comparison between manual and algorithm segmentation for the six prostates studied. ΔV_k% is the volume difference between manual and algorithm segmentations; MD_k is the mean difference, MAD_k the mean absolute difference, and MAXD_k the maximum difference of the manual and algorithm boundaries. The average and standard deviation of the global measures are also shown. SEM is the standard error of the mean.

Prostate k | Manual V_m,k (cm³) | Algorithm V_a,k (cm³) | ΔV_k% | MD_k (mm) | MAD_k (mm) | MAXD_k (mm)
1 | 36.93 | 33.63 | 8.95 | −0.45 | 1.16 | 8.45
2 | 27.36 | 26.33 | 3.75 | −0.10 | 1.03 | 5.69
3 | 22.87 | 20.80 | 9.05 | −0.32 | 1.39 | 7.04
4 | 24.22 | 22.42 | 7.41 | 0.13 | 1.03 | 6.59
5 | 23.54 | 20.85 | 11.42 | −0.48 | 1.33 | 7.58
6 | 21.58 | 21.06 | 2.39 | −0.12 | 1.17 | 7.32
Average | 26.08 | 24.18 | 7.16 | −0.20 | 1.19 | 7.01
Standard deviation | 5.65 | 5.08 | 3.45 | 0.28 | 0.14 | 1.04
SEM | 2.31 | 2.07 | 1.41 | 0.11 | 0.06 | 0.42
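The tabled values are mutually consistent under the natural reading ΔV_k% = 100 (V_m,k − V_a,k)/V_m,k and SEM = (standard deviation)/√6; a quick check, with the formulas assumed and the values taken from Table II:

```python
import math

v_manual    = [36.93, 27.36, 22.87, 24.22, 23.54, 21.58]
v_algorithm = [33.63, 26.33, 20.80, 22.42, 20.85, 21.06]

# Percent volume difference per prostate (assumed definition)
dv_pct = [100.0 * (m - a) / m for m, a in zip(v_manual, v_algorithm)]
# e.g. prostate 1: 100*(36.93 - 33.63)/36.93 ~ 8.94, matching the
# tabled 8.95 to within rounding of the volumes themselves

# SEM from the tabled standard deviation of the manual volumes
sem = 5.65 / math.sqrt(len(v_manual))   # ~ 2.31, as tabled
```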


C. Editing

The number of editing operations was recorded for each segmented image. Prostate images 1 and 6 each required two editing operations; the rest of the prostate images did not require editing.

D. Comparison of 2D and 3D segmentationalgorithms

The performance of the 3D segmentation algorithm was compared to our previously published 2D segmentation algorithm.¹⁴ Two-dimensional images were extracted from prostate image 6 and segmented using the 2D algorithm, and the 2D versions of the metrics MD, MAD, and MAXD, as described previously,¹⁴ were computed with respect to the corresponding manual outlines. Two-dimensional contours were also extracted from the 3D meshes at the corresponding locations and compared to the corresponding manual outlines. Table III lists these values for the average of two slices obtained at mid-gland and for the average of two slices obtained at the lateral margins of the prostate, where partial volume effects are significant. The results obtained at mid-gland using either algorithm are comparable to each other; however, there is a significant difference at the lateral margins. In this area, the boundaries segmented using the 3D algorithm are closer to those segmented manually.

IV. DISCUSSION

A. 3D semiautomatic segmentation algorithm

The segmentation algorithm described in this paper is faster than manual segmentation because it uses global prostate shape information and therefore requires little user input (six points) to rapidly initialize the model, followed by local refinement using an efficient deformable model. On average, the technician involved in this study spent 5-6 s viewing the image and selecting the initial points on the prostate boundary. The deformation procedure took about 20 s to localize the boundary. For results that needed to be modified by the user, editing and re-deformation could on average be completed in about 30 s. The software has not been optimized for efficiency, yet the whole algorithm currently requires less than 1 min on a 400 MHz PC to segment a prostate boundary from a 3D ultrasound image, whereas manual segmentation usually required 1-1.5 h.

The segmentation algorithm uses a 3D approach, extracting image information from the 3D image directly. Although manual segmentation and 2D automatic or semiautomatic segmentation algorithms can also provide a 3D representation of the prostate, the selection of slices from the 3D image is arbitrary and operator-controlled, which leads to increased variability. Furthermore, 2D segmentation approaches neglect correlations in anatomy between neighboring slices. Determining the prostate boundary from only one 2D image is difficult when shadows are present and at the apex and base of the prostate. The operator has to move back and forth frequently between slices to form an impression of the 3D anatomy of the prostate, a time-consuming procedure that may lead to error.

The initialization by the six points selected from the extremes of the prostate can provide a good estimate of the prostate shape under most circumstances, which guides the deformation to extract the actual prostate boundary. In principle, as few as four points should be required to estimate all of the parameters of the ellipsoid described by Eq. (1): one at

FIG. 8. Results of the t-test between manual and algorithm segmented boundaries, mapped on the algorithm average mesh. Regions where the difference is insignificant are shown in white, and significant (P < 0.01) regions are shown in black. (a) The same viewing direction of the prostate as in Fig. 7(a). (b) The same viewing direction as in Fig. 7(b).

TABLE III. Comparison of boundaries segmented using the previous 2D segmentation algorithm (Ref. 14) and the current 3D approach for prostate image 6, using the 2D versions (Ref. 14) of the metrics MD, MAD, and MAXD. The results represent the average values for two 2D image slices at mid-gland and the average of two slices at the lateral margins.

Slice | 2D algorithm: MD (mm) | MAD (mm) | MAXD (mm) | 3D algorithm: MD (mm) | MAD (mm) | MAXD (mm)
Mid-gland | 0.40 | 2.09 | 5.40 | 0.23 | 1.94 | 5.20
Margins | −2.33 | 2.35 | 5.00 | −1.20 | 1.22 | 3.30


the center of the ellipsoid and one at the end of each axis. These points would directly give the parameters (x₀, y₀, z₀), a, b, and c required in Eq. (1). However, we found that selecting the center proved to be more difficult than selecting the extremities; the six extreme points were easiest to locate. We found additional points (more than six) to be redundant. In cases where the prostate shape is very complex and difficult to represent with approximately six extremes, a better mathematical representation of the prostate shape is needed.
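Under the assumption that the six selected points are paired extremes along three roughly orthogonal axes, the initial ellipsoid parameters can be recovered as in this sketch; the axis-aligned model, the input ordering, and the function name are our illustrative choices, and Eq. (1) itself is not reproduced here.

```python
import numpy as np

def ellipsoid_from_extremes(extremes):
    """extremes: (6, 3) array of boundary points ordered
    [x_min, x_max, y_min, y_max, z_min, z_max].
    Returns the center (x0, y0, z0) and semi-axis lengths (a, b, c)."""
    e = np.asarray(extremes, dtype=float)
    # center of a symmetric set of paired extremes is simply the mean
    center = e.mean(axis=0)
    lo, hi = e[0::2], e[1::2]
    # semi-axis k: half the coordinate span between the paired extremes
    semi = np.array([(hi[k][k] - lo[k][k]) / 2.0 for k in range(3)])
    return center, semi
```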

B. Accuracy

The results of the algorithm were compared to manual segmentation using surface and volume difference measures. It is difficult to conclude whether the difference between the two methods is caused by inaccuracies in semiautomatic segmentation or in manual segmentation. Ideally, the performance of the algorithm should be compared to a gold standard, which cannot be obtained when imaging prostates in patients. Unlike some other applications, which can be tested using phantoms, existing prostate phantoms do not mimic the characteristics of the prostate and produce unrealistically good ultrasound images compared to actual prostates.

Both the MD and ΔV% measures showed that the algorithm segmented boundary is smaller than the manually segmented boundary. The algorithm assumes that edges in the image correspond to inflection points in the gray-level profile through the image. We observed that when the boundary is visible, most operators do not select the position of the boundary as corresponding to the inflection point, but tend to select points on the brighter side of the inflection point, resulting in a larger manually segmented volume. This is apparent in Figs. 6(b) and 6(c), where the manually segmented boundary (dotted) is generally larger than the algorithm segmented boundary. However, this was not observed in regions where the boundary is either difficult to discern or absent in the image [e.g., areas indicated by the arrows in Fig. 6(d)].
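The stated edge model, a boundary at the inflection point of the gray-level profile, can be sketched as below; the smoothing scale and the "steepest candidate" tie-break are illustrative choices, not the paper's implementation.

```python
import numpy as np

def inflection_edge(profile, sigma=2.0):
    """Return the index of the inflection point of a 1D gray-level
    profile: a zero crossing of the second derivative of the smoothed
    profile, choosing the candidate with the steepest slope."""
    half = int(3 * sigma) + 1
    x = np.arange(-half, half + 1)
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()                      # normalized Gaussian
    smooth = np.convolve(profile, kernel, mode="same")
    d1 = np.gradient(smooth)
    d2 = np.gradient(d1)
    # candidate inflections: sign changes of the second derivative
    idx = np.where(np.diff(np.sign(d2)) != 0)[0]
    # skip the zero-padding artifacts near the profile ends
    idx = idx[(idx > half) & (idx < len(profile) - half - 1)]
    if idx.size == 0:
        return None
    return int(idx[np.argmax(np.abs(d1[idx]))])  # steepest candidate
```

An operator who instead picks a point on the brighter side of this inflection would place the contour slightly outside the algorithm's edge, consistent with the larger manual volumes reported above.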

Comparing the average values of the measures MD, MAD, and MAXD to those previously published for our 2D algorithm, it would appear that the performance of the current 3D algorithm is worse than that of the 2D algorithm. However, the values previously reported for our 2D algorithm were for 2D image slices acquired at mid-gland, where the prostate shape is approximately elliptical and our 2D algorithm is easy to initialize, resulting in very accurate segmentation. Results for the 3D algorithm include the ends of the prostate, where we expect poorer performance due to partial volume effects, so a direct comparison between these values is misleading.

C. Variability

As stated earlier, the segmentation algorithm has a lower variability than manual segmentation. Although initializing and editing the model may introduce variability because of user intervention, the deformation automatically tracks the edges without user intervention. It can be seen from Figs. 7(e) and 7(f) that most regions of the algorithm segmented boundary are nearly black, indicating variability close to zero. In those regions, the variability is not sensitive to the initialization because the forces of the deformable model can drive the vertices to the desired edges, which are likely the actual prostate boundaries. In the regions with high variability, i.e., those mapped in lighter shades, different initializations will lead to different segmentations because the low contrast at the edges cannot always attract vertices to the same location. The high variability reflects high noise or missing edges, and segmentation in those regions is sensitive to the initialization. The t-test showed that the differences of about 51.5% of the mesh vertices in the average algorithm boundary from the average manual segmentation are insignificant.

Editing is necessary even though it may introduce higher variability, because in regions where the actual prostate boundary contrast is very weak or missing, user intervention is needed to guide the deformable model. In addition, when the prostate shape is very irregular and the initialization cannot represent the prostate well, users can use the editing tool to change the model locally before the deformation.

Variations in the parameters σ, w_i^img, w_i^int, and w_i^d affect the segmented boundaries, and a different choice can potentially be an important source of variability in boundary shape. A quantitative study of how the choice of parameters affects segmentation results would involve varying all parameters over the entire parameter space as well as varying the initialization points, and is beyond the scope of the current work. We are currently developing automatic procedures for testing the sensitivity of semiautomatic algorithms to variations in parameters.

V. CONCLUSION

A semiautomatic prostate segmentation algorithm was developed in this study. The algorithm is based on model-based initialization and local mesh refinement using a deformable model. The initialization procedure requires the user to select only six points on the prostate boundary in the 3D image. An editing tool allows the user to modify the mesh locally after deformation. The algorithm segmentation requires less than 1 min on a Pentium III 400 MHz personal computer.

The accuracy and variability of the algorithm were assessed by comparison with manual results. Generally, the algorithm segmented mesh and the manually segmented mesh are similar, except in regions where the actual prostate boundary is missing or high noise is present, and the algorithm produces less variable results than manual segmentation.

ACKNOWLEDGMENTS

The authors would like to thank Yunqiu Wang and Congjin Chen for technical assistance. Funding for this work was provided by the Canadian Institutes of Health Research (to A.F.), the Ontario Research and Development Challenge Fund (to A.F.), and by the Natural Sciences and Engineering Research Council of Canada (to H.M.L.). The third author


(A.F.) holds a Canada Research Chair Tier 1 and acknowledges the support of the Canada Research Chair Program.

APPENDIX

The thin-plate spline is a mathematical transformation based on the physical bending of a thin metal sheet. The transform²⁹ provides a smooth mapping function between two sets of homologous (source and target) points in 3D:

$$(x,y,z) \rightarrow \big( f_x(x,y,z),\, f_y(x,y,z),\, f_z(x,y,z) \big), \tag{A1}$$

where $f_x(x,y,z)$, $f_y(x,y,z)$, and $f_z(x,y,z)$ are the components of the vector-valued thin-plate spline transformation function $f(x,y,z)$ in the $x$, $y$, and $z$ directions, respectively:

$$f(x,y,z) = \left[ a_0 + a_1 x + a_2 y + a_3 z \right] + \left[ \sum_{i=1}^{n} w_i\, U\big( \| (x,y,z) - P_i \| \big) \right]. \tag{A2}$$

In Eq. (A2), the $P_i$ represent the source points. The first part of $f(x,y,z)$, in square brackets, is an affine transformation representing the behavior of $f(x,y,z)$ at infinity, and the second part is a weighted sum of the 3D root function $U(r) = \| r \|$. The root function is a fundamental solution of the biharmonic equation $\Delta^2 U = 0$, where

$$\Delta U = \left( \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} + \frac{\partial^2}{\partial z^2} \right) U.$$

$U$ satisfies the condition of minimum bending energy. To find the parameters in Eq. (A2), the following matrix equation must be solved:

$$(W \mid a)^T = L^{-1} Y, \tag{A3}$$

where $W = (w_1, w_2, \ldots, w_n)$ are the parameters of the second part of $f(x,y,z)$, $a = (a_0, a_1, a_2, a_3)$ is the vector of affine transformation parameters, and

$$L = \begin{pmatrix} K & P \\ P^T & 0 \end{pmatrix}$$

with

$$K = \begin{pmatrix} 0 & U(r_{12}) & \cdots & U(r_{1n}) \\ U(r_{21}) & 0 & \cdots & U(r_{2n}) \\ \vdots & \vdots & \ddots & \vdots \\ U(r_{n1}) & U(r_{n2}) & \cdots & 0 \end{pmatrix}, \qquad r_{ij} = \| P_i - P_j \|,$$

$$P = \begin{pmatrix} 1 & P_1 \\ 1 & P_2 \\ \vdots & \vdots \\ 1 & P_n \end{pmatrix},$$

and $Y = (V \mid 0\ 0\ 0\ 0)^T$, where $V = (v_1, v_2, \ldots, v_n)$ is the vector of the coordinates of the target points.
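A minimal NumPy sketch of setting up and solving the system in Eq. (A3) follows; all three target coordinates are solved at once, and the function and variable names are ours, not the paper's.

```python
import numpy as np

def fit_tps_3d(src, dst):
    """Fit a 3D thin-plate spline mapping the n source points src onto
    the target points dst, with U(r) = r.  Returns the kernel weights
    W (n x 3) and the affine parameters a0..a3 (4 x 3)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = len(src)
    # K[i, j] = U(r_ij) = ||P_i - P_j||; diagonal is zero as in Eq. (A3)
    K = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=2)
    P = np.hstack([np.ones((n, 1)), src])        # n x 4, rows (1, P_i)
    L = np.zeros((n + 4, n + 4))
    L[:n, :n] = K
    L[:n, n:] = P
    L[n:, :n] = P.T                              # lower-right block stays 0
    Y = np.zeros((n + 4, 3))
    Y[:n] = dst                                  # V, padded with zeros
    sol = np.linalg.solve(L, Y)
    return sol[:n], sol[n:]                      # W, affine a0..a3

def apply_tps_3d(W, A, src, pts):
    """Evaluate the fitted spline f(x, y, z) at the points pts."""
    pts = np.asarray(pts, dtype=float)
    U = np.linalg.norm(pts[:, None, :] - np.asarray(src, float)[None, :, :], axis=2)
    return np.hstack([np.ones((len(pts), 1)), pts]) @ A + U @ W
```

By construction the spline interpolates, f(P_i) = V_i exactly, which is the property that lets a template shape be warped so that its landmarks land on a chosen set of target points.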

a) Corresponding address: Department of Medical Biophysics, Medical Sciences Building, University of Western Ontario, London, Ontario N6H 5C1, Canada. Tel: (519) 661-2111 Ext. 86551; Fax: (519) 661-2123; electronic mail: [email protected]

1. L. Garfinkel and M. Mushinski, "Cancer incidence, mortality, and survival trends in four leading sites," Stat. Bull. Metrop. Insur. Co. 75, 19-27 (1994).

2. E. Silverberg, C. C. Boring, and T. S. Squires, "Cancer statistics," Ca-Cancer J. Clin. 40, 9-26 (1990).
3. F. Lee, S. T. Torp-Pederson, and R. D. Mcleary, "Diagnosis of prostate cancer by transrectal ultrasound imaging," Urol. Clin. North Am. 16, 663-673 (1989).
4. J. S. Prater and W. D. Richard, "Segmenting ultrasound images of the prostate using neural networks," Ultrason. Imaging 14, 159-185 (1992).
5. W. D. Richard, C. K. Grimmell, K. Bedigian, and K. J. Frank, "A method for 3D prostate imaging using transrectal ultrasound," Can. J. Otolaryngol. 17, 73-79 (1993).
6. R. G. Aarnink, R. J. B. Giesen, A. L. Huynen, J. J. M. C. H. de la Rosette, F. M. J. Debruyne, and H. Wijkstra, "A practical clinical method for contour determination in ultrasonographic prostate images," Ultrasound Med. Biol. 20, 705-717 (1994).
7. R. G. Aarnink, A. L. Huynen, R. J. B. Giesen, J. J. M. C. H. de la Rosette, F. M. J. Debruyne, and H. Wijkstra, "Automated prostate volume determination with ultrasonographic imaging," J. Urol. (Baltimore) 153, 1549-1554 (1995).
8. S. D. Pathak, V. Chalana, D. R. Haynor, and Y. Kim, "Edge guided delineation of the prostate in transrectal ultrasound images," Proceedings of the First Joint Meeting of the Biomedical Engineering Society and IEEE Engineering in Medicine and Biology Society, Atlanta, GA, October 1999, p. 1056.
9. W. D. Richard and C. G. Keen, "Automated texture based segmentation of ultrasound images of the prostate," Comput. Med. Imaging Graph. 20, 131-140 (1996).
10. Y. J. Liu, W. S. Ng, M. Y. Teo, and H. C. Lim, "Computerized prostate boundary estimation of ultrasound images using radial bas-relief method," Med. Biol. Eng. Comput. 35, 445-454 (1997).
11. C. K. Kwoh, M. Y. Teo, W. S. Ng, S. N. Tan, and L. M. Jones, "Outlining the prostate boundary using the harmonics method," Med. Biol. Eng. Comput. 36, 768-771 (1998).
12. C. Knoll, M. Alcaniz, V. Grau, C. Monserrat, and M. C. Juan, "Outlining of the prostate using snakes with shape restrictions based on the wavelet transform," Pattern Recogn. 32, 1767-1781 (1999).
13. C. F. Arambula and B. L. Davies, "Automated prostate recognition: A key process for clinically effective robotic prostatectomy," Med. Biol. Eng. Comput. 37, 236-243 (1999).
14. H. M. Ladak, F. Mao, Y. Wang, D. B. Downey, D. A. Steinman, and A. Fenster, "Prostate boundary segmentation from 2D ultrasound images," Med. Phys. 27, 1777-1788 (2000).
15. A. Ghanei, H. S. Zadeh, A. Ratkewicz, and F. F. Yin, "A three-dimensional deformable model for segmentation of human prostate from ultrasound images," Med. Phys. 28, 2147-2153 (2001).
16. J. D. Gill, H. M. Ladak, D. A. Steinman, and A. Fenster, "Accuracy and variability assessment of a semiautomatic technique for segmentation of the carotid arteries from three-dimensional ultrasound images," Med. Phys. 27, 1333-1342 (2000).
17. Y. Chen and G. Medioni, "Description of complex objects from multiple range images using an inflating balloon model," Comput. Vis. Image Underst. 61, 325-334 (1995).
18. F. Solina and R. Bajcsy, "Recovery of parametric models from range images: The case for superquadrics with global deformation," IEEE Trans. Pattern Anal. Mach. Intell. 12, 131-147 (1990).
19. W. J. Schroeder, K. M. Martin, L. S. Avila, and C. C. Law, The VTK User's Guide (Kitware, New York, 1998).
20. S. Lobregt and M. A. Viergever, "A discrete dynamic contour model," IEEE Trans. Med. Imaging 14, 12-24 (1995).
21. T. McInerney and D. A. Terzopoulos, "A dynamic finite element surface model for segmentation and tracking in multidimensional medical images with application to cardiac 4D image analysis," Comput. Med. Imaging Graph. 19, 69-83 (1995).
22. S. Tong, D. B. Downey, H. N. Cardinal, and A. Fenster, "A three-dimensional ultrasound prostate imaging system," Ultrasound Med. Biol. 22, 735-746 (1996).
23. A. Fenster and D. B. Downey, "3D ultrasound imaging: A review," IEEE Eng. Med. Biol. Mag. 15, 41-51 (1996).
24. A. Fenster, D. B. Downey, and H. N. Cardinal, "Three-dimensional ultrasound imaging," Phys. Med. Biol. 46, R67-R99 (2000).
25. T. R. Nelson, D. B. Downey, D. H. Pretorius, and A. Fenster, Three-Dimensional Ultrasound (Lippincott-Raven, Philadelphia, 1999).
26. B. Geiger, "Three-dimensional modeling of human organs and its application to diagnosis and surgical planning," Institut National de Recherche en Informatique et Automatique Report No. 2105, 1993.
27. F. Mao, J. Gill, and A. Fenster, "Segmentation of carotid artery in ultrasound images: Method development and evaluation technique," Med. Phys. 27, 1961-1970 (2000).
28. S. Tong, H. N. Cardinal, R. F. McLoughlin, D. B. Downey, and A. Fenster, "Intra- and interobserver variability and reliability of prostate volume measurement via two-dimensional and three-dimensional ultrasound imaging," Ultrasound Med. Biol. 24, 673-681 (1998).
29. F. L. Bookstein, "Principal warps: Thin-plate splines and the decomposition of deformations," IEEE Trans. Pattern Anal. Mach. Intell. 11, 567-585 (1989).
