Internal Report 1202 University of Siena

Augmenting Fetus Perception through Haptic Interfaces

D. Prattichizzo a, F.M. Severi b, F. Barbagli a, C. Bocchi b, A. Vicino a, F. Petraglia b

a Dipartimento di Ingegneria dell'Informazione, Università di Siena, via Roma 56, 53100 Siena, Italia — Phone: +39 0577 234609, Fax: +39 0577 233602

E-mail: prattichizzo{barbagli,vicino}@dii.unisi.it

b Dipartimento di Pediatria, Ostetricia e Medicina della Riproduzione, Università di Siena, viale Bracci, 53100 Siena, Italia — Phone: +39 0577 233463, Fax: +39 0577 233454

E-mail: severi{bocchicate,petraglia}@unisi.it

December 3, 2002

Abstract:
Objective: The aim of this study is to design a Virtual Reality (VR) workstation for visual and kinesthetic interaction of parents and physicians with a 3D model of the fetus built through conventional ultrasound equipment.
Methods: Haptic interface design and control is an emerging research area which allows physical interaction with virtual objects through the sense of touch. Starting from a sequence of 2D ultrasound images of the fetus, a VR model is built to allow visual and kinesthetic interaction with the fetuses of 12 pregnancies.
Results: A software package has been developed and then integrated with a haptic and a visual interface to build a VR workstation, with which many tests have been executed; the pregnant women enjoyed the possibility of touching their future children in addition to seeing their body features.
Conclusions: The validation tests with the pregnant women were encouraging, motivating future research to improve the sense of realism involved in the fetus touch experience.

Key Words: Ultrasound, 3D-US, Obstetrics, Haptic Interface, Virtual Reality.

A new application, based on virtual reality and robotics technologies, that allows users (parents and physicians) to interact with a digital model of a fetus has been investigated.

In the last twenty years, ultrasound equipment has been used massively in gynecology and obstetrics [3]. The success of ultrasonography is mainly due to its non-invasive nature. The compliance, rapidity and low cost of this technique, together with all the information that can be obtained from the detection of several morphologic and functional alterations involving both the fetus and the internal female genitalia, have made this methodology a fundamental tool in the diagnosis of several gynecologic and obstetric pathological conditions.

In this field there is nowadays a new imaging technique: three-dimensional ultrasound (3D-US). This imaging technology allows two-dimensional scans of the same object to be gathered and displayed with spatial orientation, as a volume. The possible clinical applications of 3D-US, not yet completely investigated, concern both the evaluation of normal and abnormal female internal genitalia and the physiological and pathological development of the fetus during pregnancy.

Medical ultrasound imaging is inherently tomographic, thus providing all the information necessary for 3D reconstruction. The typical low-cost solution for 3D ultrasound consists of an acquisition system which provides many 2D ultrasound scans whose relative position and orientation are assumed to be known. Typically, 3D reconstruction algorithms assume that the ultrasound probe moves along a linear trajectory with constant velocity. Other solutions, such as sensors placed on the US transducer [10, 12] or a mechanical probe motion, have been designed to enhance 3D reconstruction performance.
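The constant-velocity assumption can be sketched as follows. This is an illustration only, not code from the system described here; the probe speed and frame rate are hypothetical parameters.

```python
def slice_positions(n_slices, probe_speed_mm_s, frame_rate_hz):
    """Z offset (in mm) of each 2D scan along the probe path, under the
    constant-velocity assumption: consecutive frames are equally spaced."""
    dz = probe_speed_mm_s / frame_rate_hz  # spacing between consecutive slices
    return [i * dz for i in range(n_slices)]

def stack_slices(slices, probe_speed_mm_s, frame_rate_hz):
    """Attach a z coordinate to each 2D scan so that the sequence can be
    treated as an ordered volume (each 'slice' is any 2D pixel array)."""
    zs = slice_positions(len(slices), probe_speed_mm_s, frame_rate_hz)
    return list(zip(zs, slices))
```

When probe position is measured instead (e.g. by a magnetic sensor as in [10]), the measured positions simply replace the computed ones.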

The aim of this study was to integrate the conventional ultrasound system into a Virtual Reality environment. The ideal VR application is one in which the user is fully immersed in a digital world with which the operator can interact using the senses in a highly realistic manner. The applications of VR systems are constantly increasing, medical applications among them. As an example, many research activities deal with VR-based systems employed in training for special surgical techniques such as cholecystectomy [17], arthroscopy [8], endoscopy [6] and anesthesiology [4].

To the authors' knowledge, however, the link between ultrasound imaging and VR has not been deeply investigated yet. An integration of ultrasound techniques in a VR context, functional to visual and kinesthetic interaction, can improve the mother-fetus relationship during pregnancy and, in gynecological pathology, could for instance allow a better comprehension of the consistency of pelvic masses.

Sight is the sense that has been studied most extensively in VR systems. Computer graphics has been a field of research for decades, and digital images have reached an impressive level of realism. On the other hand, the sense of touch is a recent and increasingly important field of research in robotics.

Haptics (from the Greek haptesthai, meaning of, or related to, touch) has become the standard term for studies and technologies related to the sense of touch. As pointed out in [5], haptic feedback involves two cognitive senses: the tactile sense, which gives an awareness of stimuli on the surface of the body, and the kinesthetic sense, which provides information on body position and movement. The former is based on receptors in the skin which detect temperature, pressure and pain; the latter refers to receptors which detect forces applied to joints and muscles as well as joint position.

While haptics, as defined above, refers to the sense of touch in its completeness, it is hard to find devices capable of displaying both tactile and kinesthetic information to the user. In practice, the robotics research community normally refers to devices capable of kinesthetic feedback as haptic devices, while tactile devices are those capable of exerting tactile cues on the operator.

The aim of the present study consisted in developing a software tool that extracts from conventional ultrasound systems a 3D digital model of the fetus to be touched using a haptic device.


Materials and Methods

The visual and haptic 3D rendering ultrasound system allows image slices to be gathered into volumes, which are then processed to reveal the 3D fetus surface. To obtain a 3D ultrasound reconstruction, 2D images are first segmented and then processed to extract contours. In the literature, many techniques are available for contour extraction, such as the "active contours" or "snakes" algorithms [1, 2, 11]. In this work, the "Marching cubes" algorithm for 3D isosurface generation has been used. Marching cubes is a surface-fitting algorithm for visualizing 3D graphics. It was designed by Lorensen and Cline [7] to extract surface information from a 3D field of values. The surface is constructed according to the following simple principle: if a point inside the desired volume has a neighboring point outside the volume, the isosurface must lie between these points. This analysis is performed at the voxel level.
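The inside/outside principle behind Marching cubes can be sketched for a single row of voxel values. This is a simplified one-dimensional illustration of the crossing test, not the full algorithm, which applies the same idea to all twelve edges of each voxel cube:

```python
def crossing_points(scalar_line, iso):
    """Locate isosurface crossings along one row of voxels: if one sample
    is inside (>= iso) and its neighbour is outside (< iso), the surface
    lies between them; linear interpolation refines the position."""
    pts = []
    for i in range(len(scalar_line) - 1):
        a, b = scalar_line[i], scalar_line[i + 1]
        if (a >= iso) != (b >= iso):          # sign change across the edge
            t = (iso - a) / (b - a)           # fraction of the way to i+1
            pts.append(i + t)
    return pts
```

In the 3D case, the pattern of inside/outside corners of each cube selects one of 256 precomputed triangulations, and the interpolated edge points become the triangle vertices.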

Note that surface extraction alone from the 3D volume of the fetus is not sufficient to establish a realistic haptic interaction. The fetus surface must be processed and arranged in a way that guarantees the stability of the haptic interaction. Wrong mathematical models of the surface lead to annoying vibrations of the haptic interface, thus destroying any sense of realism in the kinesthetic interaction.

Starting from the 3D graphic model of the fetus, a VR model, which is then used for the kinesthetic interaction as well as the visual interaction, has been developed. The forces applied by the operator on the fetus are computed, according to a visco-elastic model, through geometric techniques based on the 3D model of the fetus as well as the position of the haptic interface. Such forces are fed to both the fetus model and the haptic interface, thus creating a realistic sensation of physical contact between the operator and the fetus model itself.
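A minimal sketch of such a visco-elastic (spring-damper) contact law follows; the gains are illustrative values, not those used in the system described here:

```python
def contact_force(penetration_depth, penetration_rate, k=300.0, b=2.0):
    """Visco-elastic (Kelvin-Voigt) contact model: the force pushing the
    haptic handle out of the surface grows with the penetration depth
    (spring term) and the penetration rate (damping term)."""
    if penetration_depth <= 0.0:
        return 0.0  # no contact, no force
    return k * penetration_depth + b * penetration_rate
```

The damping term is what distinguishes a visco-elastic model from a pure spring: it dissipates energy during contact, which helps keep the haptic loop stable.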

The functional scheme of the visual and haptic interaction is reported in Fig. 1, where the user interacts with the 3D model of the fetus through sight and touch. The surface model of the fetus is built from the 2D ultrasound scans, passing through image processing and segmentation stages. Visual and force signals are fed back to the operator, who can explore the overall VR model of the fetus by moving the haptic interface with his/her hand.

The acquisition of the 2D ultrasound images has been performed by a real-time ultrasound equipment with a transabdominal convex probe running at 3.5 MHz.

The haptic interface considered is referred to as an impedance display [18] and is characterized by low inertia and high back-drivability.

The most successful device to date is the PHANToM (Personal Haptic iNTerface Mechanism) (Fig. 2), which was originally developed by T. Massie and K. Salisbury at MIT in 1992 and subsequently commercialized by SensAble Technologies [16]. The base version of the PHANToM is a serial device with three actuated degrees of freedom capable of exerting a direct force at the extremity of the last link. The end-effector, a stylus that can be held by the operator or a thimble for one's fingertip, can be oriented freely around the center of a passive spherical joint. The PHANToM has become the standard for the haptics community due to its strong performance and to the fact that it has been the only three-degrees-of-freedom (DoF) desktop device commercially available for almost a decade.

At each time instant the haptic device detects its joint positions using sensors and computes the user's hand position in the VR using the robot kinematics. The hand position in the VR is then used to compute collisions between the avatar, representing the user inside the VR, and the fetus model. Interaction forces between the user and the virtual object are computed and returned to the haptic device [5, 9].
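The sensing-to-actuation cycle just described can be sketched as follows. A planar two-link arm stands in for the actual device kinematics and a flat virtual surface stands in for the fetus model; all link lengths and gains are hypothetical:

```python
import math

def forward_kinematics(q1, q2, l1=0.2, l2=0.2):
    """Planar 2-link arm as a stand-in for the device kinematics:
    maps joint angles (rad) to the handle position in the VR frame."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

def haptic_step(read_joints, render_force, surface_y=0.0, k=500.0):
    """One cycle of the loop: sense joints, locate the avatar,
    test collision against a flat surface, command a force."""
    q1, q2 = read_joints()
    x, y = forward_kinematics(q1, q2)
    depth = surface_y - y                 # penetration below the surface
    f = k * depth if depth > 0 else 0.0   # stiffness-only reaction force
    render_force(f)
    return (x, y), f
```

In a real device this loop runs at a high, fixed rate (typically around 1 kHz), with `read_joints` and `render_force` bound to the encoder and motor drivers.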

The haptic rendering algorithm has been developed for a three-DoF desktop device which is able to simulate a single-point-of-contact interaction with the fetus. One of the main problems addressed is the calculation of the reaction forces between the fetus model and the user. An algorithm similar to the god-object or proxy algorithm [19, 13] has been used. The god-object is an algorithm which allows a very realistic generation of the forces arising from touching the fetus. The god-object- and proxy-like algorithms owe their success to their simplicity and their high performance.
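For a flat surface, the proxy idea reduces to a few lines. This is a deliberately simplified sketch; the published god-object and proxy algorithms [19, 13] constrain the proxy to arbitrary polygonal meshes, not a single plane:

```python
def proxy_update(device_z, surface_z=0.0):
    """Proxy (god-object) update for a flat surface: the proxy follows
    the device point freely in free space, but is held on the surface
    once the device point penetrates it."""
    if device_z >= surface_z:
        return device_z      # free motion: proxy snaps to the device
    return surface_z         # contact: proxy constrained to the surface

def proxy_force(device_z, proxy_z, k=400.0):
    """Reaction force: a virtual spring stretched between the proxy and
    the device point. Zero in free space, repulsive in contact."""
    return k * (proxy_z - device_z)
```

Because the proxy never enters the object, the rendered force always points out of the surface, which avoids the force discontinuities that plague naive penetration-based schemes on thin or concave geometry.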

To validate the overall work, the developed workstation with visual and haptic feedback has been tested by having 12 pregnant women interact with their fetuses through the senses of sight and touch.

Results

The developed VR system is a software tool which takes as input the 2D ultrasound scans and gives as output a 3D VR model of the fetus able to render haptic and visual feedback. The software package is then integrated with the haptic interface PHANToM and a computer graphics workstation for visual feedback to build the VR workstation in Fig. 2.

The software package for the visual and haptic 3D rendering has been designed in C++ in an object-oriented setting and is portable across many platforms (e.g. Windows and Unix). The 3D rendering system allows 2D image slices to be gathered into volumes, which are then processed to reveal the 3D fetus surface. The image processing and visualization procedures have been developed with the Visualization Toolkit [14]. The software application takes as input the ultrasound 2D scans and the position of the ultrasound probe if available; otherwise, a constant velocity of the acquisition probe is assumed in processing the sequence of 2D ultrasound scans. An interesting feature of our VR system is that it is able to process 2D ultrasound scans in standard formats (such as DICOM), thus allowing our system to be used with most mass-market ultrasound equipment.

The ultrasound system that currently works with our VR workstation is the Sonoline Elegra Millenium Edition by Siemens Medical Solutions USA. To obtain a 3D ultrasound reconstruction, 2D images are first segmented and then processed with the "Marching cubes" algorithm for 3D isosurface generation [7]. Fig. 3 is a screenshot of the main user interface of the developed software. All the information on voxel dimensions is reported in the lower-right part of Fig. 3.

Starting from the sequence of 2D ultrasound images in the second upper window of Fig. 3, the system is able to gather the fetus volume; in particular, the two planar sagittal and coronal projections are obtained from the 3D volume as in Fig. 3. The three (coronal, sagittal and axial) planes are rendered together in the first lower-left window of Fig. 3.

One of the most important problems encountered in ultrasound systems is that the image quality is poor. Due to the nature of acoustic propagation in biological tissue, the images are corrupted by speckle noise, shadow and distortion effects, multiple echoes and other artifacts. Advanced image processing techniques are therefore required. In our system, many tools are available to refine the segmentation and filtering phases of the 2D ultrasound scans. The user can adjust the standard deviation and radius parameters for the filtering actions, as well as some brightness parameters (lower part of Fig. 3).
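A Gaussian smoothing filter of the kind controlled by a standard-deviation and a radius parameter can be sketched as follows. This is an illustration only; the actual system relies on the filters provided by the Visualization Toolkit [14]:

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete, normalized Gaussian kernel: 'sigma' and 'radius' mirror
    the standard-deviation and radius parameters exposed to the user."""
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def smooth(row, sigma=1.0, radius=2):
    """Convolve one image row with the kernel (edge samples clamped),
    attenuating high-frequency speckle noise."""
    k = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - radius, 0), len(row) - 1)
            acc += w * row[idx]
        out.append(acc)
    return out
```

A larger standard deviation suppresses more speckle at the cost of blurring the tissue boundaries that the segmentation stage must later find, which is why these parameters are left to the user.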

For a particular sequence of 2D ultrasound scans, the system generated the VR model reported in Fig. 4. This surface model is then used for haptic and visual interaction with the user in a system layout like the one depicted in Fig. 2.


It is worthwhile to mention that the haptic interaction may involve a loop stability problem. In fact, the performance of the haptic rendering algorithm is strongly dependent on the number of polygons constituting the surface model of the fetus and on the particular type of collision detection algorithm being used. Tools to manage these issues have been included in the software package, as shown in the lower part of Fig. 4.
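The trade-off can be made concrete with a back-of-the-envelope bound, where all figures are hypothetical: with brute-force collision checks, the per-triangle tests must fit inside a fraction of the fixed haptic servo cycle, which caps the usable polygon count of the surface model:

```python
def max_triangles(servo_rate_hz=1000.0, test_cost_us=0.25, budget_fraction=0.5):
    """Rough bound on the triangle budget: brute-force collision tests
    must fit inside a fraction of one haptic servo cycle. The servo rate,
    per-triangle cost and budget fraction are illustrative figures."""
    cycle_us = 1e6 / servo_rate_hz          # one haptic cycle, in microseconds
    return int(budget_fraction * cycle_us / test_cost_us)
```

In practice, spatial data structures (bounding-volume hierarchies, voxel grids) raise this bound dramatically, which is why mesh simplification and collision-structure choices are exposed as tools in the package.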

Discussion

The early mother-infant relationship is of critical importance, because it forms the basis for the child's future social, emotional and cognitive development.

There is evidence suggesting that maternal sensitivity and affection have their origins in pregnancy. Theoretical accounts of the psychology of pregnancy suggest that there is a growing affection for, and relationship with, the unborn child during pregnancy. This relationship increases gradually as pregnancy progresses. However, a rapid increase is observed after the first perception of fetal movements [15].

It is easy to imagine how touching one's own fetus, even if only in a virtual way, can foster this growing affection during each ultrasound examination. This new methodological approach not only seems to improve the mother-fetus relationship, but also throws light upon future possibilities of understanding the physical shape of fetal tissue on the basis of its ultrasound characteristics.

Work is in progress to improve the sense of realism involved in the fetus touch experience. In particular, work is in progress to attach the fetus texture to the VR model for visual rendering and to add a realistic sense of friction while sliding along the fetus surface.

Future developments of the VR workstation will allow the user to feel the heartbeat of the unborn child during pregnancy. This function will be implemented by transforming the direct measure of the heartbeat into a force signal to be actuated by the haptic interface.
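One possible realization, offered as a hypothetical sketch rather than the authors' implementation, is to map the measured heart rate to a periodic force pulse for the device:

```python
import math

def heartbeat_force(t, rate_bpm=140.0, amplitude_n=0.8):
    """Turn a measured fetal heart rate into a pulsating force (N) for
    the haptic device: one raised-cosine pulse per beat. The pulse shape,
    rate and amplitude are illustrative assumptions."""
    period = 60.0 / rate_bpm               # seconds per beat
    phase = (t % period) / period          # position within the beat, 0..1
    return amplitude_n * 0.5 * (1.0 - math.cos(2.0 * math.pi * phase))
```

Sampling this function at the haptic servo rate and adding it to the contact force would superimpose the pulsation on the rendered surface.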

More interestingly, we are also trying to estimate the fetus compliance from the voxel gray scale of the ultrasound images. Such an estimate will be used to set the visco-elastic parameters of the 3D VR model for visual and haptic interaction.
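A hypothetical linear mapping of this kind could look as follows; both the mapping and its gains are illustrative assumptions, not the estimation model under development:

```python
def stiffness_from_gray(gray, g_min=0, g_max=255, k_soft=50.0, k_hard=800.0):
    """Map a voxel gray level to a contact stiffness (N/m): brighter
    (more echogenic) voxels are rendered stiffer. Linear interpolation
    between two illustrative stiffness extremes, with input clamping."""
    g = min(max(gray, g_min), g_max)
    t = (g - g_min) / float(g_max - g_min)   # normalized gray level, 0..1
    return k_soft + t * (k_hard - k_soft)
```

The per-voxel stiffness would then replace the single global spring gain in the visco-elastic contact model, so that soft and hard fetal tissue feel different under the haptic stylus.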

Acknowledgments

The authors would like to thank the student Berardino La Torre of the Engineering Faculty (Siena) for his invaluable support in developing the system and Prof. Santina Rocchi for her constructive suggestions. Finally, the authors would like to thank all the pregnant women who patiently supported us during this study and gave permission to publish the data.

References

[1] H.H.S. Ip, R. Hanka, and T. Hongying. Segmentation of the aorta using a temporal active contour model with regularization scheduling. In Proc. Int. Soc. Opt. Eng., volume 3034, 1997.

[2] A. Yezzi, S. Kichenassamy, A. Kumar, P. Olver, and A. Tannenbaum. A geometric snake model for segmentation of medical imagery. IEEE Trans. Med. Imag., 16, 1997.

[3] K. Baba. Basis and principles of three-dimensional ultrasound. In Three-Dimensional Ultrasound in Obstetrics and Gynecology, pages 1–20. Carnforth: Parthenon Publishing, 1997.

[4] D. Blezek, R. Robb, J. Camp, L. Nauss, and D. Martin. Simulation of spinal nerve blocks for training anesthesiology residents. In SPIE Conf. on Surgical-Assist Systems, San Jose, CA, 1999.

[5] V. Hayward and O. Astley. Performance measures for haptic interfaces. In G. Giralt and G. Hirzinger, editors, Robotics Research: The 7th International Symposium, pages 195–207. Springer Verlag, 1996.

[6] K. Ikuta, M. Takeichi, and T. Namiki. Virtual endoscope system with force sensation. In W. Wells, A. Colchester, and S. Delp, editors, Proc. Int. Conf. Medical Image Computing and Computer-Assisted Intervention. Springer, Berlin, 1998.

[7] W.E. Lorensen and H.E. Cline. Marching cubes: a high resolution 3D surface construction algorithm. Computer Graphics, 21(3):163–169, 1987.

[8] A. McCarthy, P. Harley, and R. Smallwood. Virtual arthroscopy training: do the virtual skills developed match the real skills required? In Medicine Meets Virtual Reality 7. IOS Press, 1999.

[9] J.B. Morrell and J.K. Salisbury. Performance measurements for robotic actuators. In Proceedings of the ASME Dynamic Systems and Control Division, volume DSC-58, 1996.

[10] N. Pagulatos, W.S. Edwards, D.R. Haynor, and Y. Kim. Interactive 3-D registration of ultrasound and magnetic resonance images based on a magnetic position sensor. IEEE Trans. Inf. Techn. in Biomed., 3(4):278–288, Dec. 1999.

[11] N. Pagulatos, W.S. Edwards, D.R. Haynor, and Y. Kim. Interactive 3-D registration of ultrasound and magnetic resonance images based on a magnetic position sensor. IEEE Trans. Inf. Techn. in Biomed., 3(4), 1999.

[12] R.W. Prager, A.H. Gee, and L. Berman. Stradx: real-time acquisition and visualization of freehand 3D ultrasound. Medical Image Analysis, 3(2):129–140, 1999.

[13] D.C. Ruspini, K. Kolarov, and O. Khatib. The haptic display of complex graphical environments. Computer Graphics, 31 (Annual Conference Series):345–352, 1997.

[14] W. Schroeder, K. Martin, and B. Lorensen. The Visualization Toolkit: An Object-Oriented Approach to 3D Graphics. Prentice-Hall Inc., 1998.

[15] A. Siddiqui and B. Hagglof. Does maternal prenatal attachment predict postnatal mother-infant interaction? Early Hum. Dev., 59:13–25, 2000.

[16] SensAble Technologies. The PHANToM system. www.sensable.com.

[17] F. Tendick et al. A virtual environment testbed for training laparoscopic surgical skills. Presence, 9(3), 2000.

[18] T. Yoshikawa, Y. Yokokohji, T. Matsumoto, and X. Zheng. Display of feel for the manipulation of dynamic virtual objects. Trans. ASME, Journal of Dynamic Systems, Measurement, and Control, 117(4):554–558, 1995.

[19] C. Zilles and J. Salisbury. A constraint-based god-object method for haptic display. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, volume 3, pages 146–151, 1995.


List of Figures

1. The visual and haptic interaction generates a closed feedback loop in the overall system.

2. The VR workstation comprising haptic and visual feedback. The hand moves the stylus of the PHANToM desktop to touch the 3D VR model of the fetus.

3. The main user-interface window of the developed software.

4. The 3D surface model of the fetus built from 2D ultrasound images and used for haptic and visual interaction.


[Figure 1 block diagram: 2D Ultrasound Images → Image Processing → Surface Model → Model Generation → Visio-Haptic Feedback]

Figure 1: The visual and haptic interaction generates a closed feedback loop in the overall system.

Figure 2: The VR workstation comprising haptic and visual feedback. The hand moves the stylus of the PHANToM desktop to touch the 3D VR model of the fetus.


Figure 3: The main user-interface window of the developed software.


Figure 4: The 3D surface model of the fetus built from 2D ultrasound images and used for haptic and visual interaction.