

  • AD

    Award Number: DAMD17-00-1-0697

    TITLE: Support for conference entitled "The Fifth PHANTOM Users Group Workshop"

    PRINCIPAL INVESTIGATOR: Karl Reinig, Ph.D.

    CONTRACTING ORGANIZATION: University of Colorado Health Sciences Center

    Aurora, Colorado 80045-0508

    REPORT DATE: February 2001

    TYPE OF REPORT: Final Proceedings

    PREPARED FOR: U.S. Army Medical Research and Materiel Command
    Fort Detrick, Maryland 21702-5012

    DISTRIBUTION STATEMENT: Approved for Public Release; Distribution Unlimited

    The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision unless so designated by other documentation.


  • REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 074-0188)

    Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction Project (0704-0188), Washington, DC 20503.

    1. AGENCY USE ONLY (Leave blank)
    2. REPORT DATE: February 2001
    3. REPORT TYPE AND DATES COVERED: Final Proceedings (1 Sep 00 - 28 Feb 01)
    4. TITLE AND SUBTITLE: Support for conference entitled "The Fifth PHANTOM Users Group Workshop"
    5. FUNDING NUMBERS: DAMD17-00-1-0697
    6. AUTHOR(S): Karl D. Reinig, Ph.D.
    7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): University of Colorado Health Sciences Center, Aurora, Colorado 80045-0508
       E-Mail: [email protected]
    8. PERFORMING ORGANIZATION REPORT NUMBER:
    9. SPONSORING / MONITORING AGENCY NAME(S) AND ADDRESS(ES): U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012
    10. SPONSORING / MONITORING AGENCY REPORT NUMBER:
    11. SUPPLEMENTARY NOTES: This report contains colored photos.
    12a. DISTRIBUTION / AVAILABILITY STATEMENT: Approved for public release; distribution unlimited
    12b. DISTRIBUTION CODE:

    13. ABSTRACT (Maximum 200 Words)

    The Fifth PHANTOM Users Group (PUG2000) brought together, in an intimate setting, participants who are actively engaged in making computer haptics practical and useful through the use of the PHANTOM Haptic Interface. While most participants were expected to be active PHANTOM users, we encouraged participation by non-PHANTOM users who bring special perspectives relevant to the community's interests.

    Although many venues for discussion of computer-based haptics have emerged in recent years, there is particular value in bringing together individuals who share the common platform embodied in the PHANTOM haptic system. In doing so, we were able to make realistic comparisons of the progress we have made in the development of the hardware, architectural, algorithmic and psychophysical components that comprise haptic systems. More importantly, we forged ways to build on each other's results and guide the development of the PHANTOM haptic system to provide for our common needs.

    14. SUBJECT TERMS: Haptics, Simulation
    15. NUMBER OF PAGES: 62
    16. PRICE CODE:
    17. SECURITY CLASSIFICATION OF REPORT: Unclassified
    18. SECURITY CLASSIFICATION OF THIS PAGE: Unclassified
    19. SECURITY CLASSIFICATION OF ABSTRACT: Unclassified
    20. LIMITATION OF ABSTRACT: Unlimited

    NSN 7540-01-280-5500    Standard Form 298 (Rev. 2-89), Prescribed by ANSI Std. Z39-18, 298-102

  • Table of Contents

    Table of Contents ................................. 3
    Introduction ...................................... 4
    Body .............................................. 4
        CHS Open House ................................ 4
        Tutorial ...................................... 11
    Key Research Accomplishments ...................... 11
        Workshop ...................................... 11
        Haptic Demonstrations ......................... 11
    Reportable Outcomes ............................... 11
    Conclusions ....................................... 11
        Budget ........................................ 11
        Future ........................................ 11
    Appendix .......................................... 13

  • Final Report: Support for conference entitled "The Fifth PHANToM Users Group Workshop"

    Award No.: DAMD17-00-1-0697

    Introduction

    At the time we applied for funding for the Fifth PHANToM Users Group Workshop (PUG2000), it seemed unlikely that it would occur, putting future PUGs in doubt as well. TATRC's funding not only kept the premier PHANToM-based haptics workshop going, it also laid the foundation for its future growth.

    PUG 2000 had 45 registered attendees. All involved in organizing the event considered this to be a remarkable success considering the late announcement and new venue. It is a tribute to the core members of the workshop. Of the eleven accepted papers, nine had indirect application to medical simulation; three had direct application. The papers served the same dual purpose as in previous PUGs. They were informative on their own and they provided fuel for offline discussion. Offline discussion is, in our opinion, the most important part of the workshop. The intimate atmosphere of the small mountain town of Aspen is particularly conducive to both cerebral and organizational networking.

    Body

    CHS Open House

    PUG 2000 began with an open-house at the University of Colorado's Center for Human Simulation (CHS). The CHS is located on the former Fitzsimons Army Medical Center in Aurora, Colorado.

    The open-house included 6 stations. The first two stations were simulator prototypes. The first of these demonstrated a 3-D knee that could be palpated, cut with a scalpel, or injected or aspirated with a needle/syringe. The workstation is shown below with the injection/aspiration mode selected and all but the bones and ligaments of the knee made transparent.


  • The following figure shows incisions produced by the simulator.


    The second simulator prototype allowed the user to practice corneal incisions on a virtual eye. The prototype keeps track of all scalpel motion and tells the user if they have started, finished, or wandered outside of an accepted tolerance during a cut. It also has a testing mode that gives the user a score. The simulator is ergonomically very similar to cutting real tissue under a microscope. We have prototyped a study, which produced encouraging results, to measure skills acquired with this simulator. We are currently seeking IRB approval to conduct a formal study. This prototype was built to demonstrate that we have developed the fundamentals required to create a phaco-emulsification simulator on which to practice cataract removal. We feel that such a simulator could have a large impact on ophthalmic training, bringing a significant benefit to society.
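The report gives no algorithmic detail for this tolerance checking. As a purely hypothetical sketch of that kind of check (none of these names or design choices come from the simulator), flagging a cut that wanders outside an accepted corridor around a planned incision might look like:

```python
import numpy as np

def max_deviation(tip_path, planned_path):
    """Worst distance from sampled scalpel-tip positions to the nearest
    sampled point on the planned incision (coarse point-to-point check)."""
    tips = np.asarray(tip_path, dtype=float)      # (n, 3) tracked tip samples
    plan = np.asarray(planned_path, dtype=float)  # (m, 3) planned incision
    d = np.linalg.norm(tips[:, None, :] - plan[None, :, :], axis=2)
    return d.min(axis=1).max()

def within_tolerance(tip_path, planned_path, tol):
    """True if the whole cut stayed inside the accepted corridor."""
    return max_deviation(tip_path, planned_path) <= tol

planned = [[0, 0, 0], [1, 0, 0], [2, 0, 0]]
cut     = [[0, 0.1, 0], [1, -0.2, 0], [2, 0.1, 0]]
assert within_tolerance(cut, planned, tol=0.25)
assert not within_tolerance(cut, planned, tol=0.15)
```

A real scoring mode would presumably weight deviations and timing rather than use a single worst-case threshold; this only illustrates the corridor test described in the text.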



    The third station showed the current state of our Explorable Virtual Human (EVH) being developed as part of a Next Generation Internet (NGI) contract with the National Library of Medicine. The EVH is an authoring/display tool for virtual anatomy. The models that we are producing from the Visible Human data have greatly improved the reality of anatomy available for display in virtual environments. Our "Solid Shells" technique gives the models the fidelity of ray-traced models while rendering at VR rates. The EVH will make these models available to a diverse audience of developers.

    The following is a screen capture of one page of the EVH as it was demonstrated at PUG 2000. The window on the right is a stereoscopic anatomic display of the knee as it flexes. The window on the left displays tightly coupled HTML. Using the EVH editor, the user selects from the available models, creates virtual camera motions, and toggles the visibility of structures as a function of time. The resulting Interactive Anatomic Animations (IAAs) are played back in conjunction with the HTML. Whatever anatomic structure the cursor covers identifies itself in the HTML window (Proximal Tibia (Right)). Questions may be inserted using the HTML and the results used to determine the next IAA. This fall, the EVH will include haptic interaction similar to the two demonstrations shown above, making it a powerful tool for the production of surgical simulator content.

  • [Screen capture of the EVH page "Essentials: Joint Overview - Joint Classification". The cursor identifies the Proximal Tibia (Right); the HTML pane poses the following questions:]

    Based on the characteristics of the three structural classes, which would you predict to be freely movable? Hint: movement requires lubrication.

    A. Fibrous - held together by fibrous tissue.
    B. Cartilaginous - united by hyaline cartilage or fibrocartilage.
    C. Synovial - characterized by the presence of a fibrous capsule lined by a synovial membrane and a joint cavity filled with synovial fluid.

    What type of functional class is the knee joint?

    A. Synarthrosis
    B. Amphiarthrosis
    C. Diarthrosis

    Of the three structural classes, which is always freely movable?

    A. Fibrous

    The fourth station demonstrated an anatomic navigator designed to help teach ultrasound anatomy to GI physicians. The "oblique maker" is an intuitive interface for the production of oblique views of Visible Human type data. The user manipulates a flat plate through 3-D models of their choosing. When they select "Update Oblique" they are given the corresponding oblique slice through the Visible Human Male. Structures in the oblique image identify themselves when selected by the mouse. The oblique maker runs over the net (http://140.226.200.110:8080/web3d). When released in May, the oblique maker will include corresponding ultrasound images.
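The report does not publish the oblique maker's implementation. As an illustrative sketch only (the function name, plane parameterisation, and nearest-neighbour sampling are all our assumptions, not the tool's), resampling an oblique slice from a volume along a user-positioned plane could look like:

```python
import numpy as np

def oblique_slice(volume, origin, u, v, size, step=1.0):
    """Sample a size x size oblique image from a 3-D volume along the
    plane spanned by orthonormal in-plane directions u and v, using
    nearest-neighbour lookup; out-of-volume samples stay zero."""
    out = np.zeros((size, size), dtype=volume.dtype)
    for i in range(size):
        for j in range(size):
            p = origin + i * step * u + j * step * v   # point on the plate
            idx = np.round(p).astype(int)              # nearest voxel
            if all(0 <= idx[k] < volume.shape[k] for k in range(3)):
                out[i, j] = volume[tuple(idx)]
    return out

# Tiny synthetic volume whose intensity equals the z index.
vol = np.arange(4.0)[None, None, :] * np.ones((4, 4, 1))
# A plane tilted 45 degrees between the y and z axes.
origin = np.array([0.0, 0.0, 0.0])
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, np.sqrt(0.5), np.sqrt(0.5)])
sl = oblique_slice(vol, origin, u, v, size=4)
```

A production tool would use trilinear interpolation and sub-millimetre step sizes; nearest-neighbour keeps the sketch short.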



    The fifth station used the Center's one-wall cave to demonstrate our "Dissector" program. The dissector allows the user to select structures for identification, then remove them if desired. The dissector presents the results using passive stereo. The following is a screen capture of the dissector in use. Here, the structures anterior to the kidneys have been removed and the liver is currently highlighted.


    The sixth station was a tour of our cutting area. The Fitzsimons morgue was given to us in pristine condition. It has significantly improved our ability to create Visible Human type data. One of the data sets we have produced since arriving at Fitzsimons is our 11tmmn knee. The knee has more data in it than the entire Visible Human Male. It is being used in our partnership with the American Academy of Orthopaedic Surgeons to produce an arthroscopy simulator.


  • Tutorial

    Arthur "Ted" Kirkpatrick (University of Oregon, [email protected]) gave a GHOST tutorial near the end of the CHS open house. This carefully planned tutorial used the standard GHOST tutorial developed by SensAble Technologies as a prerequisite. This had the effect of leveling the field amongst the students and allowed the tutorial to be more advanced than usual. Ted posed problems to the students and then discussed their answers as well as his own. Attachment one is the handout from the class.

    Key Research Accomplishments

    Workshop

    The paper session of the workshop ran from 08:30 - 17:30 hrs. During this time we had 12 presentations (the e-Touch presentation was given two slots). This is similar to the schedule of PUGs past. This schedule is probably more intense than is optimal and is addressed in the Future section.

    Haptic Demonstrations

    John Ranta (from Teneo Computing, [email protected]) brought 5 haptic devices and computers, on loan from SensAble Technologies. He set up a separate demo room that ran throughout the workshop. Software for the demos was provided in advance by the attendees. Getting various haptic demonstrations running on machines that are slightly different from where they were developed has always been a big task, but it creates a powerful addition to the workshop. It is one thing to discuss haptic algorithms; it is quite a different thing to share them. John handled the task smoothly.

    Reportable Outcomes

    The PUG 2000 conference proceedings are available on line at http://www.cs.sandia.gov/SEL/conference/pug00/pug00.htm and are included in the appendix.

    Conclusions

    Budget

    Our estimated budget came from discussions with former PUG sponsors and the previous PUG executive committee. Holding the conference in Aspen had two interesting effects on the budget. We were able to get the world-class facilities of the Given Institute for very little money ($1,325.00). However, meals were more expensive than average. Hotel rates (approximately $85.00/night) reflected the off-season. The overall effect was that the expenses came in very close to the estimate. We were able to financially assist 12 attendees. This helped to expand the student population. Two people were paid as consultants: our conference coordinator, Elizabeth Orlando, and her assistant, Catherine Darkenwald.


  • Future

    PUG 2000 was the first conference hosted by the CHS and consequently provided a considerable educational experience for us. I am pleased to say that, to the best of my knowledge, we hid the difficulties of our learning curve from the attendees. This was mostly due to the tenacious organizational skills of our conference coordinator, Elizabeth Reinig. Originally, having the CHS host the PUG workshop was supposed to be a one-time event. But based on the remarks of the attendees and the rest of the executive committee, we feel compelled to host it again in 2001. Our hope is that this will be like teaching a class for the second time. We should be able to do an even better job with less disruption to our core missions. The following lists three of the changes that we plan to make for next year:

    1) The workshop should have two full days of papers and activities (one additional day). Due to time constraints we had to turn away quality papers. We anticipate the quantity of quality submissions to increase next year since the program will be announced several months in advance. The extra day will also allow for additional forums to facilitate discussion.

    2) The CHS would not have been able to host this workshop without the $25,000.00 provided by TATRC. We will be asking for your help again this year. However, we plan to tighten our belts to include the extra day without requiring extra funds. In addition, we will seek other sponsorship in order to reduce the Army's load.

    3) It is not our intent to become the perennial host of the workshop. The field of haptics is much larger than medical simulation and consequently its hosting should probably be spread around to the other interested industries. However, as long as we are hosting it again, we plan to increase the emphasis on medical simulation beyond the CHS open house of PUG 2000.


  • Appendix

    Attached is a copy of the proceedings. The proceedings may also be viewed at: http://www.cs.sandia.gov/SEL/conference/pug00/pug00.htm


  • A Constraint Equation Algebra as a Basis for Haptic Rendering

    Matthew Hutchins
    CSIRO Mathematical and Information Sciences [6]
    GPO Box 664, Canberra, ACT 2601, Australia.
    Matthew.Hutchins@cmis.csiro.au

    Abstract

    Many haptic rendering problems can be expressed in terms of constraints on the motion of a proxy within a virtual environment. This principle is well established for surface rendering, and can also be applied to other types of haptic interaction. A key problem in general constraint based rendering is combining constraints from several sources into a single unified constraint. This paper describes some work in progress toward developing a mathematical framework for manipulating motion constraint equations, and in particular the derivation of a combination algebra for constraints. This work could lead to a system for 6DOF rendering involving non-trivial proxy shapes.

    Keywords: haptics, constraints, proxy, 6DOF.

    1. Introduction

    In a previous PUG paper [4] (and see also [2,3]) we described an approach to haptic rendering based on the use of constraints, and in particular the use of a constrained proxy. The use of a proxy (or god-object) for haptic surface rendering is described in [5], and is now a well established technique. The basic idea is that the physical movements of the PHANToM are tracked by a virtual proxy object moving in a virtual environment, as shown in Figure 1. Whereas the PHANToM moves freely in space, the proxy object will encounter virtual objects and fields which will constrain or change its motion. The difference between the free motion of the PHANToM and the constrained motion of the proxy is used as the basis for generating contact forces. This technique is used for surface rendering by preventing the proxy from moving from one side of a surface to the other. It can also be used for other types of haptic interactions, for example: constraining the proxy to a line or plane. A graphical representation of the proxy will usually be rendered in the user interface to provide multimodal feedback. The graphical and haptic properties of the proxy are not necessarily the same - the graphics may be considerably more elaborate than the haptic rendering, for example. In this paper, "proxy" refers to the haptic representation, which may be as simple as a single point.

    [Figure 1: PHANToM and proxy.]

    The two major haptic rendering SDKs/APIs (Software Development Kits or Application Programming Interfaces) currently available (that we know of) are GHOST from SensAble [8] and Magma from Reachin [7]. Both provide separate facilities for shape based surface rendering and abstract haptic effect or force field rendering. Only the surface rendering interfaces allow manipulation of the proxy position (in GHOST called the Surface Contact Point or SCP). Thus to program a haptic constraint using the proxy technique, one must implement the constraint using the surface interface, or implement a second proxy. The second proxy solution is messy, and doesn't integrate well with surfaces that use the built-in proxy. The surface interfaces are, naturally enough, good for effects that act like surfaces, but awkward to use for more general effects.

    The context of the work described in this paper, then, is the development of a new approach to specifying haptic rendering that unifies surface and other types of rendering into a single framework based on constraining the motion of a proxy. This is work in progress, with the majority of the framework still under development. This paper describes some of the mathematical formalisms that have been developed so far for specifying and manipulating constraints on the motion of a proxy.

    To provide slightly more context for the mathematics, consider the problem of implementing a haptic scenegraph object, which could be a solid shape, a deformable shape, or some abstract force field. At each traversal of the scenegraph, the object may be required to solve the problems of collision detection and contact registration. Contact registration means, upon detecting a collision, registering the contact with the rendering system. The contact is the focus of a two-way communication between the scenegraph object and the rendering system. The system must combine the effects of all registered contacts to produce a new proxy position, an output force, and dynamic information to feed back to the scenegraph objects to update their internal state (e.g. deform). One aspect of the contact is the local topology in a neighbourhood of the contact point. This can be expressed as constraints on the motion of the proxy around the contact point. It is essential that the topological information from separate objects, which act independently of each other, can be combined by the system to give a single result. This is the motivation for the algebraic treatment described later in the paper.

    2. Motion and constraints

    We will assume that the proxy is a rigid body, and its motion is described by rigid body kinematics [1]. At any instant in time, the configuration of a body in space can be described by its position and orientation with respect to some fixed reference "origin". If the configuration of a body in motion is sampled at discrete times, the difference between any two such configurations can be represented by a tuple

        (A, â, x, ω)

    where A is a point, â is a unit vector, x is a distance and ω is an angle. Here A is called the anchor point, and together with â defines a line which is an axis of rotation. The difference in configurations can be interpreted as the effect of a translation of the body along the axis by a distance x, and a rotation around the axis counter-clockwise by the angle ω. (This representation is possible due to a corollary to Euler's theorem that Goldstein [1, p.163] attributes to Chasles.)

    We assume that the motion of the PHANToM is an arbitrary continuous motion sampled at discrete times. We wish to approximate this motion over a sampling interval by a simple substitute motion that is easily represented. We choose a form of screw motion where the translation distance and rotation angle change proportionally over time. That is, the difference between two configurations over an interval t will be

        (A, â, xt, ωt)

    for given constants A, â, x and ω. Thus the complete motion can be represented by a tuple (A, â, x, ω) which can be directly computed from the difference between the initial and final configurations of the PHANToM over a sampled interval.

    As a body moves in space over time, a point P on the body will move through a curve or trajectory in space which we can describe as P(t). For the proportional screw motion described above, the trajectory of a point is given by the equation

        P(t) = P0 - r + xt â + r cos(ωt) + (â·r)â [1 - cos(ωt)] + (â × r) sin(ωt)    (1)

    where P0 is the initial position of P and r = (A - P0). The tangent of the trajectory at any point is given by the derivative of the curve at that point with respect to time:

        P'(t) = x â - ω r sin(ωt) + ω(â·r)â sin(ωt) + ω(â × r) cos(ωt)    (2)

    This tangent can be thought of as describing the direction that the point P is moving in at time t. Initially, at time t = 0, the tangent is

        P'(0) = x â + ω(â × (A - P0))    (3)

    We call x â the translational component of the tangent and ω(â × (A - P0)) the rotational component.

    Our goal is to be able to express constraints on the complete motion of the proxy as the combined effect of simple constraints on the motion of individual points in the proxy. One way these constraints would arise is if a point on the surface of the proxy was in contact with the surface of an object in the scene. A constraint on the motion of a point can be expressed as a constraint on the tangent of the trajectory of the point under the motion. For the remainder of this paper we will make two further simplifying assumptions:

    • we express motion constraints as constraints on the tangent P'(0) at the start of a sampling interval only, not on the trajectory over the interval;

    • we wish to constrain the translational component and the rotational component of the tangent separately.
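The screw-motion tuple and its point trajectory are straightforward to evaluate numerically. The sketch below is ours, not the paper's; it assumes NumPy and uses the Section 2 forms of the trajectory P(t) and the initial tangent P'(0) = x·â + ω(â × (A - P0)), checking the closed-form tangent against a finite difference:

```python
import numpy as np

def trajectory(P0, A, a_hat, x, w, t):
    """P(t) of a point starting at P0 under the proportional screw
    motion (A, a_hat, x, w): translate x*t along the axis through A
    while rotating w*t around it (Eqn. (1))."""
    r = A - P0                                   # r = (A - P0)
    return (P0 - r + x * t * a_hat
            + r * np.cos(w * t)
            + np.dot(a_hat, r) * a_hat * (1.0 - np.cos(w * t))
            + np.cross(a_hat, r) * np.sin(w * t))

def tangent0(P0, A, a_hat, x, w):
    """Initial tangent P'(0) (Eqn. (3)): translational component x*a_hat
    plus rotational component w * (a_hat x (A - P0))."""
    return x * a_hat + w * np.cross(a_hat, A - P0)

# Example: screw motion about the z axis through the anchor (1, 0, 0).
P0    = np.array([2.0, 0.0, 0.0])
A     = np.array([1.0, 0.0, 0.0])
a_hat = np.array([0.0, 0.0, 1.0])
x, w  = 0.5, 0.8

assert np.allclose(trajectory(P0, A, a_hat, x, w, 0.0), P0)  # starts at P0

# The closed-form tangent agrees with a finite difference of P(t).
h = 1e-6
fd = (trajectory(P0, A, a_hat, x, w, h) - P0) / h
assert np.allclose(fd, tangent0(P0, A, a_hat, x, w), atol=1e-4)
```

The finite-difference check is a convenient way to confirm that a reconstructed Eqn. (2)/(3) pair really is the derivative of Eqn. (1).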

    A more general approach will be left for future work. However, this simplified version may serve as a good enough approximation for the purposes of haptic rendering.

    3. Constraint equation algebra

    To recap, we have a representation for a particular class of rigid body motions as tuples of the form

        M = (A, â, x, ω)

    and we have an equation (Eqn. (3)) that describes a tangent of the trajectory of a point under such a motion. By specifying a condition that must be satisfied by the tangent of some point P, we can identify a set of motions that will make the tangent satisfy the condition. Thus a constraint equation on the tangent defines a set of legal motions. Clearly, there is a wide variety of conditions that can be placed on the tangent. We wish to choose a useful subset of these conditions and develop a mathematical and computational framework for manipulating them. In other words, we wish to develop a constraint equation algebra.

    An algebra, in the most general sense as used in algebraic software specification, is simply a collection of sets and functions and relations that satisfy some chosen axioms. To specify an algebra, we need to define what sets there are, what elements are in them, and what operators and relations act on those sets. This is analogous to defining an abstract data type in software. In this case, we wish to define a set of constraint equations, and a single operator to combine pairs of constraint equations. We will want the operator to be idempotent, commutative and associative, so the resulting algebra will have the form of a semi-lattice.

    4. The base cases

    We start by defining a set of constructors, or generators, or "base cases" for the set of constraint equations. These are the building blocks which will be combined to create the complete set. As we have described, we wish to define these mostly in terms of constraints on the tangent of a particular point at the start of an interval. A useful set of base cases is:

    • Free: free motion of the body.

    • PerpTan(P, n̂): the tangent of P at t = 0 is perpendicular to unit vector n̂.

    • ParaTan(P, n̂): the tangent of P at t = 0 is parallel to unit vector n̂.

    • FixPoint(P): the tangent of P at t = 0 is zero, so that P is fixed.

    • Fixed: the entire body is fixed.

    We can identify the set of motions permitted by each of these cases by using the tangent equation Eqn. (3). First, some notation for parallel and perpendicular vectors. We define

        u ∥ v ≡ (u × v = 0);  u ⊥ v ≡ (u · v = 0)    (4)

    Then, remembering the assumption that the translation component and the rotation component will be constrained independently, we can derive the following definitions:

        (A, â, x, ω) ∈ Free ≡ TRUE    (5)

        (A, â, x, ω) ∈ PerpTan(P, n̂) ≡ {(x = 0) OR (â ⊥ n̂)} AND {(ω = 0) OR ((â × (A - P0)) ⊥ n̂)}    (6)

        (A, â, x, ω) ∈ ParaTan(P, n̂) ≡ {(x = 0) OR (â ∥ n̂)} AND {(ω = 0) OR ((â × (A - P0)) ∥ n̂)}    (7)

        (A, â, x, ω) ∈ FixPoint(P) ≡ {x = 0} AND {(ω = 0) OR (â ∥ (A - P0))}    (8)

        (A, â, x, ω) ∈ Fixed ≡ {x = 0} AND {ω = 0}    (9)

    The zero motion that satisfies Fixed will satisfy all of the other constraints. A motion that satisfies FixPoint(P) must be a pure rotation around an axis through P, and will satisfy any other constraint on P.

    5. Composition

    We now define a composition operator on constraint equations, denoted C1 ⊕ C2. The motions that satisfy the combination C1 ⊕ C2 should be precisely those that satisfy both of the constraints C1 and C2. Thus we define

        M ∈ (C1 ⊕ C2) ≡ (M ∈ C1) AND (M ∈ C2)    (10)

    Another way of saying this is that C1 ⊕ C2 is the intersection of the sets C1 and C2.
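The base-case definitions (5)-(9) and the composition rule (10) translate directly into predicates over motion tuples, with composition as plain conjunction (set intersection). The sketch below is ours, not code from the paper; function names and the use of exact zero tests are illustrative assumptions:

```python
import numpy as np

def para(u, v):   # u parallel to v       <=>  u x v = 0   (Eqn. (4))
    return np.allclose(np.cross(u, v), 0.0)

def perp(u, v):   # u perpendicular to v  <=>  u . v = 0   (Eqn. (4))
    return np.isclose(np.dot(u, v), 0.0)

def rot_dir(M, P0):
    """Direction a_hat x (A - P0) of the rotational tangent component."""
    A, a_hat, x, w = M
    return np.cross(a_hat, A - P0)

def free(M):                        # Eqn. (5)
    return True

def perp_tan(M, P0, n_hat):         # Eqn. (6)
    A, a_hat, x, w = M
    return ((x == 0 or perp(a_hat, n_hat)) and
            (w == 0 or perp(rot_dir(M, P0), n_hat)))

def para_tan(M, P0, n_hat):         # Eqn. (7)
    A, a_hat, x, w = M
    return ((x == 0 or para(a_hat, n_hat)) and
            (w == 0 or para(rot_dir(M, P0), n_hat)))

def fix_point(M, P0):               # Eqn. (8)
    A, a_hat, x, w = M
    return x == 0 and (w == 0 or para(a_hat, A - P0))

def fixed(M):                       # Eqn. (9)
    A, a_hat, x, w = M
    return x == 0 and w == 0

def compose(c1, c2):                # Eqn. (10): conjunction of predicates
    return lambda M: c1(M) and c2(M)

# A pure rotation about the z axis through P0 fixes P0 ...
P0 = np.array([1.0, 2.0, 0.0])
M  = (P0, np.array([0.0, 0.0, 1.0]), 0.0, 0.3)
assert fix_point(M, P0)
# ... and hence satisfies any tangent constraint at P0, alone or composed.
assert perp_tan(M, P0, np.array([1.0, 0.0, 0.0]))
assert compose(lambda m: fix_point(m, P0), free)(M)
```

Because composition is conjunction, idempotency, commutativity and associativity of the operator follow immediately, which is exactly the semi-lattice behaviour the paper requires.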

    Thus, we know this composition operator satisfies the axioms required of a semi-lattice (idempotency, commutativity, associativity), because set intersections do.

    The complete set of elements in the algebra is therefore all those generated by the five base cases, plus the composition of any two other elements. This is essentially a recursive definition, and computationally would require a recursive data structure to represent the elements. However, we can make some observations to simplify this. Firstly, suppose that all constraints must be applied to the same point P. This would be the case for a 3DOF rendering system with a single point proxy, or, such as in Magma, a small spherical proxy where all constraints are translated to apply to the centre of the sphere. It turns out that the five base cases completely characterise the system. That is, every combination of two or more constraints applied to the same point is equivalent to a simple constraint applied to the same point. Mostly, the result of C1 ⊕ C2 is either C1 or C2 or FixPoint(P). The only interesting case is

        NOT(n̂1 ∥ n̂2) ⇒ PerpTan(P, n̂1) ⊕ PerpTan(P, n̂2) = ParaTan(P, n̂1 × n̂2)

    For the general case where constraints can be applied to different points on the proxy, there are definitely more elements required. However, it appears that the five base cases plus the six pairwise combinations of the non-trivial base cases will be enough to completely characterise the space. So, all constraints could be represented in a flat data structure with eleven types of elements. The proof of this conjecture is work currently in progress.

    6. Future work

    The combination algebra developed above makes it possible to reduce a set of independent constraints to a single constraint that must be satisfied by the motion of the proxy. Given a potential motion and a constraint, there is a straightforward decision procedure to determine if the motion satisfies the constraint. However, if a motion of the proxy does not satisfy the constraint, it is necessary to find an alternative motion that does. This is always possible (the zero motion satisfies any constraint), so in fact the problem is to find the "best" alternative motion. The optimal solution may be different for each of the eleven constraint cases (of course, the Fixed and Free cases are easy!). Once the development of the algebra is complete, this will be the next problem to be solved. Some cases have been solved already. We assume that, at present, most haptic rendering will be done using a 3DOF output device, or a 6DOF device where the rotational fidelity is less than the translational fidelity. Therefore it is best to try to match the rotation component of the motion as closely as possible. This will minimise the discrepancy between the orientation of the haptic device and the orientation of the proxy, which will minimise the required torques. Note that the screw motion representation naturally isolates the translation and rotation components of the input motion.

    As described in the introduction, this formulation of constraint equation algebra is just one part of a larger specification for a new approach to haptic rendering. Our future work will be the continued development and implementation of this approach.

    References

    [1] Herbert Goldstein. Classical Mechanics. Second edition, Addison-Wesley, 1980. Chapter 4, pages 128-187.

    [2] Chris Gunn and Paul Marando. Haptic constraints: guiding the user. In proceedings of SimTecT '99, Melbourne, Australia, March 1999. Pages 261-264.

    [3] Matthew Hutchins. Software components for haptic constraints. In proceedings of SPIE Vol. 3957, Stereoscopic Displays and Virtual Reality Systems VII, 2000. Pages 423-432.

    [4] Matthew Hutchins and Chris Gunn. A haptic constraints class library. In Salisbury, J.K. and Srinivasan, M.A. (Eds), Proceedings of the Fourth PHANToM Users Group Workshop, AI Lab Technical Report No. 1675 and RLE Technical Report No. 633, MIT, November 1999.

    [5] C.B. Zilles and J.K. Salisbury. A constraint-based god-object method for haptic display. In proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems, August 1995. Pages 146-151.

    Web Cites

    [6] CSIRO Mathematical and Information Sciences. http://www.cmis.csiro.au

    [7] Reachin Technologies. http://www.reachin.se

    [8] SensAble Technologies. http://www.sensable.com
    sible to reduce a set of independent constraints to a sin-gle constraint that must be satisfied by the motion of the Web Citesproxy. Given a potential motion and a constraint, there is [6] CSIRO Mathematical and Information Sciences.a straightforward decision procedure to determine if the http://www. cmis. csiro. aumotion satisfies the constraint. However, if a motion of [7] Reachln Technologies.the proxy does not satisfy the constraint, it is necessary http://wvw. reachin. seto find an alternative motion that does. This is always [8] SensAble Technologies.possible (the zero motion satisfies any constraint), so in httpSe /Vbweseno les.fact the problem is to find the "best" alternative motion. http://w .sensable, CornThe optimal solution may be different for each of theeleven constraint cases (of course, the Fixed and Freecases are easy!) Once the development of the algebra iscomplete, this will be the next problem to be solved.Some cases have been solved already. We assume that, atpresent, most haptic rendering will be done using a3DOF output device, or a 6DOF device where the rota-tional fidelity is less than the translational fidelity. There-
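The same-point combination rule described in this paper can be sketched in executable form. The following Python fragment is a hypothetical illustration, not the authors' implementation; the constraint representation and the `combine` function are invented for the sketch, but the cross-product rule for two PerpTan constraints and the collapse of the remaining same-point cases follow the reductions stated in the text.

```python
def cross(a, b):
    """3D cross product of two vectors given as 3-tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def is_zero(v, eps=1e-9):
    return all(abs(c) < eps for c in v)

# A constraint is modelled as (kind, point, direction). The kinds echo the
# paper's base cases: "Free", "Fixed", "PerpTan", "ParaTan".
def combine(c1, c2):
    """Combine (C1 'plus' C2) two constraints applied to the same point P.

    Combining two PerpTan constraints with non-parallel normals h1, h2
    yields ParaTan(P, h1 x h2); most other same-point combinations reduce
    to one of the operands or to fixing the point entirely.
    """
    kind1, p, h1 = c1
    kind2, _, h2 = c2
    if kind1 == kind2 == "PerpTan":
        n = cross(h1, h2)
        if is_zero(n):              # h1 parallel to h2: constraints coincide
            return c1
        return ("ParaTan", p, n)
    if "Fixed" in (kind1, kind2):   # fixing the point dominates everything
        return ("Fixed", p, None)
    if kind1 == "Free":             # a free constraint is the identity
        return c2
    if kind2 == "Free":
        return c1
    return ("Fixed", p, None)       # remaining same-point cases collapse to FixPoint(P)

# Tangency to two distinct planes reduces to tangency along their line:
c = combine(("PerpTan", (0, 0, 0), (1, 0, 0)),
            ("PerpTan", (0, 0, 0), (0, 1, 0)))
# c == ("ParaTan", (0, 0, 0), (0, 0, 1))
```

The point of the sketch is that the combination operator closes over a small, flat set of cases, so no recursive data structure is needed for same-point constraints.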


Fuzzy Rule-Based Evaluation for a Haptic and Stereo Simulator for Bone Marrow Harvest for Transplant

Liliane dos Santos Machado (1)
Ronei Marcos de Moraes (1,2)
Marcelo Knorich Zuffo (1)

(1) Laboratório de Sistemas Integráveis - Universidade de São Paulo
São Paulo - SP - Brazil
{liliane, ronei, mkzuffo}@lsi.usp.br

(2) Statistics Department - Federal University of Paraíba
João Pessoa - PB - Brazil
ronei@de.ufpb.br

Abstract. Virtual Reality has been used to simulate procedures in several fields, especially those involving critical tasks, such as the simulation of some invasive medical procedures. We are developing a low-cost haptic and stereo simulator for bone marrow harvest for transplant. The system includes an intelligent evaluation procedure that classifies the trainee's learning. The present paper describes the proposed system, details of its implementation, and the results obtained so far.

    Introduction

Bone marrow transplant, despite commonly held perceptions, is not a usual surgery. Basically, the bone marrow transplant consists of an infusion of healthy cells, capable of generating identical copies of themselves and producing blood cells. This blind invasive procedure is relatively simple, but its success depends on the physician's dexterity and ability to manipulate the needle in a complex anatomical region.

This work presents a virtual reality system to simulate bone marrow harvest for transplant. The harvest is a semi-invasive procedure that depends on the physician's dexterity, since there is no visual information about the internal structure of the patient's body.

In this simulator we use a haptic device and stereo glasses to give the trainee a satisfactory degree of immersion. To evaluate the training, we add to the system an intelligent evaluation tool based on fuzzy rules. In this way, the expert doctor's knowledge is modeled by fuzzy rules over four variables to give a classification of the procedure performed by the trainee.

    Motivation

The bone marrow transplant is a relatively new medical procedure to treat diseases until recently considered incurable. The first successful transplant was made in 1968, and since then it has become a current procedure for patients with leukemia, aplastic anemia, lymphomas, multiple myelomas, disorders of the immune system, and some solid tumors such as breast cancer and ovarian cancer [Oncolink, 1999].

The bone marrow is extracted through many aspirations of material from the iliac crest (sometimes also from the sternum) of the donor under general anesthesia. It is a blind procedure: without any visual feedback except the external view of the donor's body, the physician needs to feel the skin and bone layers traversed by the needle to find the bone marrow and then start the aspiration. From the physician's point of view, the bone marrow harvest demands great ability, which offers the donor a better recovery and less post-harvest pain. The harvest is particularly critical in children, whose bones are thin and soft and who receive a smaller degree of anesthesia. The Children's Institute of the Hospital das Clínicas de São Paulo - Brazil performs on average 15 such procedures every year. Currently, the only training available for novice doctors is practice on guinea pigs, observation of real procedures, and supervised participation in real procedures.

Likewise, it is known that expert physicians evaluate trainee learning by observing the needle position, its angle when inserted in the patient's body, and how deep it goes, besides whether the trainee actually extracts the bone marrow (the goal of the procedure). That is, the expert doctor's evaluation is partially subjective. One way to model subjective knowledge is using fuzzy sets [Dubois and Prade, 1980]. In our system, the physicians' knowledge of bone marrow harvesting is modeled by fuzzy rules over the four variables described below (needle position, angle, depth, and bone marrow extraction) to give a classification of the procedure performed by the trainee.

With our system, we intend to improve learning for novice doctors, since the necessary dexterity will be acquired through training on a virtual reality system that simulates and evaluates the procedure with an integrated force-feedback device. We expect to shorten and improve the learning curve, benefiting donors' rehabilitation.

    Intelligent Evaluation Systems

An intelligent evaluation is one made by an expert. In intelligent evaluation systems, the expert knowledge is stored in a knowledge database using some logical representation, generally rules. The application of the rules is controlled by an inference system. The architecture formed by the knowledge database and the inference system is called an "expert system". When there is subjectivity in the knowledge database, its representation can be done by fuzzy models [Zadeh, 1988], where the subjectivity is modeled by fuzzy sets.

In fuzzy expert systems, the expert's knowledge about a specific task is modeled by fuzzy rules. The variables of interest are collected by a subsystem and sent to the expert system. The data provided by the variable values are analyzed by the rule database. Each rule is related to a variable of interest, and each expert can have his own conclusion about a specific fact. The several conclusions about one rule are aggregated to compose a new fact. This fact is then analyzed by the rule set, providing a conclusion about the facts presented.

For the evaluation, we use an expert system joined to the simulator. To evaluate a trainee, the expert knowledge about the procedure is modeled by fuzzy rules. The variable values are "collected" by the haptic device and sent to the expert system to be analyzed by the rule database. The pertinent facts of a rule are joined and analyzed to provide a final classification of the trainee. We use five fuzzy classifications for a trainee: you need much more training, you need more training, you need training, your training is good, or your training is excellent, allowing identification of whether more training is necessary.

    Proposed System and Implementation

The proposed system is a semi-immersive virtual reality system [Pimentel, 1995] where the trainee and some spectator doctors (tutor and trainees) can share the same stereoscopic view of the bone marrow harvest simulation [Machado, 2000]. Our simulator is composed of a high-end PC Pentium III 600 MHz platform with an AGP 3Dlabs Oxygen GVX1 board, time-multiplexed StereoGraphics CrystalEyes shutter glasses [Stereographics, 1997], and a PHANToM Desktop haptic device [Sensable, 1999; Sensable, 2000]. The simulator consists of a force-feedback virtual interactive model of the tissue layers of the pelvic region, with their hardness and texture characteristics.

Using a virtual syringe with tactile feedback (simulated by the PHANToM Desktop), the user can penetrate through the several tissue layers, feeling the transitions among tissues as well as the texture associated with each layer.

Figure 05 - The Perfusion Tissue Layers (diagram of the layers along the perfusion direction: epidermis, dermis, subcutaneous, periosteum, compact bone, bone marrow, compact bone)

The feedback sensation associated with each layer was modeled from tactile sensation descriptions given by specialists. We modeled the physical properties of the tissues in the iliac crest as the following layers:

    "* Epidermis: approximately 2 mm thick, elastic and slippery tissue;"* Dermis: approximately 7 mm thick, elastic tissue;"* Subcutaneous: approximately 4 mm thick, soft and non-resistant tissue;"* Periosteum: approximately 2 mm thick, resistant, slippery, lubricated and

    smooth tissue."* Compact bone: approximately 5 mm thick, hard and resistant tissue;"* Bone marrow: approximately 10 mm thick, soft tissue, without resistance.
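The layered model above lends itself to a simple depth-to-layer lookup, which determines which feedback profile the haptic loop should apply. The following sketch is hypothetical (not the simulator's code); the thicknesses are taken from the list above, and the function name is invented for illustration.

```python
# Layers along the perfusion direction as (name, thickness in mm);
# thicknesses are the approximate values given in the text.
LAYERS = [
    ("epidermis",     2.0),
    ("dermis",        7.0),
    ("subcutaneous",  4.0),
    ("periosteum",    2.0),
    ("compact bone",  5.0),
    ("bone marrow",  10.0),
]

def layer_at_depth(depth_mm):
    """Return the name of the tissue layer the needle tip is in."""
    top = 0.0
    for name, thickness in LAYERS:
        if top <= depth_mm < top + thickness:
            return name
        top += thickness
    return None  # deeper than the modelled layers

# The bone marrow starts after 2 + 7 + 4 + 2 + 5 = 20 mm of overlying
# tissue, so a tip at 21 mm is inside the marrow:
# layer_at_depth(21.0) -> "bone marrow"
```

At each haptic sample the current layer would select the force and texture parameters to render, and a layer change marks the transition the trainee must learn to feel.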

The fuzzy rules of our expert system are modeled by membership functions according to the specifications of experts. Several types of membership functions can be used, such as trapezoidal, triangular and pi-functions, and the fuzzy inference system used is Mamdani-type [Mamdani, 1975]. An example of a rule for this expert system is:

IF Position_x is left_center AND Position_y is up_center AND Position_needle is acceptable AND Marrow_harvest is yes THEN Trainee_class is you_need_training

where Position_x and Position_y are the coordinates at which the needle touches the patient's body; Position_needle is the angle at which the needle enters the patient's body; Marrow_harvest indicates the success or failure of the trainee in harvesting bone marrow; and Trainee_class is the classification of the trainee.
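A Mamdani-type rule of this form can be sketched with triangular membership functions and min for the AND connective. This is a hypothetical illustration only: the breakpoints of the membership functions below are invented placeholders, not the values specified by the experts.

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical antecedent memberships (units and breakpoints are invented):
def left_center(x):     return triangular(x, -2.0, 0.0, 2.0)    # needle x near center
def up_center(y):       return triangular(y, -2.0, 0.0, 2.0)    # needle y near center
def acceptable(angle):  return triangular(angle, 60.0, 90.0, 120.0)  # entry angle, degrees

def rule_you_need_training(x, y, angle, marrow_harvested):
    """Firing strength of the example rule; Mamdani AND is taken as min."""
    harvest = 1.0 if marrow_harvested else 0.0
    return min(left_center(x), up_center(y), acceptable(angle), harvest)

# A centred, well-angled, successful harvest fires the rule fully:
# rule_you_need_training(0.0, 0.0, 90.0, True) -> 1.0
```

In a full Mamdani system the firing strengths of all rules would be aggregated over the five Trainee_class terms and the strongest (or defuzzified) class reported to the trainee.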

    Acknowledgements

This project is funded by Fundação de Amparo à Pesquisa do Estado de São Paulo, grant # 99/01583-0, with additional support from FINEP - Brazilian Federal Funding Agency.

Thanks to Dr. Andre Nebel de Mello and Dr. Vicente Odone Filho of the Children's Institute of the Hospital das Clínicas for their helpful insights about bone marrow harvesting.

    Bibliography

Dubois, D.; Prade, H. Fuzzy Sets and Systems: Theory and Applications. Academic Press, 1980.

Machado, L. et al. A Virtual Reality Simulator for Bone Marrow Harvest for Transplant. Proceedings of the 3rd Brazilian Workshop on Virtual Reality, October 2000.

Mamdani, E. H.; Assilian, S. An Experiment in Linguistic Synthesis with a Fuzzy Logic Controller. International Journal of Man-Machine Studies, vol. 7, pp. 1-13, 1975.

Oncolink; URL from the Cancer Center at the University of Pennsylvania (USA). http://cancer.med.upenn.edu/specialty/med-onc/bmt/, August 1999.

Pimentel, K.; Teixeira, K. Virtual Reality: Through the New Looking Glass. 2nd ed., McGraw-Hill, 1995.

Sensable; GHOST SDK Programmer's Guide, Version 3.0. SensAble Technologies, Inc., March 1999.

Sensable; GHOST API Reference, Version 3.0. SensAble Technologies, Inc., March 2000.

Stereographics; Developer Handbook. StereoGraphics Corp., 1997.

Zadeh, L.A. Fuzzy Logic. Computer, vol. 21, pp. 83-93, 1988.

Haptic Collaboration over the Internet

João P. Hespanha, Margaret McLaughlin, Gaurav S. Sukhatme
Minoo Akbarian, Rajiv Garg, Weirong Zhu

Integrated Media Systems Center
University of Southern California
Los Angeles, CA 90089

    Abstract

We address the real-time collection and simultaneous broadcast of haptic information to multiple haptic session participants, so that collaborative exploration of objects is possible even when users are equipped with disparate haptic devices, such as the PHANToM and the CyberGrasp. We have designed and are currently testing a prototype system for haptic collaboration over the Internet. The basic idea is to provide a framework for multiple users (each with his or her own haptic device connected to a computer) to share a common experience of touch. This will allow users to exert forces on each other through the network as well as on common objects.

In this paper we present a distributed architecture for haptic collaboration via the Internet. We explicitly address the issue of latency (communication delay), thus providing a foundation for a shared haptic experience among distributed users. With respect to stability, latency is a critical factor that governs whether two users can truly share a common haptic experience. We propose an algorithm where the nature of the interaction between two hosts is decided dynamically based on the measured network latency between them. Users on hosts that are near each other (low communication latency) are dynamically added to fast local groups. If the communication latency is high, users are allowed a slower form of interaction where they can touch and feel objects but cannot exert forces on them. Users within a fast local group experience true haptic collaboration, since the system is able to resolve the interaction forces between them fast enough to meet stability criteria. We discuss the creation, maintenance and update mechanisms of local groups for fast interaction, as well as synchronization mechanisms for hosts participating in slower interaction. We conclude with a discussion of open issues and future work.

    1. Introduction

Haptic (adj.): of or relating to the sense of touch. In the present context, haptic refers to the modality of touch and the sensation of shape and texture an observer feels when exploring an object in a virtual environment. Applications of haptics include online museums [6], aids for the visually impaired, remote surgery, and entertainment. In many of these applications it will be necessary for users to interact with each other as well as with other objects. In this article, we propose an architecture for haptic collaboration among distributed users. We focus on collaboration over a non-dedicated channel (such as an Internet connection) where users experience stochastic, unbounded communication delays [7].

The area of haptic collaboration is relatively new. There have been a few prior studies, which we briefly review here. In a study by Basdogan et al. [1], partners at remote locations were assigned three cooperative tasks. Experiments were conducted with visual feedback only, and with both visual and haptic feedback. Both performance and feelings of togetherness were enhanced in the dual-modality condition. Durlach and Slater [3] note that factors contributing to a sense of co-presence include being able to observe the effect on the environment of actions by one's interlocutors, and being able to work collaboratively with co-present others to alter the environment. Buttolo et al. [4] note that when the same virtual environment is shared between two distributed sites there may be registration problems. Representations of the virtual object must coincide, but the distributed nature of the communication, especially over the Internet, may introduce considerable latency whose effects may be hard to predict.

    2. Virtual Haptic World

Imagine you decide to go to a handicraft museum. There is a map of the museum at the door showing the different halls, each containing a group of handicrafts. Upon entering a hall, you can see the handicrafts and the other people in that room. You can touch all of the objects in the room and interact with them. In a real museum, all of the above are familiar experiences, except for the last one: as a matter of practice, touching art objects is usually strictly prohibited.

The scenario described above motivates the research presented here. Our goal is to design an architecture that will support collaborative touch in virtual environments. We term such an environment a virtual haptic world. As shown in Figure 1, users may have different kinds of haptic devices, such as the PHANToM, CyberGrasp, or a FEELit mouse, or they can just be viewers. Some of the participants in the haptic world may only provide virtual objects as a service to the remaining users. This would be the role, e.g., of a museum's server.

Figure 1: A virtual haptic world (diagram labels: object provider; visualization)

From a computational perspective, a haptic world consists of a network of nodes. Each node corresponds to a computer whose operator is part of the shared virtual environment. The operator will typically interact with virtual objects through a haptic device, but conceivably, some users may interact with the haptic world using other modalities, e.g. by simple visualization. Some nodes may operate autonomously (i.e., without a human operator) and simply provide virtual objects for the haptic world.

Each node in the haptic world contributes virtual objects to the shared environment. These can be static, e.g., a sculpture "bolted" to the ground, or dynamic, e.g., a teapot that can be virtually manipulated. We view the haptic devices that the human operators use to interact with the haptic world as dynamic objects. Each object in the haptic world is owned by one of the nodes, which is responsible for defining how its dynamic properties evolve. Typically, a node that is physically connected to a haptic device owns the object that represents the device.

Two databases are used to represent a haptic world. The node database contains information about the node network. It stores the logical identifiers and the IP addresses of all nodes, as well as the latency and available bandwidth between all nodes. The need for this information will become clear later. This database is dynamic because new nodes may join or leave the haptic world at run-time. The object database contains the information about all objects that are part of the haptic world. Each record in this database refers to a particular object and contains the object identifier, the identifier of the node that owns it, its static properties (shape, size, color, etc.) and its dynamic properties (position, orientation, velocity, etc.).

The force control algorithms used for haptic rendering generally require high sampling rates (typically, on the order of 1 kHz) and low latency (typically, on the order of a few milliseconds) [5]. This means that the databases need to be queried very frequently and with very low delay. Because of this it is necessary to distribute these databases by keeping local copies at each node. This allows very fast access to the object data needed for the force feedback loops, at the expense of the added complexity introduced by issues related to consistency between the databases. Much of what follows is precisely related to the problem of keeping the databases synchronized so that all nodes have roughly the same perspective on the shared environment.

    3. Database Synchronization

Since the object database contains data that is dynamic, the local copies of this database that exist at each node must be kept synchronized by a periodic exchange of data. This is done by a very simple mechanism that uses the concept of object ownership introduced earlier: periodically, the owner of each object broadcasts the dynamic properties of its objects to all other nodes. Each node must then continuously listen to the other nodes for updates on the dynamic properties of the objects that it does not own. This is represented schematically in Figure 2.

While (1)
{
    Listen for dynamic properties of objects owned by other nodes and update local object database
    Query haptic rendering system for dynamic properties of objects owned
    Broadcast dynamic properties of objects owned
    Wait for one synchronization period
}

Figure 2: Object database synchronization in the haptic world (legend: node; local database; data transfer between nodes).
Table 1: Pseudo-code for object database synchronization.
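The owner-broadcast synchronization of Table 1 can be sketched as ordinary code. The following is a hypothetical single-process rendition for illustration: the `Node` class, and the simulation of broadcasting via an in-memory dictionary, are invented stand-ins for the real networking and haptic rendering layers.

```python
class Node:
    """Minimal sketch of one haptic-world node and its local object database."""

    def __init__(self, node_id, owned):
        self.node_id = node_id
        # object id -> dynamic properties (here just a position tuple)
        self.owned = dict(owned)       # objects this node owns
        self.database = dict(owned)    # local copy of the full object database

    def broadcast(self):
        """Broadcast step: publish dynamic properties of objects owned."""
        return dict(self.owned)

    def apply_updates(self, updates):
        """Listen step: merge properties broadcast by other owners."""
        for obj_id, props in updates.items():
            if obj_id not in self.owned:
                self.database[obj_id] = props

def synchronize(nodes):
    """One synchronization period: every owner broadcasts, everyone listens."""
    all_updates = {}
    for node in nodes:
        all_updates.update(node.broadcast())
    for node in nodes:
        node.apply_updates(all_updates)

a = Node("A", {"O-1": (0.0, 0.0, 0.0)})
b = Node("B", {"O-2": (1.0, 0.0, 0.0)})
synchronize([a, b])
# After one period both local databases agree on all objects:
# a.database == b.database == {"O-1": (0.0, 0.0, 0.0), "O-2": (1.0, 0.0, 0.0)}
```

In the real system each node runs this loop independently against the network, so the local copies only agree up to one synchronization period of staleness.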

    Typically, the haptic rendering system uses the following fairly standard algorithm:

Compute amount of overlapping between objects owned and all other objects
Compute forces on objects owned (assuming spring-damper system)
Transmit forces to haptic device
Integrate forward in time to predict dynamic properties of objects owned at next sampling time

    Table 2: Pseudo-code for the update of dynamic properties of objects
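The spring-damper force and forward integration steps of Table 2 can be illustrated in one dimension. This is a hypothetical sketch, not the system's code: the gains `k` and `b`, the explicit Euler integrator, and the sign convention (penetration and its rate taken positive into the object) are all choices made for the example.

```python
def spring_damper_force(penetration, penetration_rate, k=1000.0, b=5.0):
    """1-D spring-damper contact force (illustrative gains).

    A restoring force is produced only while the objects overlap
    (penetration > 0); it grows with depth and with the rate of approach.
    """
    if penetration <= 0.0:
        return 0.0
    return k * penetration + b * penetration_rate

def integrate(position, velocity, force, mass, dt):
    """Explicit Euler step predicting the object state at the next sample."""
    velocity = velocity + (force / mass) * dt
    position = position + velocity * dt
    return position, velocity

# At a 1 kHz servo rate (dt = 0.001 s), a 1 cm penetration with k = 1000 N/m
# produces a 10 N restoring force:
# spring_damper_force(0.01, 0.0) -> 10.0
```

The key point for the distributed setting is the last line of Table 2: each node integrates its own objects forward, so other nodes only need the broadcast state, not the forces.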

When the number of nodes is large, the broadcast of object properties required by the algorithm in Table 1 may be costly unless the synchronization period is large. We will address this issue later.

Another main challenge arising from the distributed nature of the databases that store the information about the haptic world is the addition and removal of nodes. When a new node joins the haptic world, it must first receive the current node and object databases from some other node in the haptic world. It must then add itself to the node database and add its objects to the object database. Finally, it must inform all other nodes of these changes to the databases. This is implemented by the pseudo-code shown in Table 3, which must run in every node.

Request copy of node database
Request copy of object database
Add self to node database
Add objects owned to object database
Broadcast request to add new record to node database
Broadcast request to add new records to object database
While (node active)
{
    Listen for requests to:
        send node/object database
        add/remove record to/from node database
        add/remove record to/from object database
}
Broadcast request to remove self from node database
Broadcast request to remove owned objects from object database

Table 3: Pseudo-code for the creation of a new node in the haptic world.

Input: G = {list of objects in local group}

While (1)
{
    Listen for the dynamic properties of objects in G owned by other nodes and update local object database
    Query haptic rendering system for dynamic properties of objects owned
    Broadcast to the owners of the objects in G the dynamic properties of objects in G owned by self
    Wait for one local group synchronization period
}

Table 4: Pseudo-code for local group synchronization.

4. Local Groups

The broadcast required by the synchronization algorithm in Table 1 can be very costly when the number of nodes is large. Because of this, the synchronization period may need to be fairly long. For static objects this poses no problems, but the same is not true for dynamic objects, i.e., objects that can move.

When two or more dynamic objects touch each other, the resulting motion must be computed by simulating Newton's laws using an algorithm similar to the one in Table 2. However, when the same node does not own all the objects involved in a close interaction, each object only observes the effect of its motion on the motion of other objects at a relatively low sampling rate, determined by the synchronization period. This leads to very unrealistic motions (and possibly instability) because the algorithm in Table 2 no longer provides a good approximation to Newton's laws. We overcome this by creating small groups of nodes that engage in very fast and very frequent exchange of synchronization data for objects in close interaction. The creation of these groups is, of course, only possible when the bandwidth between the nodes is sufficiently large and the latency is sufficiently small. Because of the high cost of local groups, they should only be maintained while the objects are interacting.

As explained above, resolving the motion of objects involved in close interaction requires a high-bandwidth, low-latency synchronization mechanism. In our architecture this is achieved by introducing the concept of a local group. A local group consists of a group LG of objects whose owners enhance the basic synchronization algorithm for those objects in LG by decreasing the synchronization-sampling period. The local group synchronization algorithm, given in Table 4, is very similar to the basic one in Table 1.

Since each local group determines the positions of all the objects in that group, each object should belong to at most one local group (this does not prevent a node that owns several objects from being involved in several local groups). Moreover, the fast synchronization within the local group requires high bandwidth and low latency between the nodes involved. Special care must therefore be taken in the creation of local groups.
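The admission test for a local group can be sketched as a check over the pairwise measurements stored in the node database. This is a hypothetical illustration: the paper gives no numeric thresholds, so the values below are invented placeholders, and the function and parameter names are ours.

```python
# Illustrative thresholds only; the paper does not specify numeric values.
MAX_LATENCY_MS = 5.0        # force loops need a few-millisecond round trip
MIN_BANDWIDTH_KBPS = 512.0

def can_form_local_group(node_pairs, latency_ms, bandwidth_kbps):
    """Return True if every pair of participating nodes meets the
    fast-synchronization requirements (low latency, high bandwidth).

    latency_ms and bandwidth_kbps map a (node, node) pair to the
    measurements kept in the node database.
    """
    for pair in node_pairs:
        if latency_ms[pair] > MAX_LATENCY_MS:
            return False
        if bandwidth_kbps[pair] < MIN_BANDWIDTH_KBPS:
            return False
    return True

pairs = [("A", "B")]
# A nearby peer qualifies; a distant one falls back to slow interaction:
# can_form_local_group(pairs, {("A", "B"): 2.0},  {("A", "B"): 1024.0}) -> True
# can_form_local_group(pairs, {("A", "B"): 80.0}, {("A", "B"): 1024.0}) -> False
```

A node failing this test is not excluded from the haptic world; it simply keeps the slower Table 1 synchronization, where it can touch objects but not exert forces on them.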

Figure 3: Haptic world with three local groups and a node requesting to create a local group (legend: data transfer at high rate; request to form a local group).
Figure 4: New local group, after the request in Figure 3 was processed.

We use an example to illustrate the issues involved in the management of local groups. Consider the haptic world shown in Figure 3. In this figure we see three local groups: LG-1 is formed by the set of objects {O-2, O-4}, LG-2 by {O-3, O-5, O-6}, and LG-3 by {O-7, O-8}. Note that the same node owns the objects O-2 and O-3, but they are part of distinct local groups. This means that, although belonging to the same node, these objects are not in close proximity and therefore their motions are independent. Suppose now that the user at the node that owns O-1 wants to use O-1 to manipulate the objects O-2, O-5, and O-7 (Figure 3). This requires the creation of a local group that contains T = {O-1, O-2, O-5, O-7}. However, since some of these objects are already part of other local groups, the old local groups LG-1, LG-2, and LG-3 must be destroyed and a new local group LG-4 must be created, containing the objects in T as well as those in the old local groups LG-1, LG-2, and LG-3 (Figure 4). This only occurs if the network connections between all the nodes that own the objects in question have sufficiently large bandwidth and sufficiently low latencies for the local group synchronization.

The pseudo-code in Table 5 implements the algorithm used to create a new local group. The pseudo-code in Table 3 also needs to be modified, as shown in Table 6, to process the requests generated by the algorithm in Table 5.

Input: T = {desired list of objects in new local group}

L = Expand(T)   % determine list of all objects that need
                % to be included in new local group
If Feasible(L)
{   % Only create local group if all nodes involved
    % satisfy the bandwidth and latency requirements
    For each l in L
        Request owner of object to destroy the local group to which it belongs
    For each l in L
        Request owner of object to create a local group for objects in L
    Return Success
}
Else
    Return Failure

Table 5: Pseudo-code to create a new local group.

{...}
While (node active)
{
    Listen for requests to:
        send node database
        send object database
        add/remove record to/from node database
        add/remove record to/from object database
        create/destroy a local group
}

Table 6: Modification of the pseudo-code in Table 3 to process the requests generated by Table 5.
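The Expand step of Table 5 can be sketched as a fixed-point union: any existing local group that shares an object with the desired set must be dissolved into the new group, and absorbing one group may pull in another. The function below is a hypothetical rendition of that step, not the system's code.

```python
def expand(desired, existing_groups):
    """Compute the full object set for a new local group.

    Starting from the desired set T, repeatedly absorb every existing
    local group that intersects the result, until no group is left
    half-in and half-out.
    """
    result = set(desired)
    changed = True
    while changed:
        changed = False
        for group in existing_groups:
            g = set(group)
            if (result & g) and not g <= result:
                result |= g
                changed = True
    return result

# The example from the text: T = {O-1, O-2, O-5, O-7} intersects LG-1,
# LG-2 and LG-3, so the new group LG-4 contains all eight objects.
groups = [{"O-2", "O-4"}, {"O-3", "O-5", "O-6"}, {"O-7", "O-8"}]
new_group = expand({"O-1", "O-2", "O-5", "O-7"}, groups)
```

Feasibility would then be checked over the owners of everything in the expanded set, since the fast synchronization must hold between all of them.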

    5. Conclusions and Future Work

We proposed an architecture for the real-time collection and simultaneous broadcast of haptic information to multiple haptic session participants, so that collaborative exploration of objects is possible even when users are distributed across a network. The architecture relies on two distributed databases: the node database and the object database. These two databases are dynamic and need to be kept coherent among all nodes in the virtual haptic world. We presented pseudo-code for the algorithms that keep these databases synchronized. These algorithms are independent of the actual haptic devices employed by each user.

In future work, we hope to make significant progress on the registration of the haptic display systems in collaborative networked environments. We will also examine the entities necessary to achieve networked collaboration with disparate haptic devices (pen-based versus glove-based, small versus large workspace). We plan to address not only integration issues but also questions related to the interaction process itself, including feelings of co-presence and performance satisfaction, and how these variables are affected by the exploration modality (vision, vision plus haptics, or haptics only). Another line of research is the development of force control algorithms tailored to a distributed haptic environment. These algorithms must be robust with respect to the stochastic delays caused by the communication network.

    6. References

[1] C. Basdogan, C. Ho, M. Slater, and M. A. Srinivasan, "The Role of Haptic Communication in Shared Virtual Environments," PHANToM Users Group, 1998 (http://www.sensable.com/community/PUG98/19_basdogan.pdf).

[2] C. Ho, C. Basdogan, M. Slater, N. Durlach, M. A. Srinivasan, "An Experiment on the Influence of Haptic Communication on the Sense of Being Together," 1998 (http://www.cs.ucl.ac.uk/staff/m.slater/BTWorkshop/touchexp.html).

[3] N. Durlach and M. Slater, "Presence in Shared Virtual Environments and Virtual Togetherness," 1998 (http://www.cs.ucl.ac.uk/staff/m.slater/BTWorkshop/durlach.html).

[4] Buttolo, P., J. Hewitt, R. Oboe, and B. Hannaford, "Force Feedback in Virtual and Shared Environments," 1997 (http://brl.ee.washington.edu/BRL/publications/Rep100.ps).

[5] Wilson, J. P., Kline-Schoder, R. J., Kenton, M. A., and Hogan, N., "Algorithms for Network-Based Force Feedback," in Salisbury, J. K. and Srinivasan, M. A. (Eds), Proc. PHANTOM Users Group Workshop, MIT, 1999.

[6] McLaughlin, M. L., Sukhatme, G., Hespanha, J., Shahabi, C., Ortega, A., and Medioni, G., "The Haptic Museum," Proc. EVA 2000 Conference on Electronic Imaging and the Visual Arts, 2000.

[7] McLaughlin, M. L., G. Sukhatme, J. Hespanha, C. Shahabi, and A. Ortega, "Touch in Immersive Environments," Proc. EVA 2000 Conference on Electronic Imaging and the Visual Arts, Edinburgh, Scotland, 2000.

Using Haptics and Sound in a Virtual Gallery

Bridget Baird
Center for Arts & Technology
Connecticut College

Introduction

Galleries are traditionally places for visual exploration of objects; concert halls provide auditory exploration. The tools of virtual reality allow for a new kind of gallery: one that encompasses features of a traditional visual museum, means for auditory discovery, and, in addition, haptic exploration. The user is invited to browse through this virtual gallery, interacting with the objects, feeling their textures, listening to their audio properties, moving around and inside them. All of this takes place in an interactive, 3D environment where the user navigates and explores with her eyes, ears, and hands.

The Center for Arts and Technology at Connecticut College has provided an atmosphere for interactive collaboration. In the past, many of our projects have involved the interaction between a 3D, visually rich world and sound. The possibilities for haptic exploration in such a setting were very intriguing, not just for the sake of expanding our digital sensory capabilities to include touch, but also because of the opportunity to explore the interactions and reinforcement among the senses.

    Description

    The virtual gallery is set inside a hemisphere, with multi-modal sculptures placed near the edge. From the center, where the user enters, the objects are not all clearly visible, although light and vague forms give clues to their presence. At the center of the world the user experiences a dark and foggy atmosphere, with rumbling sounds. As she moves towards the edges of the world there is more light as well as visual and auditory clarity. Nearing an object causes it to come into focus, both visually and through sound. The rumbling noise that she experiences in the center of the world lessens and begins to blend with the individual auditory properties associated with the object itself. Navigation is by means of a joystick, which allows her to move forward or backward, up or down, and yaw from side to side. The user also has stereo eyeglasses, which increase the feeling of immersion in the three-dimensional world. The user's right hand controls a phantom. The model for navigation is that of a person walking through a world, holding a hand out in front. In this virtual world there is no gravity, and both the objects and the user can float in space. Other laws of physics are partially observed, depending on the object: some can be bumped into with the phantom, some are impenetrable, others can be entered.

    The virtual gallery is programmed in Visual C++ on a PC and uses Sense8's World Toolkit libraries as well as the Ghost 3 libraries. Models were created in dxf format and then translated to VRML2, for both the haptic and visual worlds. Because of the addition of the World Toolkit libraries it was necessary to create dual haptic and graphic worlds.

    Haptic Interactions

    Each of the five objects possesses visual, audio, and tactile properties. As the user nears the object, the sound(s) associated with that object become louder. The audio is spatialized so that


    as the user moves around the world, the audio changes. Each object also has haptic characteristics. All of these characteristics interact with each other.

    One of the objects is multi-faceted and has many angles on its surfaces. The sound associated with that object is voices which speak and overlap. The surface of the object has some give to it, so the user can not only feel the object, but push on it. It is also possible to push through and enter the object, either exploring solely with the phantom, or moving hand and head into the object. If the user enters this object with her head (ears), the audio files exhibit reverberation and the voices seem to echo and bounce off the interior walls. On the inside of the object the user is also able to explore or push back through to the outside. Another of the objects has two moving arms mounted on a central cylinder. Contact with the arms causes them to start swinging and also triggers their individualized sounds. The arms, because they can be in motion, have no haptic properties, although the central cylinder does. Another object consists of interlocking rings. When the user enters the negative space of the holes in the rings she experiences a haptic buzzing, which is accompanied by a sound. The rings themselves have a rather hard surface, and as the user presses on them with the phantom she causes a change to a higher pitch in their sound. Pressure on the rings and the rise in pitch are correlated on a sliding scale. One of the other objects is in motion as the user approaches, with the haptics turned off. When the user touches it (as determined in the graphics world) the haptics are activated and the user is able to explore its surface. Another object has hollow bumps or hemispheres on its surface, and as the user goes up inside these hemispheres the viscosity of the phantom increases while at the same time the pitch of the associated sound is lowered. This causes the sensation of moving inside a thick substance where time has slowed. Throughout all of our design we have looked for ways in which the multiple senses can interact and reinforce one another.
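    The distance-to-loudness crossfade and the sliding-scale pressure-to-pitch mapping described above can be sketched as simple interpolation rules. The following is a hypothetical illustration in Python (the gallery itself was written in Visual C++); the function names and all numeric ranges are assumptions, not values from the actual system:

    ```python
    def audio_gain(dist, near=1.0, far=10.0):
        """Blend an object's sound in as the user approaches: full volume
        at distance `near`, silent beyond `far` (linear crossfade)."""
        t = (far - dist) / (far - near)
        return max(0.0, min(1.0, t))

    def ring_pitch(pressure, base_hz=220.0, max_shift=2.0, max_force=3.0):
        """Sliding-scale mapping from phantom pressure on the rings to
        pitch: harder presses raise the pitch, up to max_shift times
        the base frequency at max_force."""
        t = max(0.0, min(1.0, pressure / max_force))
        return base_hz * (1.0 + (max_shift - 1.0) * t)
    ```

    In the real gallery the gain and pitch values would be fed each frame to the spatialized audio engine rather than returned to the caller.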

    Phantom Challenges

    The necessity for including virtual reality libraries made it impossible to use the OpenGL Ghost environment. Thus it was necessary to create dual haptic and graphic environments. This situation was made more complex because Ghost and Sense8 have different coordinate conventions.

    A more serious difficulty arose in setting up navigation for the phantom. Because the user is free to move around the world, it was necessary to translate the center of the phantom (in the virtual world) and also adjust its direction. Navigation is by means of a joystick held in the left hand; the joystick represents the head (and body) of the user. We restricted navigation to forward/backward, yawing from side to side, and up/down elevator motion (using a button). This navigation seemed simple enough to master quite easily, and flexible enough to move the user where she wishes to go with relative ease. The right hand operates the phantom; the visual point of contact (in the virtual world) of the phantom is represented by a small, slightly glowing sphere. The paradigm is that of a human: the right hand can move, but it is attached to the body, and so as the body moves through the gallery, the hand moves with it, maintaining its relative position. The difficulty that this presented was that movement with the joystick could easily cause the phantom to come in contact with an object, causing a conflict. Initially this gave rise to a constant vibration in the phantom. The (somewhat OK) programming fix was to first check whether the phantom is in contact with any of the objects. If it is, we do not update its


    position. If not, and if either the magnitude of the translation or the magnitude of the directional shift is greater than some fixed lower bound, we update the phantom. Some slight adjustments had to be made for the particular object into which the user can enter: in this case we had to set touchableByPhantom to FALSE, move the phantom, and then set it back to TRUE. For all of the objects we still encountered the problem of coming up in the middle of an object and thus causing excessive force, but could not find a good way around this. Setting the objects so they can't be felt by the phantom begs the question. We thought of moving the phantom away from the object slightly so that we wouldn't have immediate contact, but then we ran into the problem of what constitutes the inside and the outside of objects, and this question is already vexing enough for our graphics. The problem of navigation is certainly the largest one we encountered, and we have not solved it to our satisfaction.
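    The workaround above amounts to a guarded update: freeze the phantom's navigation translation while it is touching anything, and otherwise apply it only when the motion exceeds a small lower bound. A minimal Python sketch of that logic (the actual fix was written against the GHOST API; the Phantom class here is a hypothetical stand-in):

    ```python
    import math

    def magnitude(v):
        return math.sqrt(sum(c * c for c in v))

    class Phantom:
        """Minimal stand-in for the haptic proxy in the virtual world."""
        def __init__(self):
            self.pos = [0.0, 0.0, 0.0]
            self.touching = False
        def in_contact(self):
            return self.touching
        def move(self, translation):
            self.pos = [p + t for p, t in zip(self.pos, translation)]

    def update_phantom(phantom, translation, min_trans=0.01):
        """Guarded navigation update: skip the translation while the
        phantom touches an object (suppresses the contact vibration),
        and otherwise apply it only above a small threshold."""
        if phantom.in_contact():
            return False
        if magnitude(translation) > min_trans:
            phantom.move(translation)
            return True
        return False
    ```

    For the enterable object, the same update would be bracketed by toggling touchableByPhantom off and back on, as described above.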

    Other problems were more standard. We get a low frame rate, even with a dual processor and a graphics card. This is probably because, in addition to the graphics and haptics, we are doing real-time sound processing: spatializing, mixing sounds together, changing pitches, adding reverberations, etc. Another problem is the proliferation of devices: manipulating a world that includes eyeglasses or an HMD, sound, a joystick, and a phantom creates a cumbersome, but manageable, environment.

    Future Research

    The project we are currently working on is really a test bed for using haptics, vision, and audio in a 3D world. We have been very pleased with the results so far. We are particularly excited about the opportunities that haptics presents in these types of arts and technology collaborations. There are some additional tests we would like to do in the present project to make the interface run more smoothly and to improve the frame rate. Future work of this nature includes a project to explore an abandoned abbey (Tintern) by re-creating the original structure, allowing the user to explore and feel. We will use textures from photographs, and we envision the user being able to touch the walls, causing them to crumble into the ruins of today. We have also been talking with colleagues about the possibility of using haptics to "feel" the envelopes of musical sound, manipulating audio properties in an interactive manner. The Center for Arts and Technology is an ideal venue for taking haptics in some of these new directions.

    Bridget Baird, Professor of Mathematics and Computer Science
    Judith Ammerman '60, Director of the Center for Arts and Technology
    Connecticut College
    New London, CT 06320
    (860) 439-2008
    http://math.conncoll.edu/faculty/[email protected]

    Collaborators: Ozgur Izmirli programmed the audio portion, David Smalley modeled the 3D objects, and Don Blevins assisted with general programming.


  • Pressure Masks for Point-like Contact with Elastic Models

    Doug L. James¹ and Dinesh K. Pai¹,²

    University of British Columbia

    Abstract

    In this paper, we introduce pressure masks for supporting the convenient abstraction of localized scale-specific point-like contact with a discrete elastic object. While these masks may be defined for any elastic model, special attention is given to the case of point-like contact with precomputed linear elastostatic models for purposes of haptic force-feedback. Much of this material is presented in much greater detail in [JP]; throughout, an identical notation is used.

    1 Introduction

    It has long been recognized that point contact is a convenient abstraction for haptic interactions, and the PHANToM(TM) haptic interface is a testament to that fact. While it is possible to consider the contact area to be truly a point for rigid models, this is not possible for elastic models, as infinite contact pressure can lead to various inconsistencies. The solution is simply to assume the contact zone has tractions distributed over a finite surface area. We propose to do this efficiently and consistently by introducing pressure masks for defining nodal traction distributions. This addresses at least two core issues. First, having a point contact with force distributed over a finite area is somewhat contradictory, and the traction distribution is effectively an underdetermined quantity without any inherent spatial scale. This is resolved by treating the contact as a single displacement constraint whose traction distribution enters as a user (or manipulandum) specified parameter. The distribution of force on the surface of the model can then be consistently specified in a fashion which is independent of the scale of the mesh. Second, given the model is discrete, special care must be taken to ensure a sufficiently regular force response on the surface, since irregularities are very noticeable during sliding contact motions. By suitably interpolating nodal force responses, displacement constraints can be imposed which will result in regular haptic force-feedback.

    The pressure mask approach is particularly effective for haptics when used with linear elastostatic models with precomputed Green's functions, since force response can usually be computed at O(1) cost. In §2, minimal definitions and notation for discussing the elastostatic model are presented. Afterwards, in §3, the construction and definition of pressure masks is given, and it is shown how to compute nodal (or vertex) stiffnesses for elastostatic models and then use these to consistently define the surface's stiffness.

    Figure 1: Point Contact Must Not be Taken Literally for Elastic Models: This figure illustrates the development of a displacement singularity associated with a concentrated surface force as the continuum limit is approached. In the left image, a unit force applied to a vertex of a discrete elastic model results in a finite vertex displacement. As the model's mesh is refined (middle and right images), the same concentrated force load eventually tends to produce a singular displacement at the contact location, and the stiffness of any single vertex approaches zero (see Figure 5).

    2 Linear Elastostatic Model Background

    Precomputed linear elastostatic models of various discretization origins are efficient candidates for real time haptic interaction [BC96, JP99, JP]. A general boundary Green's function description is now very briefly presented for use in §3.

    2.1 Nodal Displacement and Traction Variables

    Consider a discrete elastic model with n surface nodes, e.g., polyhedral mesh vertices, for which nodal quantities are defined. Specifically, let the surface displacement u(x) and traction p(x) fields be parametrized by n-vectors of nodal variables,

        u = [u_1, ..., u_n]^T    (1)
        p = [p_1, ..., p_n]^T    (2)

    where each of the values u_k and p_k belongs to R^3. Since our boundary element implementation uses vertex-based triangle mesh models, we shall often refer to a node as a vertex.

    ¹ Institute of Applied Mathematics
    ² Dept. of Computer Science, {djames|pai}@cs.ubc.ca

    2.2 Reference Boundary Value Problem (RBVP) Definition

    A major benefit of using linear elastostatic models for haptics is that it is possible to precompute the Green's functions to one particular class of boundary value problem (BVP), a relevant reference BVP (RBVP), and be able to efficiently compute components of those solutions rapidly at run time (see Figure 2). Note that this applies to any discrete linear elastostatic model, regardless of internal material properties or the discretization technique employed.

    Without loss of generality, assume that either position or traction constraints are specified at each boundary node. Let the mutually exclusive nodal index sets Λ0_u and Λ0_p specify nodes with displacement and traction constraints, respectively, so that Λ0_u ∩ Λ0_p = ∅ and Λ0_u ∪ Λ0_p = {1, 2, ..., n}. Specifying boundary values at each of the n nodes defines a BVP to be solved for desired unknown variables, e.g., haptic contact forces, at each step of the simulation. Denote the unspecified and complementary specified nodal variables by

        v_j = { p_j : j ∈ Λ0_u        v̄_j = { ū_j : j ∈ Λ0_u
              { u_j : j ∈ Λ0_p   and       { p̄_j : j ∈ Λ0_p    (3)

    respectively.

    Figure 2: Reference Boundary Value Problem (RBVP) Example: The RBVP associated with a model attached to a rigid support is shown with boundary regions having fixed (Λ0_u) or free (Λ0_p) nodal constraints indicated. A typical haptic simulation would then impose contacts on the free boundary nodes, Λ0_p.

    2.3 RBVP Solution using Green's Functions

    The general solution of the RBVP is conveniently expressed using Green's functions of the RBVP as

        v = Σ_{j ∈ Λ0_u} ξ_j ū_j + Σ_{j ∈ Λ0_p} ξ_j p̄_j = Ξ v̄    (4)

    where the reference system Green's functions (RSGFs) are the block columns of the matrix

        Ξ = [ξ_1 ξ_2 ... ξ_n] ∈ R^{3n×3n}.    (5)

    The jth RSGF describes the effect of the jth node's specified boundary value, v̄_j. In practice it is only necessary to compute RSGFs for nodes which may have changing nonzero boundary values during the simulation. Since the RSGFs only depend on the RBVP and the geometric and material properties of the deformable object, they may be precomputed for use in a simulation.

    3 Surface Stiffness Models for Point-like Contact

    This section presents the pressure mask approach for elastic models (§3.1), then specializes to linear elastostatic models, for which the pressure masks may be used to compute vertex stiffnesses (§3.2) which are in turn used to compute the surface stiffness (§3.3).

    3.1 Pressure Masks for Distributed Point-like Contacts

    In this section, pressure masks are defined and used to specify the traction distribution associated with force applied via a masked vertex constraint.

    3.1.1 Discrete Traction Space Definitions

    In order to characterize traction distributions for the discussion of mask construction and the smoothness of force response, it is necessary to define a discrete scalar function space, L, on the model's boundary, Γ. Let

        L = span{ φ_j(x), j = 1, ..., n, x ∈ Γ },    (6)

    where φ_j(x) is a scalar basis function associated with the jth node. The traction field is then a vector function whose components lie in L, p(x) = Σ_j φ_j(x) p_j.

    Figure 3: Collocated Scalar Masks: A direct means for obtaining a relative pressure amplitude distribution about each node is to employ a user-specified scalar functional of the desired spatial scale. The scalar pressure mask is then given by nodal collocation (left), after which the vector traction distribution associated with a nodal point load is computed as the product of the applied force vector and the (compactly supported) scalar mask (right).

    3.1.2 Pressure Mask Definition

    Scalar relative pressure masks provide a flexible means for modeling vector pressure distributions associated with each node. This allows a force applied at the ith
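    Because the solution (4) is linear in the specified boundary values, only the block columns of the nodes with nonzero boundary values need to be touched at run time, which is what makes the O(1)-per-node force response possible. A small numpy sketch of this evaluation, using a random matrix as a stand-in for a real precomputed RSGF matrix (illustration only, not the authors' implementation):

    ```python
    import numpy as np

    n = 8                                       # surface nodes; 3x3 blocks
    rng = np.random.default_rng(0)
    Xi = rng.standard_normal((3 * n, 3 * n))    # stand-in RSGF matrix

    def rbvp_solution(Xi, specified):
        """Evaluate v = sum_j xi_j * vbar_j (cf. eq. (4)) using only the
        block columns of nodes with nonzero specified boundary values."""
        v = np.zeros(Xi.shape[0])
        for j, vbar_j in specified.items():     # j: node index, vbar_j in R^3
            v += Xi[:, 3 * j:3 * (j + 1)] @ vbar_j
        return v

    # a single contacted node => a single block-column product
    f = np.array([0.0, 0.0, 1.0])
    v = rbvp_solution(Xi, {2: f})

    # identical to the full matrix-vector product with a sparse vbar
    vbar = np.zeros(3 * n)
    vbar[6:9] = f
    assert np.allclose(v, Xi @ vbar)
    ```

    With a handful of contacted nodes, the per-step cost is independent of the total number of surface nodes touched only through the few active block columns.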

    node to generate a traction distribution which is a linear combination of {φ_j(x)} and not just φ_i(x).

    In the continuous setting, a scalar surface density ρ(x): Γ → R will relate the localized contact force f to the applied traction p via

        p(x) = ρ(x) f,

    which in turn implies the normalization condition

        ∫_Γ ρ(x) dΓ_x = 1.    (7)

    In the discrete setting, the surface density on Γ is

        ρ(x) = Σ_j φ_j(x) ρ_j ∈ L,    (8)

    and is parameterized by the scalar pressure mask vector,

        ρ = [ρ_1, ρ_2, ..., ρ_n]^T.

    Substituting (8) into (7), the discrete normalization condition satisfied becomes

        a^T ρ = 1,    (9)

    where

        a_i = ∫_Γ φ_i(x) dΓ_x    (10)

    defines the vertex area. Notice that the mask density ρ has units of 1/area.

    In practice, the vertex pressure mask ρ may be specified in a variety of ways. It could be specified at runtime, e.g., as the byproduct of a physical contact mechanics solution, or be a user specified quantity. We shall consider the case where there is a compactly supported scalar function ρ(x) specified at each vertex on the free boundary. The corresponding pressure mask ρ may then be defined using nodal collocation (see Figure 3),

        ρ_j = { ρ(x_j) : j ∈ Λ0_p
              { 0      : j ∈ Λ0_u,

    followed by suitable normalization to satisfy (9). In the following, denote the density mask for the ith vertex by the n-vector ρ^i, with nonzero values being indicated by the set of masked nodal indices M_i. Since the intention is to distribute force on the free boundary, masks will only be defined for i ∈ Λ0_p. Additionally, these masks will only involve nodes on the free boundary, M_i ⊆ Λ0_p, as well as be nonempty, |M_i| > 0.

    3.1.3 Example: Spherical Mask Functionals

    Spherically symmetric mask functionals with a scale parameter were suitable candidates for constructing vertex masks via collocation on smooth surfaces. One example, which was commonly used (see Figures 4 and 6), is a functional with linear radial dependence,

        ρ(x; r) = { 1 - |x - x_i|/r : |x - x_i| < r
                  { 0              : otherwise,

    where r specifies the radial scale¹. The effect of changing r is shown in Figure 4.

    ¹ r may be thought of as the size of the haptic probe's tip.

    Figure 4: Illustration of Changing Mask Scale: Exaggerated pulling deformations clearly illustrate different spatial scales in the underlying traction distribution. In each case, pressure masks were automatically generated using the linear spherical mask functional (see §3.1.3) for different values of the radius parameter, r. This example shows (left) a single vertex mask, and (right) a mask involving several nearby vertices. Note that in each case the surface has been once refined using Loop subdivision.

    3.2 Vertex Stiffnesses using Pressure Masks

    Having consistently characterized point-like force loads using vertex pressure masks, it is now possible to calculate the stiffness of each vertex. In the following sections, these vertex stiffnesses will then be used to compute the stiffness at any point on the model's surface for haptic rendering of point-like contact.

    3.2.1 Elastic Vertex Stiffness

    For any single node, i, on the free, i ∈ Λ0_p, or rigidly fixed boundary, i ∈ Λ0_u, a finite force stiffness, K_i ∈ R^{3×3}, may be associated with its displacement, i.e., f = K_i u_i. Given a force f applied at vertex i ∈ Λ0_p, the corresponding distributed traction constraints are

        p̄_j = ρ^i_j f.    (11)

    Then using (4), the displacement of the ith vertex is

        u_i = Σ_{j ∈ M_i} Ξ_ij p̄_j = ( Σ_{j ∈ M_i} ρ^i_j Ξ_ij ) f,

    so that the effective stiffness of the masked vertex is

        K_i = ( Σ_{j ∈ M_i} ρ^i_j Ξ_ij )^{-1},  i ∈ Λ0_p.    (12)

    It follows from (4) and (11) that the corresponding globally consistent solution is

        v = ( Σ_{j ∈ M_i} ρ^i_j ξ_j ) f = ζ_i f,    (13)
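    The collocate-then-normalize recipe of §3.1.2–§3.1.3 and the masked vertex stiffness of eq. (12) can be sketched numerically as follows. This is an illustration only, using lumped (per-vertex) areas and random diagonally dominated matrices as stand-ins for the RSGF blocks Ξ_ij; none of the data corresponds to a real model:

    ```python
    import numpy as np

    def linear_spherical_mask(verts, areas, i, r):
        """Collocate rho(x; r) = max(0, 1 - |x - x_i|/r) at the vertices,
        then normalize so that a^T rho = 1 (eq. (9))."""
        d = np.linalg.norm(verts - verts[i], axis=1)
        rho = np.maximum(0.0, 1.0 - d / r)
        return rho / (areas @ rho)

    def masked_vertex_stiffness(Xi_blocks, rho, i):
        """K_i = (sum_{j in M_i} rho_j * Xi_ij)^(-1), eq. (12);
        M_i is the support of the mask rho."""
        C = sum(rho[j] * Xi_blocks[i][j] for j in np.nonzero(rho)[0])
        return np.linalg.inv(C)

    # tiny example: 4 vertices along a line, unit vertex areas
    verts = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0], [3.0, 0, 0]])
    areas = np.ones(4)
    rho = linear_spherical_mask(verts, areas, 1, r=1.5)
    assert np.isclose(areas @ rho, 1.0)   # discrete normalization (9)
    assert rho[3] == 0.0                  # compact support outside radius r

    rng = np.random.default_rng(1)
    Xi_blocks = rng.standard_normal((4, 4, 3, 3)) + 3.0 * np.eye(3)
    K1 = masked_vertex_stiffness(Xi_blocks, rho, 1)
    assert K1.shape == (3, 3)
    ```

    A real implementation would evaluate the vertex areas a_i by integrating the basis functions over Γ as in (10); the unit areas above stand in for that step.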

    where ζ_i is the convolution of the RSGFs with the mask and characterizes the distributed force load.

        # Vertices    Single ||K||_2    Masked ||K||_2
        34            7.3               13.3
        130           2.8               11.8
        514           1.1               11.2

    Figure 5: Vertex Stiffness Dependence on Mesh Resolution: This table shows vertex stiffness magnitudes (arbitrary units) for a BEM model at three different Loop subdivision mesh resolutions. The stiffness corresponding to a single vertex constraint exhibits a large dependence on mesh resolution, and has a magnitude which rapidly decreases to zero as the mesh is refined. On the other hand, the stiffness generated using a vertex pressure mask (collocated linear sphere functional (see §3.1.3) with radius equal to the coarsest mesh's mean edge length) has substantially less mesh dependence, and quickly approaches a nonzero value.

    3.2.2 Rigid Vertex Stiffness

    For surfaces of rigid models, a finite force response may be defined using an isotropic stiffness matrix,

        K^Rigid = k^Rigid I_3 ∈ R^{3×3}.

    This is useful for defining a response at position constrained vertices of a deformable model,

        K_j = K^Rigid,  j ∈ Λ0_u,    (14)

    for determining contact responses on neighbouring triangles which are not rigid.

    3.3 Surface Stiffness from Vertex Stiffnesses

    Given the vertex stiffnesses, {K_j}, the surface stiffness is defined using nodal interpolation

        K(x) = Σ_{j=1}^{n} φ_j(x) K_j,  x ∈ Γ,    (15)

    so that (K(x))_kl ∈ L. Note that there are usually only a small number of nonzero terms in the sum of (15). In this way, the surface stiffness may be continuously defined using only |Λ0_p| free boundary vertex stiffnesses and a single rigid stiffness parameter, k^Rigid, regardless of the extent of the masks. The benefit of pressure masks is clearly visible in Figure 6 for piecewise linear φ.

    It follows [JP] that the global deformation corresponding to the displacement constraint ū_i applied on the free boundary at x ∈ Γ is

        v = Σ_{i ∈ Λ0_p} φ_i(x) ζ_i f = ζ(x) f.    (16)

    We note that this may be interpreted as an elastostatic generalization of force shading [MS96].

    4 Summary and Conclusion

    We have introduced pressure masks for the consistent definition of forces arising from point-like haptic interactions. This leads to a computationally efficient means for obtaining regular surface force responses from discrete elastostatic models. Experiments using a PHANToM(TM) interface confirmed that the pressure masks produced a perceptible improvement.

    References

    [BC96] Morten Bro-Nielsen and Stephane Cotin. Real-time volumetric deformable models for surgery simulation using finite elements and condensation. Computer Graphics Forum, 15(3):57-66, August 1996.

    [JP] Doug L. James and Dinesh K. Pai. A unified treatment of elastostatic contact simulation for real time haptics. To appear.

    [JP99] Doug L. James and Dinesh K. Pai. ArtDefo: Accurate Real Time Deformable Objects. Computer Graphics, 33 (Annual Conference Series): 65-72, 1999.

    [MS96] Hugh B. Morgenbesser and Mandayam A. Srinivasan. Force shading for haptic shape perception. In Proceedings of the ASME Dynamics Systems and Control Division, volume 58, 1996.

    (a) a(x)    (b) ||K(x)||_2    (c) ||K(x)||_2

    Figure 6: Effect of Pressure Masks on Surface Stiffness: Even models with reasonable mesh quality, such as this simple BEM kidney model, can exhibit haptically perceptible surface stiffness irregularities when single-vertex stiffnesses are used. A plot (a) of the vertex area, a, clearly indicates regions of large (dark red) and small (light blue) triangles. In (b) the norm of the single-vertex surface stiffness, ||K(x)||_2, reveals a noticeable degree of mesh-related stiffness artifacts. On the other hand, the stiffness plotted in (c) was generated using a pressure mask (collocated linear sphere functional (see §3.1.3) of radius twice the mesh's mean edge length) and better approximates the regular force response expected of such a model.
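    For piecewise linear φ on a triangle mesh, the nodal interpolation (15) reduces, within one triangle, to barycentric blending of the three incident vertex stiffnesses. A hypothetical numpy sketch of that reduction (stand-in stiffness values, not from a real model):

    ```python
    import numpy as np

    def surface_stiffness(bary, K_tri):
        """K(x) = sum_j phi_j(x) K_j restricted to one triangle: `bary`
        holds the barycentric coordinates of x (the piecewise linear
        phi_j evaluated at x), `K_tri` the three vertex stiffnesses."""
        return sum(w * K for w, K in zip(bary, K_tri))

    # stand-in isotropic vertex stiffnesses for one triangle
    K_tri = [np.eye(3) * k for k in (1.0, 2.0, 4.0)]

    # at a vertex, the interpolant reproduces that vertex's stiffness
    assert np.allclose(surface_stiffness([1.0, 0.0, 0.0], K_tri), K_tri[0])

    # at the centroid it is the average of the three
    Kc = surface_stiffness([1 / 3, 1 / 3, 1 / 3], K_tri)
    assert np.allclose(Kc, np.eye(3) * (7.0 / 3.0))
    ```

    Because only the three basis functions of the containing triangle are nonzero at x, this is exactly the "small number of nonzero terms" noted after (15), and it is what keeps the stiffness evaluation cheap during sliding contact.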

  • LEM - An approach for physically based soft tissue simulation suitable for haptic interaction *

    Remis BALANIUK, Center for Advanced Techn. in Surgery, Stanford University, remis@robotics.stanford.edu
    Ivan F. COSTA, SHARP/GRAVIR, INRIA Rhône-Alpes, Ivan-Ferreira.Costa@inrialpes.fr

    Abstract

    This paper presents LEM - Long Elements Method, a new approach for physically based simulation of deformable objects, suitable for real time animation and haptic interaction. The method implements a static solution for elastic global deformations of objects filled with fluid, based on Pascal's principle and volume conservation. The physics of the objects are modeled using bulk variables: pressure, density, volume and stress. The volumes are discretised in long elements. This discretisation has two main advantages: the number of elements used to fill an object is one order of magnitude less than in a discretisation based on tetrahedral or cubic elements; and the graphic and the haptic feedback can be directly derived from the elements, and no intermediate geometric representation