Integrating Gyroscopes into Ubiquitous Tracking Environments

Daniel Pustka, Manuel Huber, Gudrun Klinker

Institut für Informatik, Technische Universität München

ABSTRACT

It is widely recognized that inertial sensors, in particular gyroscopes, can improve the latency and accuracy of orientation tracking by fusing the inertial measurements with data from other sensors. In our previous work, we introduced the concepts of spatial relationship graphs and spatial relationship patterns to formally model multi-sensor tracking setups and derive valid applications of well-known algorithms in order to infer new spatial relationships for tracking and calibration.

In this work, we extend our approach by providing additional spatial relationship patterns that transform incremental rotations and add gyroscope alignment and fusion. The usefulness of the resulting tracking configurations is evaluated in two different scenarios with both inside-out and outside-in tracking.

Keywords: Augmented Reality, Tracking, Calibration, Sensor Fusion, Gyroscopes, Inertial Sensors

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities; I.3.1 [Computer Graphics]: Hardware Architecture—Input devices

1 INTRODUCTION

In our previous work, we introduced the concepts of spatial relationship graphs (SRGs) [3] and spatial relationship patterns [4], which allow for formally modeling relationships between the different coordinate frames in a tracking setup and for describing the operations performed by a tracking/calibration algorithm. The goal of this work is to derive new spatial relationship patterns that allow us to integrate gyroscopes into our formal framework and to show that this results in useful sensor fusion configurations.

Related Work Hybrid tracking setups, consisting of inertial sensors combined with other means of tracking, are a well-studied field of research. Azuma [2] has introduced gyroscopes into AR, and the topic has been further investigated by many others. Inertial sensing has also enjoyed some attention in the robotics community.

2 SPATIAL RELATIONSHIP GRAPHS AND PATTERNS

A Spatial Relationship Graph (SRG) is a graph which captures the structure of a tracking environment. The nodes of the graph represent the coordinate frames or orientation-free points of real or virtual objects, while the directed edges represent the actual information about relationships between these objects.

A Spatial Relationship Pattern represents a subgraph template which captures the structural properties of an algorithm for tracking or calibration. Patterns have input (dashed lines) and output (solid lines) edges that describe the new relationships that can be inferred from given data by a particular algorithm. Starting from an initial SRG with only the pure tracking data, a chain of pattern applications can be used to construct a dataflow network that computes a particular relationship at runtime [4].
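
To illustrate how a pattern application adds inferred edges to an SRG, the following minimal Python sketch represents an SRG as a dictionary of labeled, directed edges and applies a single inversion pattern. The class and function names are our own assumptions for illustration and are not taken from the authors' framework.

    # Minimal sketch of an SRG and one pattern application (illustrative only;
    # the names and data structures are assumptions, not the authors' implementation).
    from dataclasses import dataclass, field

    @dataclass
    class SRG:
        # directed multigraph: (source frame, target frame) -> set of edge labels
        edges: dict = field(default_factory=dict)

        def add(self, src, dst, label):
            self.edges.setdefault((src, dst), set()).add(label)

        def has(self, src, dst, label):
            return label in self.edges.get((src, dst), set())

    def apply_inversion_pattern(srg):
        # If an input edge A->B labeled 'Rot' exists, infer the output edge B->A.
        inferred = []
        for (src, dst), labels in list(srg.edges.items()):
            if "Rot" in labels and not srg.has(dst, src, "Rot"):
                srg.add(dst, src, "Rot")
                inferred.append((dst, src, "Rot"))
        return inferred

    srg = SRG()
    srg.add("Tracker", "TrackedObject", "Rot")   # pure tracking data
    print(apply_inversion_pattern(srg))          # [('TrackedObject', 'Tracker', 'Rot')]

In a full system, each inferred edge would additionally carry the dataflow component that computes it at runtime; the sketch only shows the structural side of pattern matching.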

3 INCREMENTAL ROTATION

We start the discussion of gyroscope integration by deriving some basic patterns for transforming incremental rotations. Given two sequential rotations r_{t1} and r_{t2} at times t1 and t2, we can express r_{t2} by r_{t1} multiplied by an incremental rotation ∆r:

r_{t2} = r_{t1} · ∆r    where    ∆r = r_{t1}^{-1} · r_{t2}

In the SRG notation, we treat incremental rotations as separate edges, labeled ∆Rot. Absolute rotation edges are labeled Rot.

The first important transformation of relative orientation is the change of the target coordinate frame. For any given pair of rotations r and q, let r′ = r · q be the product of r and q, which effectively moves the target coordinate frame of the transformation r. As q is static in typical gyroscope scenarios, we can derive

∆r′ = q^{-1} · ∆r · q.
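
For completeness, the intermediate step (not spelled out in the original) follows directly from the definition of ∆r by applying r′_t = r_t · q at both time steps:

∆r′ = (r_{t1} · q)^{-1} · (r_{t2} · q) = q^{-1} · r_{t1}^{-1} · r_{t2} · q = q^{-1} · ∆r · q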

Similarly, if we let r′ = q · r, we can change the source coordinate frame of the rotation r. In this case we calculate the resulting incremental rotation in the transformed coordinate frame, ∆r′, again assuming that q is static, as

∆r′ = ∆r
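
Again, the step is immediate from the definition, since the static q cancels:

∆r′ = (q · r_{t1})^{-1} · (q · r_{t2}) = r_{t1}^{-1} · q^{-1} · q · r_{t2} = r_{t1}^{-1} · r_{t2} = ∆r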

This means that incremental rotations are valid for all source coordinate frames that are connected by static transformations.

The third transformation of incremental rotation we need is the inversion, i.e. the exchange of source and target coordinate frames. In order to compute ∆r′ of r′ = r^{-1}, we also need to know the absolute orientation:

∆r′ = r_{t1} · ∆r^{-1} · r_{t1}^{-1}
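
The intermediate step, again using ∆r′ = r′_{t1}^{-1} · r′_{t2} with r′_t = r_t^{-1}, is:

∆r′ = r_{t1} · r_{t2}^{-1} = r_{t1} · (r_{t1} · ∆r)^{-1} = r_{t1} · ∆r^{-1} · r_{t1}^{-1}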

The resulting spatial relationship patterns for transforming incremental rotation are displayed in figures 1 (a)-(c).
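
The three transformations are small enough to state in code. The following Python sketch implements them on unit quaternions; the helper functions and the (w, x, y, z) component order are our own choices for illustration and are not prescribed by the paper.

    import numpy as np

    def qmul(a, b):
        # Hamilton product of two quaternions in (w, x, y, z) order
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def qinv(q):
        # inverse of a unit quaternion (its conjugate)
        return np.array([q[0], -q[1], -q[2], -q[3]])

    def incremental(r_t1, r_t2):
        # Delta r such that r_t2 = r_t1 * Delta r
        return qmul(qinv(r_t1), r_t2)

    def change_target(dr, q):
        # pattern (a): Delta r' = q^-1 * Delta r * q
        return qmul(qmul(qinv(q), dr), q)

    def change_source(dr, q):
        # pattern (b): Delta r' = Delta r, valid because q is static
        return dr

    def invert(dr, r_t1):
        # pattern (c): Delta r' = r_t1 * Delta r^-1 * r_t1^-1
        return qmul(qmul(r_t1, qinv(dr)), qinv(r_t1))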

4 GYROSCOPE ALIGNMENT

Before the gyroscope can be fused with another tracking system, we need to compute the unknown but static transformation between the gyroscope and the object it is attached to. This is an instance of the so-called “hand-eye calibration” problem [1], for which the robotics community has developed a number of solutions. As only the rotation part needs to be computed, we use the first step of the Tsai-Lenz [6] algorithm, which is based on quaternions and is easy to implement. The spatial relationship pattern of the gyroscope calibration is shown in figure 1 (d).
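
The paper uses the first step of Tsai-Lenz; as a hedged illustration of the underlying problem rather than that exact algorithm, the sketch below estimates the static rotation x from corresponding incremental rotations a_i (tracker) and b_i (gyroscope) satisfying a_i · x = x · b_i, using a quaternion least-squares formulation solved by SVD.

    import numpy as np

    def lmat(q):
        # matrix of left quaternion multiplication: q * p == lmat(q) @ p, (w, x, y, z) order
        w, x, y, z = q
        return np.array([[w, -x, -y, -z],
                         [x,  w, -z,  y],
                         [y,  z,  w, -x],
                         [z, -y,  x,  w]])

    def rmat(q):
        # matrix of right quaternion multiplication: p * q == rmat(q) @ p
        w, x, y, z = q
        return np.array([[w, -x, -y, -z],
                         [x,  w,  z, -y],
                         [y, -z,  w,  x],
                         [z,  y, -x,  w]])

    def align_rotation(tracker_increments, gyro_increments):
        # Stack (lmat(a_i) - rmat(b_i)) x = 0 over all motion pairs and solve the
        # homogeneous least-squares problem; x is the right singular vector
        # belonging to the smallest singular value.
        rows = [lmat(a) - rmat(b)
                for a, b in zip(tracker_increments, gyro_increments)]
        _, _, vt = np.linalg.svd(np.vstack(rows))
        x = vt[-1]
        return x / np.linalg.norm(x)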

5 GYROSCOPE FUSION

In order to fuse the incremental gyroscope measurements with those of an absolute tracker, we use an extended Kalman filter similar to the one described by Azuma [2]. However, we distinguish between the two cases of outside-in and inside-out tracking.


Figure 1: New spatial relationship patterns for dealing with incremental rotations: (a) target coordinate transformation, (b) source coordinate transformation, (c) incremental rotation inversion, (d) gyroscope calibration, (e) outside-in fusion, (f) inside-out fusion

While the outside-in case is straightforward, the inside-out fusion filter uses a different motion model that explicitly takes into account the fact that a rotation of the object also results in a translation of the fixed world coordinate frame as observed by the tracker. The spatial relationship patterns for gyroscope outside-in and inside-out fusion, respectively, are shown in figures 1 (e) and (f). It can be seen that for inside-out tracking, the gyroscope edge is inverted compared to the absolute tracker.
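
The paper's fusion filter is an extended Kalman filter with a case-specific motion model. The sketch below is a deliberately simplified stand-in that only illustrates the data flow of such a fusion: high-rate gyroscope increments propagate the orientation estimate, and low-rate absolute tracker measurements pull it back to limit drift. The class, method, and parameter names are hypothetical.

    import numpy as np

    def qmul(a, b):
        # Hamilton product, (w, x, y, z) order
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def blend(q, target, alpha):
        # normalized linear interpolation towards target (adequate for small corrections)
        if np.dot(q, target) < 0:
            target = -target          # take the shorter arc
        q = (1.0 - alpha) * q + alpha * target
        return q / np.linalg.norm(q)

    class OrientationFusion:
        # gyroscope increments predict; absolute tracker measurements correct drift
        def __init__(self, q0, gain=0.05):
            self.q = np.asarray(q0, dtype=float)
            self.gain = gain          # strength of the absolute-tracker correction

        def predict(self, delta_r):
            # high-rate path: apply the incremental rotation from the gyroscope
            self.q = qmul(self.q, np.asarray(delta_r, dtype=float))
            self.q /= np.linalg.norm(self.q)

        def correct(self, r_tracker):
            # low-rate path: move the estimate towards the absolute measurement
            self.q = blend(self.q, np.asarray(r_tracker, dtype=float), self.gain)

A proper EKF would additionally carry an error-state covariance and, in the inside-out case, the translation induced in the observed world frame; the sketch omits both.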

6 EVALUATION

We evaluated the gyroscope calibration and fusion in two different scenarios. We first attached the gyroscope to a camera that tracked a square marker, similar to the AR-Toolkit. Four different camera motion sequences were recorded, and the prediction error was computed, i.e. how well the Kalman filter was able to predict the next measurement of the absolute tracker. Table 1 shows that using the gyroscope improves the prediction in all cases. Furthermore, the inside-out motion model is significantly better in the “still” sequence, where the camera does not move and is placed directly in front of the marker, resulting in poor orientation estimation.

                    outside-in              inside-out
                 w/ gyro   w/o gyro     w/ gyro   w/o gyro
still               22.8       68.0         1.2        1.2
slow rotation        2.6        5.6         3.0        5.2
fast rotation        9.1       94.1        10.8       91.0
full motion          6.5       11.0         7.3        9.3

Table 1: Average prediction error in pixels

In the second evaluation scenario, the gyroscope was attached to a head-mounted laser projector [5], tracked by an A.R.T. infrared outside-in tracker. Figure 2 shows the projected image during a typical head rotation. The registration with the target hole is significantly improved when using the gyroscope (square) compared to using the outside-in tracking alone (triangle).

ACKNOWLEDGEMENTS

The authors wish to thank Björn Schwerdtfeger for providing the head-mounted laser projector. This work was supported by the Bayerische Forschungsstiftung (project TrackFrame, AZ-653-05) and the PRESENCCIA Integrated Project funded under the European Sixth Framework Program, Future and Emerging Technologies (FET) (contract no. 27731).

Figure 2: Projection while a sideways rotation is performed

REFERENCES

[1] M. Aron, G. Simon, and M.-O. Berger. Handling uncertain sensor data in vision-based camera tracking. In Proceedings of the Third International Symposium on Mixed and Augmented Reality (ISMAR'04), 2004.

[2] R. Azuma and G. Bishop. Improving static and dynamic registration in an optical see-through HMD. In SIGGRAPH '94: Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques, pages 197–204, New York, NY, USA, 1994. ACM Press.

[3] J. Newman, M. Wagner, M. Bauer, A. MacWilliams, T. Pintaric, D. Beyer, D. Pustka, F. Strasser, D. Schmalstieg, and G. Klinker. Ubiquitous tracking for augmented reality. In Proc. IEEE International Symposium on Mixed and Augmented Reality (ISMAR'04), Arlington, VA, USA, Nov. 2004.

[4] D. Pustka, M. Huber, M. Bauer, and G. Klinker. Spatial relationship patterns: Elements of reusable tracking and calibration systems. In Proc. IEEE International Symposium on Mixed and Augmented Reality (ISMAR'06), October 2006.

[5] B. Schwerdtfeger and G. Klinker. Hybrid information presentation: Combining a portable augmented reality laser projector and a conventional computer display. In Proc. Shortpapers and Posters of the 13th Eurographics Symposium on Virtual Environments, 10th Immersive Projection Technology Workshop (IPT-EGVE 2007), July 2006.

[6] R. Tsai and R. Lenz. Real time versatile robotics hand/eye calibration using 3D machine vision. In IEEE International Conference on Robotics and Automation, volume 1, pages 554–561, 1988.
