Measuring Vergence Over Stereoscopic Video with a Remote Eye Tracker

Abstract: A remote eye tracker is used to explore its utility for ocular vergence measurement. Subsequently, vergence measurements are compared in response to anaglyphic stereographic stimuli as well as in response to monoscopic stimulus presentation on a standard display. Results indicate a highly significant effect of anaglyphic stereoscopic display on ocular vergence when viewing a stereoscopic calibration video. Significant convergence measurements were obtained for stimuli fused in the anterior image plane.


Brian C. Daugherty, Andrew T. Duchowski, Donald H. House, Celambarasan Ramasamy
School of Computing & Digital Production Arts, Clemson University

CR Categories: I.3.3 [Computer Graphics]: Picture/Image Generation - Display algorithms; I.3.6 [Computer Graphics]: Methodology and Techniques - Ergonomics

Keywords: eye tracking, stereoscopic rendering

1 Introduction & Background

The angle between the visual axes is the vergence angle. When a person fixates a point at infinity, the visual axes are parallel and the vergence angle is zero. The angle increases when the eyes converge. For symmetrical convergence, the angle of horizontal vergence θ is related to the interocular distance a and the distance D of the point of fixation from a point midway between the eyes by the expression tan(θ/2) = a/(2D). Thus, the change in vergence per unit change in distance is greater at near than at far viewing distances. About 70% of a person's normal range of vergence is used within one meter of the eyes. The angle of vergence changes by about 14° when gaze is moved from infinity to the nearest distance for comfortable convergence, at about 25 cm, and by about 36° when the gaze moves to the nearest point to which the eyes can converge. About 90% of this total change occurs as the eyes converge from 1 m.

In this paper, vergence measurements are made over anaglyphic stereo imagery.
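As a quick numerical check of the expression tan(θ/2) = a/(2D), the short Python sketch below evaluates the vergence angle at a few fixation distances. The interocular distance a = 6.3 cm is an illustrative assumption, not a value taken from this paper:

```python
import math

def vergence_angle_deg(a_cm: float, D_cm: float) -> float:
    """Symmetric vergence angle (degrees) for interocular distance a
    and fixation distance D, from tan(theta/2) = a / (2 * D)."""
    return math.degrees(2.0 * math.atan(a_cm / (2.0 * D_cm)))

a = 6.3  # assumed interocular distance (cm); illustrative only
# Moving gaze from infinity (angle ~0) to 25 cm yields roughly 14 degrees,
# consistent with the figure quoted above.
print(round(vergence_angle_deg(a, 25.0), 1))   # ~14.4
print(round(vergence_angle_deg(a, 100.0), 1))  # ~3.6
print(round(vergence_angle_deg(a, 50.0), 1))   # ~7.2
```

Note how the angle at 1 m is already small compared with the near range, in line with the claim that most of the usable vergence range lies within one meter.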
Although depth perception has been studied on desktop 3D displays [Holliman et al. 2007], eye movements were not used to verify vergence. Holliman et al. conclude that depth judgment cannot always be predicted from display geometry alone. The only other similar effort we are aware of is measurement of interocular distance on a stereo display during rendering of a stereo image at five different depths [Kwon and Shul 2006]. Interocular distance was seen to range by about 10 pixels across three participants. In this paper, we report observations on how binocular angular disparity (of twelve participants) is affected when viewing an anaglyphic stereo video clip.

Since their 1838 introduction by Sir Charles Wheatstone [Lipton 1982], stereoscopic images have appeared in a variety of forms, including dichoptic stereo pairs (a different image presented to each eye), random-dot stereograms [Julesz 1964], autostereograms (e.g., the popular Magic Eye and Magic Eye II images), and anaglyphic images and movies, with the latter currently resurging in popularity in American cinema (e.g., Monsters vs. Aliens, DreamWorks Animation and Paramount Pictures). An anaglyphic stereogram is a composite image consisting of two colors and two slightly different perspectives that produces a stereoscopic image when viewed through two correspondingly colored filters. Although the elicited perception of depth appears to be effective, relatively little is known about how this effect is reflected in viewers' eye movements. Autostereograms, for example, are easily fused by some, but not by others.

When the eyes move through equal angles in opposite directions, disjunctive movement, or vergence, is produced [Howard 2002]. In horizontal vergence, each visual axis moves within a plane containing the interocular axis. When the visual axes move inwards, the eyes converge; when they move outwards, the eyes diverge. The convergent movement of the eyes (binocular convergence), i.e., their simultaneous inward rotation toward each other (divergence denotes the outward rotation), ensures that the projections of images onto the retinas of both eyes are in registration with each other, allowing the brain to fuse the images into a single percept. This fused percept provides stereoscopic vision of three-dimensional space. Normal binocular vision is primarily characterized by this type of fusional vergence of the disparate retinal images [Shakhnovich 1977]. Vergence driven by retinal blur is distinguished as accommodative vergence [Büttner-Ennever 1988].

How can vergence be measured when viewing a dichoptic computer animation presented anaglyphically? To gauge the effect of stereo display on ocular vergence, it is sufficient to measure the disparity between the left and right horizontal gaze coordinates, e.g., xr - xl, given the left and right gaze points (xl, yl) and (xr, yr) as delivered by current binocular eye trackers. Thus far, to our knowledge, the only such vergence measurements to have been carried out have been performed over random-dot stereograms [Essig et al. 2004].

The question that we are attempting to address is whether the vergence angle can be measured from gaze point data captured by a (binocular) eye tracker. Of particular interest is the measure of relative vergence, that is, the change in vergence from fixating a point P placed some distance d behind (or in front of) a point F, the point at which the visual axes converge at viewing distance D. The visual angle between P and F at the nodal point of the left eye is θl, signed positive if P is to the right of the fixation point. The same angle for the right eye is θr, signed in the same way. The binocular disparity of the images of F is zero, since each image is centered on each eye's visual axis. The angular disparity of the images of P is θl - θr. If ηF is the binocular subtense of point F and ηP is the binocular subtense of point P, then η = θl - θr = ηP - ηF. Thus, the angular disparity between the images of a pair of objects is the binocular subtense of one object minus the binocular subtense of the other (see Figure 1).

Given the binocular gaze point coordinates reported by the eye tracker, (xl, yl) and (xr, yr), an estimate of η can be derived following calculation of the distance d between F and P.

Email: {andrewd | dhouse}@cs.clemson.edu

Copyright 2010 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. ETRA 2010, Austin, TX, March 22-24, 2010. ACM 978-1-60558-994-7/10/0003 $10.00.

The distance d is obtained via triangle similarity:

    a / (D + d) = (xr - xl) / d,   so that   d = (xr - xl) D / (a - (xr - xl)).

For objects in the median plane of the head, θl = -θr, so the total disparity η is 2θl. By elementary geometry, η = ηP - ηF [...] from about 5.3 to 7.3 cm) [Smith and Atchison 1997]. Vergence is assumed to be symmetrical (although in our experiments no chin rest was used and so the head was free to rotate, violating this assumption; for a derivation of angular disparity under this condition see Howard and Rogers [2002]). Viewing distance is assumed to be D = 50 cm, the operating range of the eye tracker, although the tracker allows head movement within a 30 x 15 x 20 cm volume (see below). It is also important to note that the above derivation of angular disparity (vergence) assumes zero binocular disparity when viewing point F at the screen surface. Empirical measurements during calibration show that this assumption does not always hold, or that it may be obscured by the noise inherent in the eye tracker's position signal.

Figure 1: Binocular disparity of point P with respect to fixation point F, at viewing distance D, with (assumed) interocular distance a [Howard and Rogers 2002]. Given the binocular gaze point coordinates on the image plane, (xl, yl) and (xr, yr), the distance d between F and P is obtained via triangle similarity. Assuming symmetrical vergence and small disparities, angular disparity is derived (see text).

2 Methodology

A within-subjects experiment was conducted to test vergence measurement. One of nine video clips served as the independent variable in the analysis, a subset of a larger study. Two versions of the video clip were used to test vergence response. The only difference between the two versions was that one was rendered in a standard two-dimensional monoscopic format while the other was rendered in a red-cyan anaglyphic stereoscopic format. Ocular angular vergence response served as the dependent variable. The operational hypothesis was simply that a significant difference in vergence response would be observed between watching the monoscopic and stereoscopic versions of the video.

2.1 Apparatus

A Tobii ET-1750 video-based corneal reflection (binocular) eye tracker was used for real-time gaze coordinate measurement (and recording). The eye tracker operates at a sampling rate of 50 Hz, with accuracy typically better than 0.3° over a 20° horizontal and vertical range using the pupil/corneal reflection difference [Tobii Technology AB 2003] (in practice, measurement error ranges roughly 10 pixels). The eye tracker's 17" LCD monitor was set to 1280 x 1024 resolution and the stimulus display was maximized to cover the entire screen (save for its title bar at the top of the screen). The eye tracking server ran on a dual 2.0 GHz AMD Opteron 246 PC (2 GB RAM) running Windows XP. The client display appli[...]
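The derivation above can be turned into a small computation. The Python sketch below estimates relative vergence from one binocular gaze sample; the viewing distance D = 50 cm matches the text, while the interocular distance a = 6.3 cm and the assumed 33.7 cm visible width of a 17" 1280 x 1024 panel (used to convert pixel disparity to centimeters) are illustrative assumptions, not values given in the paper:

```python
import math

D_CM = 50.0                # viewing distance (from the text)
A_CM = 6.3                 # assumed interocular distance; illustrative only
CM_PER_PX = 33.7 / 1280.0  # assumed pixel pitch of a 17" 1280x1024 LCD

def depth_offset_cm(disparity_cm: float) -> float:
    """Distance d from fixation point F (screen plane) to fused point P,
    via triangle similarity a/(D + d) = disparity/d; assumes |disparity| < a."""
    return disparity_cm * D_CM / (A_CM - disparity_cm)

def subtense_deg(dist_cm: float) -> float:
    """Binocular subtense (degrees) of a point at distance dist_cm."""
    return math.degrees(2.0 * math.atan(A_CM / (2.0 * dist_cm)))

def angular_disparity_deg(xl_px: float, xr_px: float) -> float:
    """Angular disparity eta = eta_P - eta_F from left/right horizontal
    gaze coordinates (pixels). Positive eta indicates convergence in
    front of the screen; negative eta, relative divergence behind it."""
    disparity_cm = (xr_px - xl_px) * CM_PER_PX
    d = depth_offset_cm(disparity_cm)
    return subtense_deg(D_CM + d) - subtense_deg(D_CM)

print(angular_disparity_deg(640.0, 640.0))  # gaze fused on the screen: 0
print(angular_disparity_deg(630.0, 650.0))  # uncrossed sample: negative eta
print(angular_disparity_deg(650.0, 630.0))  # crossed sample: positive eta
```

This sketch assumes the tracker reports both eyes' gaze in screen pixels and that zero disparity corresponds to fixation at the screen surface, the same calibration assumption the text notes does not always hold in practice.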

