

Interaction Museum

What you look at is what you get: Eye movement-based object selection

Abstract

This entry shows some early eye movement-based interaction techniques. The computer identifies where on the display the user is looking and uses that information. For example, if the display shows several icons, you might request additional information about one of them. Instead of choosing it with a mouse or keyboard, the computer already knows which icon the user is looking at and displays the information immediately.

(File = eqpt_overview.jpg)

Keywords: eye movement, gaze, eye tracking, object selection, manual motor disability

Detailed description

Storyboard

(File = eqpt_overview2.jpg)

User looks at desired item



(File = eqpt_detail.jpg)

Eye tracker, located alongside the user, measures the user's line of gaze using a video camera and an infrared illuminator aimed at the eye through a mirror

(File = select_ship.jpg)

Item is selected (the color of the ship changes to white) after a dwell of approximately 200 msec.

(File = osis.conv.png)

Information about the selected item is displayed on the left of the screen
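
The storyboard above amounts to a small selection loop: sample the tracker's gaze point, hit-test it against the displayed items, and select an item once the gaze has rested on it for roughly 200 msec. A minimal sketch of that loop follows; the tracker.gaze_point() call, the Item class, and the on_select callback are illustrative assumptions, not the original system's API.

    import time

    DWELL_TIME = 0.2  # roughly the 200 msec dwell described above (assumed constant)

    class Item:
        """A selectable on-screen object with a rectangular bounding box."""
        def __init__(self, name, x, y, w, h, info):
            self.name, self.info = name, info
            self.x, self.y, self.w, self.h = x, y, w, h

        def contains(self, gx, gy):
            return (self.x <= gx < self.x + self.w and
                    self.y <= gy < self.y + self.h)

    def dwell_select(tracker, items, on_select):
        """Select whichever item the gaze rests on for DWELL_TIME seconds."""
        candidate, since = None, 0.0
        while True:
            gx, gy = tracker.gaze_point()  # hypothetical: gaze in screen coordinates
            hit = next((it for it in items if it.contains(gx, gy)), None)
            now = time.monotonic()
            if hit is not candidate:       # gaze moved to a different item (or none)
                candidate, since = hit, now
            elif hit is not None and now - since >= DWELL_TIME:
                on_select(hit)             # e.g. turn the ship white, show its info
                candidate = None           # require a fresh dwell before reselecting

A real implementation would first group the noisy raw gaze samples into fixations rather than dwell-test individual samples, since even a steady gaze jitters by a fraction of a degree.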


Multimedia

Selecting a ship using eye movements: the selected ship turns white, and information about it is shown on the left. Image file = select_ship.jpg Movie file = select_ship.wmv (Click on the image to show the movie)

Using eye-controlled pull-down menus. Image file = menus.jpg Movie file = menus.wmv
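
The menu clip applies the same dwell principle to a pull-down menu: a short dwell on the menu title pulls it down, a dwell on an entry chooses it, and looking away closes it. The state machine below is a hypothetical sketch of that behavior; the Box class, the dwell constants, and the (box, callback) entry format are assumptions, not details taken from the system shown in the movie.

    import time

    class Box:
        """Axis-aligned rectangle used for gaze hit-testing."""
        def __init__(self, x, y, w, h):
            self.x, self.y, self.w, self.h = x, y, w, h

        def contains(self, gx, gy):
            return (self.x <= gx < self.x + self.w and
                    self.y <= gy < self.y + self.h)

    class GazeMenu:
        OPEN_DWELL = 0.3   # assumed dwell before the menu pulls down
        PICK_DWELL = 0.3   # assumed dwell before an entry is chosen

        def __init__(self, title_box, entries):
            self.title_box = title_box   # Box over the menu's title
            self.entries = entries       # list of (Box, callback) pairs
            self.open = False
            self._target, self._since = None, 0.0

        def update(self, gx, gy, now=None):
            """Feed one gaze sample; open the menu or fire an entry on dwell."""
            now = time.monotonic() if now is None else now
            if self.title_box.contains(gx, gy):
                target = 'title'
            elif self.open:
                target = next((cb for box, cb in self.entries
                               if box.contains(gx, gy)), None)
            else:
                target = None
            if target != self._target:        # gaze moved: restart the dwell clock
                if target is None and self.open:
                    self.open = False         # gaze left the menu entirely: close it
                self._target, self._since = target, now
            elif (target == 'title' and not self.open
                  and now - self._since >= self.OPEN_DWELL):
                self.open = True
            elif callable(target) and now - self._since >= self.PICK_DWELL:
                target()                      # fire the chosen entry's action
                self.open = False
                self._target = None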


Setup for the experimental evaluation, comparison condition with mouse selection. Image file = expt_mouse.jpg Movie file = expt_mouse.wmv

Setup for the experimental evaluation, eye selection condition. Image file = expt_eye.jpg Movie file = expt_eye.wmv


Eye tracking equipment. Image file = eqpt_overview.jpg Movie file = eqpt_overview.wmv

Detail view of eye tracker. Image file = eqpt_detail.jpg Movie file = eqpt_detail.wmv
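
Remote trackers of this kind conventionally use the pupil/corneal-reflection method: the infrared illuminator produces a glint on the cornea, and the offset between the glint and the pupil center, after a brief calibration against known on-screen targets, indexes the gaze point. The following is a rough illustration of that calibration step, assuming a simple least-squares affine fit; the actual tracker's calibration procedure may differ.

    import numpy as np

    def fit_gaze_mapping(pupil_glint_vecs, screen_points):
        """Least-squares affine map from pupil-glint offsets to screen coordinates.

        pupil_glint_vecs: (N, 2) pupil-center minus corneal-glint offsets
        screen_points:    (N, 2) known calibration targets on the screen
        """
        v = np.asarray(pupil_glint_vecs, dtype=float)
        A = np.column_stack([v, np.ones(len(v))])   # rows of [dx, dy, 1]
        coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                     rcond=None)
        return coeffs  # shape (3, 2): maps [dx, dy, 1] -> [screen_x, screen_y]

    def gaze_to_screen(coeffs, dx, dy):
        """Convert one pupil-glint offset to an estimated gaze point on screen."""
        return np.array([dx, dy, 1.0]) @ coeffs

Calibration is typically done on a small grid of targets; a higher-order polynomial fit can improve accuracy near the screen edges.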


Alternative eye tracker, using head-mounted optics. Image file = head_mounted.jpg Movie file = head_mounted.wmv

Usage

A user interface based on eye movement inputs has the potential for faster and more effortless interaction than current interfaces, because people can move their eyes extremely rapidly and with little conscious effort. However, people are not accustomed to operating devices in the world simply by moving their eyes. At first, it is empowering simply to look at what you want and have it happen, rather than having to look at it and then point and click it with the mouse. Before long, though, it becomes like the Midas Touch: everywhere you look, another command is activated; you cannot look anywhere without issuing a command. The challenge in building a useful eye movement interface is to avoid this Midas Touch problem.

Carefully designed interaction techniques are thus necessary to ensure that interaction is not only fast but also uses eye input in a natural and unobtrusive way. We try to think of eye position more as a piece of information available to a user-computer dialogue involving a variety of input devices than as the intentional actuation of the principal input device. Some interactions, such as displaying additional details, use a 150-250 millisecond dwell time before an object is selected. For irreversible actions such as deleting a file, we prefer a combination of gaze to indicate the desired object and a button press to trigger the action.
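
This policy splits cleanly into two cases, sketched below under assumed names: reversible actions (such as displaying details) fire on dwell alone, while irreversible ones wait for an explicit button press, with gaze serving only to pick the object.

    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        reversible: bool   # showing details is reversible; deleting a file is not

    DWELL_MS = 200  # assumed value within the 150-250 msec range quoted above

    def should_fire(action, gazed_item, dwell_ms, button_pressed):
        """Gate actions so that merely looking never triggers anything irreversible."""
        if gazed_item is None:
            return False                  # the user is not looking at any object
        if action.reversible:
            return dwell_ms >= DWELL_MS   # dwell alone suffices; easy to undo
        return button_pressed             # gaze picks the object, the button commits

    # should_fire(Action("show details", True), "ship", 220, False)   -> True
    # should_fire(Action("delete file", False), "file", 1000, False)  -> False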


Annotated bibliography

Papers about this research:

R.J.K. Jacob, "What You Look At is What You Get: Eye Movement-Based Interaction Techniques," Proc. ACM CHI'90 Human Factors in Computing Systems Conference, pp. 11-18, Addison-Wesley/ACM Press (1990).

R.J.K. Jacob, "The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get," ACM Transactions on Information Systems, Vol. 9(3), pp. 152-169 (April 1991). Also reprinted with commentary in Readings in Intelligent User Interfaces, ed. M.T. Maybury and W. Wahlster, Morgan Kaufmann, San Francisco, 1998, pp. 65-83.

R.J.K. Jacob, "Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces," pp. 151-190 in Advances in Human-Computer Interaction, Vol. 4, ed. H.R. Hartson and D. Hix, Ablex Publishing Co., Norwood, N.J. (1993) [http://www.cs.tufts.edu/~jacob/papers/hartson.pdf].

R.J.K. Jacob, "What You Look At is What You Get: Using Eye Movements as Computer Input," Proc. Virtual Reality Systems'93 Conference, pp. 164-166, SIG Advanced Applications, New York, N.Y. (1993).

R.J.K. Jacob, "Eye-gaze Computer Interfaces: What You Look At is What You Get," IEEE Computer, Vol. 26(7), pp. 65-67 (July 1993) [http://www.cs.tufts.edu/~jacob/papers/hot.txt].

R.J.K. Jacob, "Eye Tracking in Advanced Interface Design," pp. 258-288 in Virtual Environments and Advanced Interface Design, ed. W. Barfield and T.A. Furness, Oxford University Press, New York (1995) [http://www.cs.tufts.edu/~jacob/papers/barfield.pdf].

V. Tanriverdi and R.J.K. Jacob, "Interacting with Eye Movements in Virtual Environments," Proc. ACM CHI 2000 Human Factors in Computing Systems Conference, pp. 265-272, Addison-Wesley/ACM Press (2000) [http://www.cs.tufts.edu/~jacob/papers/chi00.tanriverdi.pdf].

L.E. Sibert and R.J.K. Jacob, "Evaluation of Eye Gaze Interaction," Proc. ACM CHI 2000 Human Factors in Computing Systems Conference, pp. 281-288, Addison-Wesley/ACM Press (2000) [http://www.cs.tufts.edu/~jacob/papers/chi00.sibert.pdf].

L.E. Sibert, J.N. Templeman, and R.J.K. Jacob, "Evaluation and Analysis of Eye Gaze Interaction," NRL Report NRL/FR/5513--01-9990, Naval Research Laboratory, Washington, D.C. (2001) [http://www.cs.tufts.edu/~jacob/papers/nrlreport.pdf].

Other work on eye movement-based interaction:

R.A. Bolt, "Eyes at the Interface," Proc. ACM Human Factors in Computer Systems Conference, pp. 360-362 (1982).

T.E. Hutchinson, K.P. White, W.N. Martin, K.C. Reichert, and L.A. Frey, "Human-Computer Interaction Using Eye-Gaze Input," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 19(6), pp. 1527-1534 (1989).

J.L. Levine, "Performance of an Eyetracker for Office Use," Comput. Biol. Med., Vol. 14(1), pp. 77-89 (1984).

I. Starker and R.A. Bolt, "A Gaze-Responsive Self-Disclosing Display," Proc. ACM CHI'90 Human Factors in Computing Systems Conference, pp. 3-9, Addison-Wesley/ACM Press (1990).

R. Vertegaal, "The GAZE Groupware System: Mediating Joint Attention in Multiparty Communication and Collaboration," Proc. ACM CHI'99 Human Factors in Computing Systems Conference, pp. 294-301, Addison-Wesley/ACM Press (1999).

C. Ware and H.T. Mikaelian, "An Evaluation of an Eye Tracker as a Device for Computer Input," Proc. ACM CHI+GI'87 Human Factors in Computing Systems Conference, pp. 183-188 (1987).

S. Zhai, C. Morimoto, and S. Ihde, "Manual and Gaze Input Cascaded (MAGIC) Pointing," Proc. ACM CHI'99 Human Factors in Computing Systems Conference, pp. 246-253, Addison-Wesley/ACM Press (1999).


Basic research on eye movements in psychology:

R.A. Abrams, D.E. Meyer, and S. Kornblum, "Speed and accuracy of saccadic eye movements: Characteristics of impulse variability in the oculomotor system," Journal of Experimental Psychology: Human Perception and Performance, Vol. 15(3), pp. 529-543 (1989).

Survey paper on eye movements in HCI:

R.J.K. Jacob and K.S. Karn, "Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises (Section Commentary)," pp. 573-605 in The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research, ed. J. Hyona, R. Radach, and H. Deubel, Elsevier Science, Amsterdam (2003) [http://www.cs.tufts.edu/~jacob/papers/ecem.pdf].

See also the Eye Tracking Research and Applications Symposium, held biennially (2000, 2002, 2004) [http://www.e-t-r-a.org/].

History

Products

Intellectual property

The work described here is published in the open literature, as cited in the annotated bibliography above.

Contributing authors

Interaction Museum entry:
Robert J.K. Jacob, Tufts University
Yves Guiard, CNRS, Universite de la Mediterranee

Research:
Robert J.K. Jacob, Naval Research Laboratory (currently at Tufts University)
Manuel A. Perez-Quinones, Naval Research Laboratory (currently at Virginia Tech)
Linda E. Sibert, Naval Research Laboratory
Vildan Tanriverdi, Tufts University (currently at IBM)
James N. Templeman, Naval Research Laboratory

Video:
John L. Sibert, George Washington University