Using Gaze Input to Navigate a Virtual Geospatial Environment

Mark Hazlewood. Committee: Anne Haake (chair), Reynold Bailey.


DESCRIPTION

This is a presentation for the defense of my capstone project for a Master of Science in Human-Computer Interaction at Rochester Institute of Technology. For this project I created an application for navigating a geospatial display using gaze input on a 2D user interface overlay.

TRANSCRIPT

1. Using Gaze Input to Navigate a Virtual Geospatial Environment. Mark Hazlewood. Committee: Anne Haake (chair), Reynold Bailey.

2. Defense Outline
- Capstone project details
- Prior work
- Software and user interface design
- User testing details and results
- Conclusions and future work

3. Project Details: Capstone project objectives and timeline

4. Objectives
- Primary objective: Develop a software application allowing users to navigate in a virtual geospatial environment using their gaze as input.
- Secondary objectives: Attempt using the Kinect sensor as a remote eye tracker; conduct preliminary user evaluations of the developed application.

5. Planned Timeline [Gantt chart spanning weeks 1-15 across the activities Development, Integration + Test, Participant Recruitment, User Testing, and Analysis/documentation/write-up]

6. Planned Timeline, with phases overlaid on the same chart:
- Phase 1: Exploratory development with Kinect
- Phase 2: Selection of an eye tracking system
- Phase 3: Development of the geospatial application
- Phase 4: User testing

7. Prior Work: Primary references and inspiration

8. Stellmach et al., "Designing Gaze-based User Interfaces for Steering in Virtual Environments" (ETRA '12)
- Evaluated several techniques in gaze-based navigation
- Proposed a taxonomy of gaze-based UI activation methods for navigation
- Environment was a 3D virtual maze
- My project built on some of the general goals of Stellmach's research, but applied specifically to a geospatial context

9. Stellmach's proposed taxonomy (Input Technique x Activation Speed):
- Discrete x Constant (DC): Input activated through fixed UI regions at a constant view-change rate. Once activated, a particular movement action remains active until toggled.
- Continuous x Gradient-based (CG): Input activated through fixed UI regions at a variable view-change rate. Movement actions are only active while gaze is within the UI element.

10. Stellmach et al. [figure]

11. Adams et al., "The Inspection of Very Large Images by Eye-gaze Control" (AVI '08)
- Proposed multiple methods for gaze-based zooming
- Maintained a fixed method for gaze-based panning
- Used a geospatial application as a test bed, but the research focus was on image viewing
- My project referenced Adams' work for the edge-of-screen panning UI

12. Adams' zooming techniques:
- Stare-to-Zoom (STZ): Sustained gaze in the central region of the display causes the image to zoom inwards. Requires extended stationary gaze (> 420 ms). Zooming continues while gaze remains stationary.
- Head-to-Zoom (HTZ): Zooming is initiated by movements of the user's head (calculated from eye-to-screen distance). Leaning forward a small amount (~40 mm) initiates zooming in; leaning backward the same amount zooms out.
- Dual-to-Zoom (DTZ): Zooming is initiated using the mouse. The left mouse button zooms in, the right button zooms out. Panning is still done through gaze.
- Mouse-to-Zoom (MTZ): Used as a baseline for comparison with the other techniques. Both zoom and pan are accomplished using the mouse.
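As a concrete illustration of the Stare-to-Zoom rule above, a dwell trigger of this kind might look like the following sketch. This is not Adams' implementation: the class name, the update method, and the 30 px dispersion tolerance are assumptions; only the 420 ms threshold comes from the slide.

```java
// Sketch of a Stare-to-Zoom style dwell trigger (after Adams et al.).
// Hypothetical names; the 420 ms threshold is from the slide, the pixel
// dispersion tolerance for "stationary" gaze is an assumption.
public class StareToZoomDetector {
    private static final long DWELL_THRESHOLD_MS = 420; // from Adams et al.
    private static final double DISPERSION_PX = 30.0;   // assumed radius

    private double anchorX, anchorY; // where the current dwell started
    private long dwellStartMs = -1;  // -1 means no dwell in progress

    /** Feed one gaze sample; returns true while zooming should be active. */
    public boolean update(double x, double y, long timestampMs) {
        if (dwellStartMs < 0 || Math.hypot(x - anchorX, y - anchorY) > DISPERSION_PX) {
            // Gaze moved too far: restart the dwell at the new location.
            anchorX = x;
            anchorY = y;
            dwellStartMs = timestampMs;
            return false;
        }
        // Gaze is still near the anchor: zoom once the dwell exceeds 420 ms,
        // and keep zooming for as long as it stays there.
        return timestampMs - dwellStartMs >= DWELL_THRESHOLD_MS;
    }
}
```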
13. Software Design: Design of the geospatial user interface and software

14. User Interface Design
- Hypothesis: When navigating large geospatial areas, the current zoom level can be used as an indicator of the level of detail a user wishes to view.
- Zoomed out: assume the user is interested in navigating over large geographic areas, from one broad region to another.
- Zoomed in: assume the user is interested in fine searching among smaller geographic landmarks.
- Design goal: Provide an adaptive UI that supports multiple levels of user interest.

15. User Interface Design [screenshot]

16. User Interface Design: Zoom In, Pan, Zoom Out [annotated UI regions]

17.-18. User Interface Design [screenshots]

19. User Interface Design: Edge Pan, Zoom In, Zoom Out [annotated UI regions]

20. User Interface Design [screenshot]

21. User Interface Design: Center Pan, Zoom In, Zoom Out [annotated UI regions]

22.-23. User Interface Design [screenshots]

24. User Interface Design
- Gaze cursor: Displays the current (filtered) gaze point to the user. Helpful for maintaining orientation when navigating the UI.

25. Software Design Details

26. Gaze Point Filter
- Problem: Even when calibrated, the raw output from the eye tracker is noisy, and brief (but valid) fixations contribute to the noisiness. This makes UI activation difficult, and the gaze cursor is very distracting.

27. Gaze Point Filter: Unfiltered output [figure]

28. Gaze Point Filter
- Solution: Filter the tracker's output prior to processing by the client application.

29. Gaze Point Filter: Moving Average Filter (see the sketch after slide 33)
- As 2D points are received from the tracker, samples are added to a queue (window).
- The average of all samples currently in the window is returned.
- Window size is configurable; the optimum was found to be 15-20 samples.
- After initial charging, the resulting output is greatly improved.

30.-31. Moving Average Filter: charging [figures]

32. Moving Average Filter output at window size = 10 [figure]

33. Moving Average Filter output at window size = 25 [figure]
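A minimal sketch of the moving-average filter described on slides 29-33, assuming samples arrive as 2D screen coordinates. The class and method names are illustrative, not the project's actual API; the 15-20 sample window size is from the slides.

```java
import java.awt.geom.Point2D;
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the moving-average gaze filter described above: samples enter a
// fixed-size window (queue); the filtered point is the mean over the window.
public class MovingAverageGazeFilter {
    private final int windowSize;
    private final Deque<Point2D> window = new ArrayDeque<>();
    private double sumX, sumY; // running sums avoid re-summing the window

    public MovingAverageGazeFilter(int windowSize) { // e.g. 15-20 per slides
        this.windowSize = windowSize;
    }

    /** Add a raw tracker sample and return the current filtered gaze point. */
    public Point2D filter(double rawX, double rawY) {
        window.addLast(new Point2D.Double(rawX, rawY));
        sumX += rawX;
        sumY += rawY;
        if (window.size() > windowSize) { // evict the oldest sample
            Point2D oldest = window.removeFirst();
            sumX -= oldest.getX();
            sumY -= oldest.getY();
        }
        // During the initial "charging" period the window is only partly
        // full, so early output averages fewer than windowSize samples.
        return new Point2D.Double(sumX / window.size(), sumY / window.size());
    }
}
```

Keeping running sums makes each update O(1) regardless of window size, which matters when the tracker streams samples at a high rate.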
34. Detailed Design: components EyeTrackerAPI, WorldWindGazeInput, Swing, WorldWind [architecture diagram]

35.-39. Detailed Design - EyeTrackerAPI [class diagrams]

40.-43. Detailed Design - WorldWindGazeInput [class diagrams]
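The class diagrams referenced above are not preserved in this transcript, but the data flow they depict (raw tracker samples filtered, then dispatched to the map UI) could be wired roughly as below. Only the component names EyeTrackerAPI and WorldWindGazeInput come from slide 34; the GazeListener interface and every method signature here are hypothetical, and the filter is the sketch from the previous section.

```java
import java.awt.geom.Point2D;

// Illustrative wiring only: EyeTrackerAPI and WorldWindGazeInput are
// component names from slide 34; GazeListener and all signatures are
// assumptions, not the project's actual API.
interface GazeListener {
    void gazeMoved(Point2D filteredPoint); // called once per filtered sample
}

class EyeTrackerAPI {
    private final MovingAverageGazeFilter filter = new MovingAverageGazeFilter(20);
    private GazeListener listener;

    void setGazeListener(GazeListener l) { listener = l; }

    /** Invoked for each raw sample from the tracker's native stream. */
    void onRawSample(double x, double y) {
        Point2D smoothed = filter.filter(x, y);
        if (listener != null) listener.gazeMoved(smoothed);
    }
}

class WorldWindGazeInput implements GazeListener {
    @Override
    public void gazeMoved(Point2D p) {
        // Hit-test p against the pan/zoom UI regions and drive the
        // WorldWind view accordingly (details were in the class diagrams).
    }
}
```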
44. User Testing: Preliminary user evaluations of the gaze input application

45. User Testing Goals: Evaluate the effectiveness of the proposed designs
- Quantitative: Are participants able to use the UI? How effective are they at navigating to geographic regions?
- Qualitative: How natural or intuitive is the experience? Do participants feel like the system responds to their intent?

46. Participants & Recruiting
- Planned goal was 5-10 test participants
- Recruited via online posts (graduate forum) and email solicitation
- Prospective participants completed an online screener
- Ended up with eight (8) participants

47.-50. Participants & Recruiting [charts]

51. Test Procedures
1. Background questionnaire
2. Introduction to the eye tracking system, calibration procedure, and tasks
3. Initial calibration: 9-point automatic, using iViewX Experiment Center
4. Introduction to the geospatial application and user interface
5. Navigation to a practice point (with moderator support)
6. Sequential navigation to test regions (A, B, C, D): pan to the general area of the region; zoom to the region; activate sub-points in the region; zoom out to the furthest level

52. Test Procedures: Test regions [map]

53. Test Procedures: Test regions (sub-points) [map]

54. Test Procedures: Initial calibration targeted < 1° angular error (X and Y)

55. Test Procedures: Task ordering (region sequence per participant)
- Participant 1: A B C D
- Participant 2: B C D A
- Participant 3: C D A B
- Participant 4: D A B C
- Participant 5: A B C D
- Participant 6: B C D A
- Participant 7: C D A B
- Participant 8: D A B C

56. Test Results - Quantitative: average task time by region
- Region A: 138.41 s
- Region B: 141.43 s
- Region C: 153.21 s
- Region D: 151.13 s
- Overall average: 146.05 s

57. Test Results - Quantitative: average task time by task number
- Task 1: 179.48 s
- Task 2: 146.93 s
- Task 3: 135.52 s
- Task 4: 122.23 s
- Overall average: 146.05 s

58. Test Results - Qualitative
- Participants were given two surveys after task completion: a qualitative gaze input survey and the System Usability Scale (SUS)
- They were then debriefed with directed questions from the moderator

59. Test Results: Gaze input survey results [chart]

60. Test Results: SUS results [chart]

61. Test Results: SUS results (the scoring formula is sketched at the end of this transcript)
- Participant 1: 62.5
- Participant 2: 67.5
- Participant 3: 92.5
- Participant 4: 65.0
- Participant 5: 62.5
- Participant 6: 77.5
- Participant 7: 75.0
- Participant 8: 55.0
- Average: 69.7

62. Conclusions and Future Work

63. User Testing Observations
- Response to the adaptive pan UI: initially somewhat disruptive.
- Edge pan provided a larger target surface; preferred when calibration had large angular error.
- Central pan provided finer control and a better view of the map; generally preferred, except when calibration made activation difficult.
- Expectation of dwell-based operation: before initial exposure to the UI, participants expected a dwell-based solution, with the map panning/zooming to where they were looking.

64. User Testing Observations: Opposite Pan Problem
- Many users had a tendency to pan in the exact opposite direction.
- The error was relatively frequent and consistent across participants.
- Debriefing revealed an opposite expectation of pan behavior.

65. Opposite Pan Problem [figure: Pan Target]

66. Opposite Pan Problem [figure: Observed Error, Correct Pan, Pan Target]

67. Ideas for Future Work
- More expansive and rigorous user testing: there was large variation in the qualitative survey responses, and a larger sample size could yield more conclusive, statistically significant results.
- Comparative study of the effectiveness of various design alternatives: fixed vs. adaptive pan UI; dwell-based vs. UI-based activation.
- Implementation of Adams' zoom techniques with the adaptive pan UI.
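For reference, the SUS scores in the table on slide 61 follow the standard System Usability Scale computation: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. The participants' raw item responses are not in the source; this sketch only documents the standard formula.

```java
// Standard SUS scoring: ten 1-5 Likert responses -> a 0-100 score.
// The per-participant raw responses are not in the source; this only
// shows the computation behind scores such as 62.5 or 92.5.
public final class SusScore {
    /** responses must hold exactly ten values in [1, 5], item 1 first. */
    public static double compute(int[] responses) {
        if (responses.length != 10) {
            throw new IllegalArgumentException("SUS has exactly 10 items");
        }
        int sum = 0;
        for (int i = 0; i < 10; i++) {
            // Items 1,3,5,7,9 (even index) are positively worded; items
            // 2,4,6,8,10 are negatively worded and reverse-scored.
            sum += (i % 2 == 0) ? responses[i] - 1 : 5 - responses[i];
        }
        return sum * 2.5;
    }
}
```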