
Page 1

Subjective Evaluation of a Semi-Automatic Optical See-Through Head-Mounted Display Calibration Technique

Kenneth Moser, Mississippi State University

Yuta Itoh, Technical University of Munich

Kohei Oshima, Nara Institute of Science & Technology

J. Edward Swan II, Mississippi State University

Gudrun Klinker, Technical University of Munich

Christian Sandor, Nara Institute of Science & Technology

March 26, 2015

IEEE Virtual Reality 2015, Arles, France

Page 2

Page 3

OST-HMD Calibration

[Figure: pin-hole camera model of an optical see-through HMD: image plane, pin-hole camera, virtual camera frustum, far clipping plane, wearer's eye, and HMD optical combiner; the OST HMD view with imagery is shown under improper vs. proper registration]

Page 4

OST-HMD Calibration

[Video taken through a camera set within the display: less accurate vs. more accurate registration]

Page 5

Calibration Methods

Single Point Active Alignment Method (SPAAM) (Tuceryan & Navab, 2000)

[Figure: the user repeatedly aligns a screen pixel (x, y) with a single world point; each alignment relates the world frame, the tracked head pose, the world point, and the head-to-point transform t_H-P to that screen pixel]
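Conceptually, each SPAAM alignment contributes one 2D-3D correspondence, and the 3x4 projection is then estimated linearly. A minimal sketch of that estimation step (standard DLT in Python/NumPy, not the authors' implementation; variable names are illustrative):

```python
import numpy as np

def spaam_projection(world_pts, screen_pts):
    """Estimate a 3x4 projection matrix from screen-world alignments (DLT).

    world_pts:  (N, 3) world-point positions at each alignment, N >= 6
    screen_pts: (N, 2) pixel coordinates the user aligned with the point
    """
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, screen_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The smallest right singular vector minimizes ||A p|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)  # Hartley-style coordinate normalization omitted
```

With the 20 alignments collected over 1.5 m - 3.0 m, this gives the projection that the Degraded SPAAM and Recycled INDICA conditions later reuse.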

Page 6

Calibration Methods

Page 7

Calibration Methods

Interaction Free Display Calibration (INDICA) (Itoh & Klinker, 2014)

Recycled INDICA: updates the calibration matrix with the eye location.

[Figure: eye-tracking camera, world frame, and the wearer's eye, related by the world-to-eye translation t_WE]

Page 8

Calibration Methods

Interaction Free Display Calibration (INDICA) (Itoh & Klinker, 2014)

Eye center locations determined through limbus detection (Swirski, 2012; Nitschke, 2013)

Page 9

Calibration Methods Evaluated

• SPAAM: 20 screen-world alignments, taken over 1.5 m – 3.0 m distance to the world point

• Degraded SPAAM: reuse of the SPAAM result after the HMD is removed and replaced

• Recycled INDICA: reuse of the intrinsic values from the SPAAM calibration, combined with an updated eye position for the final result

Page 10

Subjective Evaluation Metrics

Perceived location error (virtual objects shown in red)

Page 11

Subjective Evaluation Metrics

Quality of registration with the perceived location, rated on a discrete scale (1 – 5 for one task, 1 – 4 for the other); a reference diagram was provided to the subject before the start of each trial set.

Page 12

Quantitative Evaluation Metrics

Variance of SPAAM alignment point reprojection

Reprojection: transformation of the 3D world points back into screen points using the calibration's resulting 3x4 projection matrix, $(x, y, w)^{\top} = P\,(X, Y, Z, 1)^{\top}$, dehomogenized to the reprojected screen point $(x/w,\ y/w)$ in horizontal and vertical screen space.
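Read literally, the metric can be computed as below; this is a small sketch (not the study code) that reprojects the SPAAM world points through the recovered 3x4 matrix and reports the variance of the residuals along each screen axis:

```python
import numpy as np

def reprojection_variance(P, world_pts, screen_pts):
    """Per-axis variance of the reprojection residuals.

    P:          (3, 4) calibration projection matrix
    world_pts:  (N, 3) world points used for the SPAAM alignments
    screen_pts: (N, 2) pixel coordinates the user aligned
    """
    Xh = np.hstack([world_pts, np.ones((len(world_pts), 1))])  # homogeneous coords
    proj = (P @ Xh.T).T
    reproj = proj[:, :2] / proj[:, 2:3]   # dehomogenize to pixel coordinates
    residuals = reproj - screen_pts
    return residuals.var(axis=0)          # (horizontal, vertical) variance in px^2
```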

Page 13

Quantitative Evaluation Metrics

Variance in eye location estimates

[Figure: wearer's eye and the eye-tracking camera, with offsets ΔX and ΔZ between eye location estimates]

Page 14

System Hardware

NVIS ST50 binocular display, used monocularly (left eye; right eye piece covered)

1280 x 1024 per eye, HFOV 40° / VFOV 32°

Page 15

System Hardware

Logitech QuickCam Pro (USB 2.0 interface, auto-focus disabled)

World tracking (head): 640 x 360 at 30 fps

Eye localization (left eye): 1280 x 720 still images

Page 16

Pillar Evaluation Tasks

4 x 4 grid of real pillars with 4 cm spacing (X/Z plane); real pillar heights 13.5 – 19.5 cm, virtual pillar height 15.5 cm

Subjects verbally state the location and registration quality.

Page 17

Cube Evaluation Tasks

Horizontal and vertical 20 x 20 grids of squares (X and Z/Y axes); virtual cube 2 cm x 2 cm x 2 cm

Subjects verbally state the location and registration quality.

Page 18

System Hardware

Tracking performed by Ubitrack (Huber et al., 2007)

Within-Subjects Design

13 subjects (6 male / 7 female), 22 – 26 years of age, no prior experience with HMDs

Page 19

Quantitative Result – Eye Location Estimation

SPAAM extrinsic parameters derived through QR decomposition
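For reference, the usual way to pull an eye (camera) position out of a SPAAM-style projection is to factor its left 3x3 block; the slide calls this QR decomposition, and it is commonly implemented as an RQ factorization. A hedged sketch, not necessarily the exact routine used in the study:

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    """Split a 3x4 projection P ~ K [R | t] into intrinsics K, rotation R,
    translation t, and the camera (eye) center in world coordinates."""
    M = P[:, :3]
    K, R = rq(M)                          # RQ factorization of the 3x3 block
    S = np.diag(np.sign(np.diag(K)))      # force a positive diagonal on K
    K, R = K @ S, S @ R
    t = np.linalg.solve(K, P[:, 3])
    if np.linalg.det(R) < 0:              # P is defined only up to scale,
        R, t = -R, -t                     # so flipping the sign is harmless
    eye_center = -R.T @ t                 # eye position in world coordinates
    return K / K[2, 2], R, t, eye_center
```

Repeating this for each calibration yields the eye-position estimates whose variance the slides compare against the eye-tracker-based INDICA estimates.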

Page 20

Quantitative Result – Reprojection Variance

[Chart: reprojection variance in horizontal screen space]

Page 21

Quantitative Result – Reprojection Variance

[Chart: reprojection variance in vertical screen space]

Page 22

Subjective Result – Location Error

Difference in cm between perceived and actual location, reported per axis (X, Y, Z); a small worked sketch of the convention follows below.

0: No error (perfect registration)

+X: Virtual perceived to the right

-X: Virtual perceived to the left

+Y: Virtual perceived above

-Y: Virtual perceived below

+Z: Virtual perceived further

-Z: Virtual perceived closer
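The sketch below applies that convention to made-up numbers (illustrative only; the positions are not from the study):

```python
import numpy as np

# Assumed example positions in cm, in the task axes X (right), Y (up), Z (away).
actual    = np.array([8.0, 15.5, 12.0])   # intended virtual location
perceived = np.array([7.0, 16.5, 16.0])   # location reported by the subject

error = perceived - actual
# [-1.0, +1.0, +4.0]: 1 cm to the left, 1 cm above, 4 cm further away.
print(dict(zip("XYZ", error)))
```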

Page 23

Subjective Result: Location Error – Pillars

Difference between perceived & actual location

Group A: Significance in Z, F(2, 24) = 14.011, p < .001

Group B: No significance*

* p ≤ 0.05, Ryan REGWQ post-hoc homogeneous subset test

Page 24: Subjective Evaluation of a Semi-Automatic Optical See- Through …campar.in.tum.de/pub/itoh2015vr3/itoh2015vr3.slides.pdf · 2015-03-30 · Subjective Evaluation of a Semi-Automatic

Group B

No Significance*

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test**Mauchly’s test indicated non-sphericity, p value adjusted by Huynh – Feldt ε

Group A

Significance in YF(2, 24) = 10.96

p < .0016 ε = .75**

Difference Between Perceived & Actual Location

IEEE Virtual Reality 2015Arles, France

Subjective Result: Location Error – Cubes Vertical Grid
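For readers who want to reproduce this kind of analysis, a sketch of a one-way repeated-measures ANOVA over the three calibration methods using statsmodels; the data here are random placeholders, and the Huynh–Feldt correction mentioned above would be applied on top of this (AnovaRM does not apply it automatically):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(13), 3)
methods = np.tile(["SPAAM", "DSPAAM", "RINDICA"], 13)
# Placeholder values standing in for per-subject mean signed error (cm).
error = rng.normal(loc=[4.0, 4.0, 1.0] * 13, scale=1.0)

data = pd.DataFrame({"subject": subjects, "method": methods, "error": error})

# One-way repeated-measures ANOVA: error ~ method, within subjects.
print(AnovaRM(data, depvar="error", subject="subject", within=["method"]).fit())
```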

Page 25

Subjective Result: Location Error – Cubes Horizontal Grid

Difference between perceived & actual location

Group A: Significance in Z, F(2, 24) = 7.37, p < .003

Group B: No significance*

* p ≤ 0.05, Ryan REGWQ post-hoc homogeneous subset test

Page 26

Subjective Result: Location Error

Quantitative measures are not a predictor of performance

No difference between SPAAM and Degraded SPAAM

No difference among the algorithms in X

Highest overall error in Z

INDICA significantly better in Y/Z

Page 27

Subjective Result: Registration Quality – Pillars

Registration quality with the chosen location

Group A: No significance*

Group B: Significance in quality
  ANOVA: F(2, 24) = 5.03, p < .015
  Friedman: χ²(2) = 5.45, p < .066
  Kruskal-Wallis: χ²(2) = 18.92, p < .001

* p ≤ 0.05, Ryan REGWQ post-hoc homogeneous subset test

Page 28

Subjective Result: Registration Quality – Cubes Vertical

Registration quality with the chosen location

Group A: No significance*
  Friedman: χ²(2) = .15
  Kruskal-Wallis: χ²(2) = .98

* p ≤ 0.05, Ryan REGWQ post-hoc homogeneous subset test

Page 29

Subjective Result: Registration Quality – Cubes Horizontal

Registration quality with the chosen location

Group A: Significance in quality
  ANOVA: F(2, 24) = 6.65, p < .013, ε = .71**
  Friedman: χ²(2) = 13.06, p < .0015
  Kruskal-Wallis: χ²(2) = 21.21, p < .001

Group B: No significance*

* p ≤ 0.05, Ryan REGWQ post-hoc homogeneous subset test
** Mauchly's test indicated non-sphericity; p value adjusted by Huynh–Feldt ε

Page 30

Subjective Result: Registration Quality

Registration quality with the chosen location

Quality values match the performance measures

INDICA quality is equal to or better than SPAAM

INDICA quality significantly better in Z

Page 31

Results Discussion

Why the apparent disagreement between quantitative & subjective results?

Why no significant performance change between SPAAM and Degraded SPAAM?

Poor eye localization for INDICA?

Discussion points:

Removal/replacement of the HMD between conditions

Reprojection only shows how closely the result reproduces the pixels used in alignment

Resolution of the task not high enough to find significance

Eye location values for INDICA show low variance

HMD-specific properties: fit on the user's head, exit pupil location

INDICA presumes a simplistic HMD model

Page 32

Take Away

INDICA – equal or superior performance to SPAAM

• Significantly higher performance in Y/Z

• Significantly higher quality in Z

• Recycled INDICA requires SPAAM intrinsics

• Minimal requirement from the user

• Less time to perform (user preferred)

SPAAM / Degraded SPAAM – almost no difference

• Removal/replacement has little effect on accuracy

• Accuracy in X equal to INDICA

• Less favored method (exit survey)

Page 33

Future Work

Evaluation of Full INDICA

• Remove induced error from SPAAM intrinsics (Full INDICA)

• Utilize more robust eye localization (Alex's presentation)

• Real-time update of calibration (on-line)

Binocular Task

• More relevant depth cue

• Verification of SPAAM Z error

Best vs. Best

• Comparison of best possible calibrations, SPAAM/INDICA

• Removal of HMD distortion (Yuta's presentation)

• Improvements to SPAAM to reduce the impact of user error

Page 34

Special Thanks

Dr. Hirokazu Kato

Dr. Goshiro Yamamoto

Everyone at the Interactive Media Design Lab

All of the anonymous subjects

Page 35

Support

NSF Awards: IIA-1414772, IIS-1320909, IIS-1018413

NASA Mississippi Space Grant Consortium Fellowship

European Union Seventh Framework: PITN-GA-2012-316919-EDUSAFE

Page 36

Take Away

INDICA – equal or superior performance to SPAAM

• Significantly higher performance in Y/Z

• Significantly higher quality in Z

• Recycled INDICA requires SPAAM intrinsics

• Minimal requirement from the user

• Less time to perform (eye measures vs. alignments)

SPAAM / Degraded SPAAM – almost no difference

• Removal/replacement has little effect on accuracy

• Accuracy in X equal to INDICA

• Less favored method (exit survey)

Page 37

Calibration Methods

Single Point Active Alignment Method (SPAAM) (Tuceryan & Navab, 2000)

[Figure: screen pixel (x, y) aligned with the world point through the head-to-point transform t_H-P, repeated over multiple alignments]

Page 38

Calibration Methods

Interaction Free Display Calibration (INDICA) (Itoh & Klinker, 2014)

[Table: parameters needed by the Recycled setup, the Full setup, or both: Translation Eye – Tracker t_ET; Rotation World – Screen R_WS; Rotation World – Tracker R_WT; Translation World – Tracker t_WT; Intrinsic Calib. Params. K_E; Translation World – Eye t_WE; Translation World – Screen t_WS; Pixel Scaling Factor α(x, y)]
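To give a feel for how such parameters turn an eye-position estimate into a rendering calibration, here is a generic off-axis frustum construction (R. Kooima's generalized perspective projection). It assumes OpenGL-style clip space and known screen-corner positions in world coordinates; it is a stand-in sketch, not the formulation from Itoh & Klinker:

```python
import numpy as np

def off_axis_projection(eye, screen_ll, screen_lr, screen_ul, near=0.05, far=100.0):
    """Off-axis projection for an eye looking through a planar (virtual) screen.

    screen_ll / screen_lr / screen_ul: lower-left, lower-right and upper-left
    screen corners in world coordinates; eye: 3D eye position (e.g. from t_WE).
    """
    vr = screen_lr - screen_ll; vr /= np.linalg.norm(vr)   # screen right axis
    vu = screen_ul - screen_ll; vu /= np.linalg.norm(vu)   # screen up axis
    vn = np.cross(vr, vu);      vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = screen_ll - eye, screen_lr - eye, screen_ul - eye
    d = -np.dot(va, vn)                                    # eye-to-screen distance

    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d

    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])  # combine with a rotation into the screen basis and a translation by -eye
```

Recycled INDICA's "update the calibration matrix with the eye location" step loosely corresponds to redoing this kind of construction whenever t_WE changes while the display-side quantities stay fixed.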

Page 39

Evaluation Study

Registration quality comparison: perceived vs. intended location

Algorithms:

• SPAAM

• Degraded SPAAM: reuse of the SPAAM result after HMD replacement

• Recycled INDICA: reuse of the intrinsic values from the SPAAM calibration

Page 40

Evaluation Study

Degraded SPAAM & Recycled INDICA rely on values from the SPAAM calibration

Page 41

Evaluation Study

The order of Degraded SPAAM and Recycled INDICA, as well as Cube/Pillar task presentation, was distributed such that no two subjects experienced the same sequence.

Within-Subjects Design: 3 algorithms x 2 tasks = 6 conditions; 16 pillar trials / 20 cube trials per condition

13 subjects (6 male / 7 female), 22 – 26 years of age, no prior experience with HMDs

Page 42

Experimental Results & Discussion

Quantitative Measures – Reprojection Estimates

The SPAAM calibration produces screen (x, y) and world (X, Y, Z) correspondence pairs.

Page 43

Evaluation Tasks

Discrete 2-axis grids; each square 2 cm x 2 cm

Vertical axis: A – Z (A – D); horizontal axis: 1 – 20 (1 – 4)
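As an illustration of how such a verbal grid response could be coded in centimetres for the location-error analysis (the origin at "A1" and the helper below are assumptions for illustration, not the study's exact coding scheme):

```python
def grid_cell_to_cm(cell, square_cm=2.0):
    """Convert a grid response such as 'C7' to (row_cm, column_cm), assuming
    square 'A1' is the origin and each square measures 2 cm x 2 cm."""
    row_cm = (ord(cell[0].upper()) - ord("A")) * square_cm
    col_cm = (int(cell[1:]) - 1) * square_cm
    return row_cm, col_cm

# Signed difference between a perceived and an intended cell, in cm.
perceived = grid_cell_to_cm("C7")   # (4.0, 12.0)
intended  = grid_cell_to_cm("B5")   # (2.0,  8.0)
error = tuple(p - i for p, i in zip(perceived, intended))  # (2.0, 4.0)
```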

Page 44

Experimental Results & Discussion

Subjective Measures – Registration Quality

Verbal response from subject (1 – 4 or 1 – 5); both scales (pillars/cubes) were normalized into 1 – 4.

Quality values are not Likert-scale data; the provided images create a reference for the quality range.

Statistical analysis on the quality data (see the sketch after this list):

• ANOVA

• Friedman
  o Less power compared to ANOVA
  o Reduces the number of considered data points

• Kruskal-Wallis
  o More power compared to ANOVA
  o Does not consider the within-subjects design
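A minimal sketch of the two non-parametric tests named above, using their scipy implementations; the ratings here are random placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import friedmanchisquare, kruskal

rng = np.random.default_rng(1)
# Placeholder quality ratings (1-4), one value per subject and calibration method.
spaam, dspaam, rindica = (rng.integers(1, 5, size=13) for _ in range(3))

# Friedman: non-parametric repeated-measures test (same subjects in each column).
chi2_f, p_f = friedmanchisquare(spaam, dspaam, rindica)

# Kruskal-Wallis: non-parametric test that ignores the within-subjects pairing.
chi2_k, p_k = kruskal(spaam, dspaam, rindica)

print(f"Friedman       chi2(2) = {chi2_f:.2f}, p = {p_f:.3f}")
print(f"Kruskal-Wallis chi2(2) = {chi2_k:.2f}, p = {p_k:.3f}")
```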

Page 45

Experimental Results & Discussion

Performance Summary – Quantitative Measures

INDICA – stable eye estimates / high reprojection variance

• Manual limbus detection required for the best estimates

• Eye position in Z more consistent

• Higher reprojection variance; reprojection is not an indication of result quality, and the INDICA reprojection may show the actual pixel used (?)

SPAAM – extrinsic/reprojection values match previous findings

• Extrinsics show higher variance along the Z axis

• Low reprojection variance; the SPAAM result closely reproduces the SPAAM alignments