User-Centric Design of a Vision System for Interactive Applications
Stanislaw Borkowski, Julien Letessier, François Bérard, and James L. Crowley
ICVS’06, New York, NY, USA, January 5, 2006
2
Academic context
• PRIMA group (GRAVIR lab, INRIA)
  « Perception, Recognition and Integration for Interactive Environments »
• IIHM group (CLIPS lab, Univ. Grenoble)
  « Engineering in Human-Computer Interaction »
3
Outline
• Context: augmented surfaces
• User-centric approach in vision systems
• User-centric requirements
• Implementation
  • SPODs, VEIL, Support services
• Conclusions & Future Work
4
credit: F. Bérard, J. Letessier
Context: augmented surfaces
• Interacting with projected images...
  • direct manipulation
  • user collaboration
  • mobility
• ... is not realistic today
  • limited, controlled conditions
  • operator requirement
  • software integration issues
5
Objectives
• Propose a client-centric approach
  • design of perceptive input systems
  • two classes of clients:
    • end users realize an interaction task
    • developers create an interactive application
• Application: design an input system
  • address simple augmented surfaces
  • feature vision-based, WIMP-like widgets (e.g. press-buttons)
  • achieve the usability of a physical input device
6
Approach Overview
• Top-down design
  • determine client requirements
  • consequences of HCI and SOA requirements
  • user-centric / developer-centric
  • functional / non-functional
• Service-oriented
  • def: a service adds value to information
  • SOA is a collection of communicating services
7
Developer requirements
• Abstraction: be relevant
  • make computer vision invisible
  • generalize the input
• Isolation: allow integration
  • permit service distribution
  • support remote access to services
  • offer code reuse
• Contract: offer quality of service
  • specify usage conditions
  • determine service latency, precision, etc.
8
End-user requirements
• Typical for "real-time" interaction
• Latency limits
  • upper bound: 50 ms for coupled interaction
  • lower bound: 1 s for monitoring applications
• Autonomy
  • ideally, no setup or maintenance
  • in practice, minimize task disruption
• Reliability / predictability
  • either real-time or unusable
  • reproducible user experience
9

Pragmatic approach
• Black-box services
• BIP (Basic Interconnection Protocol)
  • BIP implementation ≈ SOA middleware
  • service/service and service/application communication
  • goal 1: performance
    • connection-oriented (TCP-based)
    • low latency (UDP extensions)
  • goal 2: easy integration
    • service discovery (standards-based)
    • implementations provided (C++, Java, Tcl)
    • interoperability in ≤ 100 lines of code
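The slides do not give BIP's wire format, only its character: connection-oriented (TCP), service-to-service and service-to-application communication. The sketch below illustrates that pattern with an invented newline-delimited JSON framing; the function names and message fields are assumptions, not the actual BIP protocol.

```python
import json
import socket
import threading

def start_service(port, events):
    """Open a listening socket and publish `events` to the first client.
    The socket is bound before this returns, so a client can connect
    immediately afterwards."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)

    def run():
        conn, _ = srv.accept()
        for ev in events:
            # newline-delimited JSON stands in for the real BIP framing
            conn.sendall((json.dumps(ev) + "\n").encode())
        conn.close()
        srv.close()

    t = threading.Thread(target=run)
    t.start()
    return t

def read_events(port):
    """Connect to a service and collect everything it publishes."""
    cli = socket.create_connection(("127.0.0.1", port))
    with cli.makefile() as stream:
        events = [json.loads(line) for line in stream]
    cli.close()
    return events

if __name__ == "__main__":
    evs = [{"widget": "button-1", "event": "pressed"}]
    t = start_service(5555, evs)
    print(read_events(5555))
    t.join()
```

A client of such a service only needs a socket and a JSON parser, which is consistent with the "interoperability in ≤ 100 lines of code" goal.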
10
Our approach
• Abstraction, Isolation: use BIP
  • advice to service developers
• Contract: nothing enforced
  • recommend evaluation of HCI-centric criteria
• Common ground
  • allows creating SOA-based prototypes
11
Interactive widgets projected on a portable display surface
12
Luminance-based button widget
S. Borkowski, J. Letessier, and J. L. Crowley. Spatial Control of Interactive Surfaces in an Augmented Environment. In Proceedings of the EHCI’04. Springer, 2004.
13
Touch detection

L(t) := L_o(t) − L_i(t)

• Locate the widget in the camera image
• Calculate the mean luminance over the widget
• Update the widget state
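The three steps above can be sketched in a few lines of NumPy. The region format and the relative-drop threshold below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def button_state(frame, region, ref_luminance, threshold=0.2):
    """Luminance-based touch test for one projected widget.

    frame         -- grayscale camera image as a 2-D float array in [0, 1]
    region        -- (x, y, w, h) of the widget located in the camera image
    ref_luminance -- mean luminance of the unoccluded widget
    threshold     -- illustrative relative-drop threshold (assumed value)
    """
    x, y, w, h = region
    patch = frame[y:y + h, x:x + w]      # step 1: locate the widget
    mean_l = float(patch.mean())         # step 2: mean luminance over it
    # step 3: update state -- a hand over the projected widget darkens it
    pressed = mean_l < (1.0 - threshold) * ref_luminance
    return pressed, mean_l
```

In use, `ref_luminance` would be measured once while the widget is known to be unoccluded, e.g. right after projection starts.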
14
Robustness to clutter
15
Robustness to clutter
16
Assembling occlusion detectors
17
Assembling occlusion detectors
18
Striplet – the occlusion detector
dxdytyxyx t ),,L(),(f)(R gain
Gain
x
x
y
0),(fgain dxdyyx
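A discrete version of the striplet response follows directly from the two equations above: a kernel that sums to zero yields R ≈ 0 over uniform luminance and a nonzero response when the strip is partially occluded. The center/side-band kernel shape below is an illustrative guess, not the paper's exact gain function.

```python
import numpy as np

def striplet_gain(h, w):
    """Zero-sum gain kernel: a positive central band balanced by negative
    side bands (illustrative shape, not the published one)."""
    f = -np.ones((h, w))
    third = w // 3
    f[:, third:2 * third] = 2.0   # central band weight
    f -= f.mean()                 # force the kernel to sum exactly to zero
    return f

def striplet_response(luminance_patch, f_gain):
    """Discrete R(t) = sum over the strip of f_gain(x, y) * L(x, y, t)."""
    return float((f_gain * luminance_patch).sum())
```

Because the kernel integrates to zero, the response is invariant to uniform illumination changes; only a luminance pattern that breaks the strip's symmetry (such as a fingertip over the central band) moves R away from zero.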
19
Striplet – the occlusion detector
[Figure: striplet response R along the x axis]

20

Striplet – the occlusion detector
[Figure: striplet response R under occlusion]
21
Striplet-based SPOD
SPOD – Simple-Pattern Occlusion Detector
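The slides do not spell out how striplet outputs are combined into a SPOD. The rule below is one plausible "simple pattern" — inner striplets occluded while surrounding guard striplets stay clear — which matches the robustness-to-clutter idea shown earlier, but it is an assumption, not the published logic.

```python
def spod_button_event(inner_occluded, guard_occluded):
    """Illustrative SPOD combination rule (assumed, not from the paper):
    report a press only when every inner striplet is occluded and no
    surrounding guard striplet is. Large objects (clutter, a whole
    forearm) trip the guards and are rejected; a fingertip does not.
    """
    return all(inner_occluded) and not any(guard_occluded)
```

The same composition idea extends to a slider: the index of the occluded striplet along the strip gives the slider position.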
22
Striplet-based button
23
Striplet-based slider
24
SPOD software components
[Diagram: SPOD components — Camera, Calibration, GUI rendering, GUI, Striplets Engine, VEIL, Client Application]
25
VEIL – Vision Events Interpretation Layer
[Diagram: Striplets Engine and VEIL within the SPOD]
Inputs:
• widget coordinates
• scale and UI-to-camera mapping matrix
• striplet occlusion events
Outputs:
• interaction events
• striplet coordinates
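The "UI to camera mapping matrix" suggests a planar mapping between widget coordinates and the camera image; assuming it is a 3×3 homography H, applying it can be sketched as below (the function name and the affine example matrix are illustrative assumptions).

```python
import numpy as np

def ui_to_camera(points_ui, H):
    """Map points from UI space to camera space with a 3x3 homography H.
    points_ui -- (N, 2) array-like of UI coordinates.
    """
    pts = np.asarray(points_ui, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]              # back to Cartesian

# For a pure scale-and-translate calibration, H is affine:
H = np.array([[2.0, 0.0, 10.0],
              [0.0, 2.0, 20.0],
              [0.0, 0.0, 1.0]])
```

With such a matrix, the VEIL can hand the Striplets Engine camera-space striplet positions while the client application reasons purely in UI coordinates.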
28
Striplets Engine Service
[Diagram: Striplets Engine within the SPOD]
Inputs:
• striplet UI-coordinates
• UI-to-camera mapping matrix
• images from the camera service
Outputs:
• occlusion events
32
SPOD-based calculator
Video available at: http://www-prima.inrialpes.fr
33
Conclusions
• We have presented
  • a service-oriented approach
  • an implementation
• Future work
  • different detector types
  • a more intelligent VEIL
  • integration with GML
34
Thank you for your attention