alvaro cassinelli / meta perception group leader

Watanabe Laboratory


Posted on 08-Sep-2014


DESCRIPTION

Some of my recent research topics at the Meta-Perception group of the Ishikawa-Watanabe laboratory (http://www.k2.t.u-tokyo.ac.jp/index-e.html): the Physical Cloud; zero-delay, zero-mismatch spatial AR with the Laser Sensing Display; and Augmented Perception. (Link to videos in the comments.)

TRANSCRIPT

  • Watanabe Laboratory

Ishikawa-Watanabe lab: Meta Perception Group
By Alvaro Cassinelli, Assistant Professor (Meta Perception Group Leader)

Latest research areas:

I. The Physical Cloud (tangible gigabytes)
  • Space and objects as a scaffold for data
  • Proprioceptive interfaces / deformable interfaces
  • Multi-modal augmented reality
II. Enabling technologies (sensing/projection, I/O)
  • Zero-delay, zero-mismatch spatial AR
  • Minimal, ubiquitous & context-aware interactive displays
  • Laser displays, pneumatic slow displays, roboptics
III. Mediated Self / Augmented Perception / Prosthetics
  • Augmented sensing & expression
  • Electronic travel aids, wearables

How to interact with the digital world? Enabling new realities; integrating technology and human life; art + research.

I. The Physical Cloud: data-haunted reality

(a) Data in 3D space
  • Experiments on the psychology of space perception and organization
  • The body as a bookshelf, or body mnemonics (*)
  • Shared Memory Palace (an interpersonal spatialized database)
  • The city, public spaces, etc. as a 3D bookshelf
  • A background picture as an organizational scaffold for files and folders (Takashita-kun)

(b) Data on objects (objects as intuitive handles or icons: beyond PUI)
  • Association by perceived affordances (Invoked Computing)

How to interact with just coordinates in space?
  • Proprioceptive interfaces (VSD, Khronos, Virtual HR)
  • Deformable workspace metaphor
  • Spidar Screen
  • Virtual Bookshelf (Takashita-san)
  • Real-time projection mapping
  • twitterBrain (Philippe, Jordi)

How to interact with objects that have no I/O interfaces?
  • Invoked Computing / the function projector
  • Objects with a memory of action (thermal camera)
  • On-the-fly I/O interfaces (LSD technology)

Research concept vs. enabling technology
(*) Jussi Angesleva, 2004
Memory Blocks (2011~)
Laser Sensing Display (creating a sense of presence / mixed reality)
The Physical Cloud?

(a) Interacting with floating data: the interaction metaphor

KHRONOS PROJECTOR [2005]
  • Deformable screen, fixed in space: the screen as a controller
  • KHRONOS PROJECTOR (media arts, 2004), video excerpt
  • Fixed screen, deformable (passive haptic feedback): the screen as a membrane between the real and the virtual
  • Precise control possible [2008]: a physical attribute makes manipulation more precise, even when it provides only PASSIVE force feedback
  • Rigid screen, moving in space (proprioception): the screen as a controller

Volumetric data visualization & interaction [2006]
Technical details:
  • Retro-reflective paper to reflect IR light
  • Pose-estimation markers, used both for pose estimation and for control (drag, slice, zoom...)
  • An off-the-shelf camera-projector setup
Future directions:
  • Fish-tank VR with head tracking
  • An FTIR multitouch force-feedback display?
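The Khronos Projector's central trick, as the slides describe it, is to let a deformation of the screen locally displace which instant of a pre-recorded video each pixel shows. A minimal NumPy sketch of that per-pixel temporal remapping (the array shapes and toy data are assumptions for illustration, not the installation's actual pipeline):

```python
import numpy as np

def khronos_warp(video, time_map):
    """Per-pixel temporal remapping: each output pixel samples the video
    volume at a locally different instant, so pressing the deformable
    screen 'digs' into the past or future of the scene.
    video:    array of shape (T, H, W) -- grayscale frames
    time_map: array of shape (H, W), values in [0, T-1] -- e.g. the
              measured deformation depth of the projection screen
    """
    T, H, W = video.shape
    t = np.clip(np.rint(time_map).astype(int), 0, T - 1)
    rows, cols = np.indices((H, W))
    return video[t, rows, cols]

# toy example: a 'video' whose frame t is uniformly t, and a bump in the
# middle of the time map -> the bump region shows a different instant
video = np.broadcast_to(np.arange(5)[:, None, None], (5, 4, 4)).copy()
tmap = np.zeros((4, 4))
tmap[1:3, 1:3] = 4
out = khronos_warp(video, tmap)  # out[0, 0] == 0, out[1, 1] == 4
```

In the actual installation the time map would come from sensing the screen's deformation in real time; here it is just a static array.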
Memory Block application: the Portable Desk [2013] (Classic / Jazz / Rock)

Application to neurosciences: twitterBrain [2013~]
  • Multimodal virtual presence of data
  • Extension: a portable memory block to store personal data (music, books, etc.)
  • A room or public space as a virtual bookshelf
  • A common virtual support for querying academic publications
  • Real-time communication between researchers
  • Proprioception, spatial memory, procedural memory
  • A shared physical database: a spatialized database (for neurosciences), a spatialized academic database, a spatialized social network, a spatialized soundtrack library
  • Big-data visualization techniques
  • Augmented Memories [J. Puig, 2012]

From the Deformable Workspace to the Deformable User Interface [2014]
  • Skeuomorphs, shape and meaning (beyond the soap-bar smartphone)
  • Haptic interaction with virtual 3D objects embedded in the real world (a force field)
  • Virtual Haptic Radar [2008]: the importance of multi-modal immersion (or partial immersion)
  • A high-speed gesture UI for a 3D display (zSpace) [2013]: the importance of instant feedback to make virtual objects feel real

Multimodal Spatial Augmented Reality (MSAR)
  • How to generate a sense of co-located presence for this information layer, beyond projection mapping?
  • Projection mapping (image); sound, vibration, temperature
  • Minimal, zero-delay, zero-mismatch interaction
  • Towards a Function Projector capable of projecting affordances!

(b) Interacting with (augmented) objects: towards a function projector

Invoked Computing (Function Projector) [2011]
  • "Augmented Reality as the graphic front-end of Ubiquity. And Ubiquity as the killer-app of Sustainability." (Bruce Sterling, Wired blog on Invoked Computing)
  • Invoked Computing (video excerpt)
  • Visual and tactile + high-speed interaction [2012]
  • Physical affordances as services? A rechargeable hammer, an erasable book... a utopian or dystopian future?

II. Enabling technologies
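The "zero-delay, zero-mismatch" argument that the enabling-technology section develops comes down to simple arithmetic: an image projected onto a moving surface lags behind it by roughly speed times latency. A back-of-envelope sketch (the 1 m/s hand speed is an illustrative assumption, not a figure from the slides):

```python
def projection_mismatch_mm(speed_m_per_s, latency_ms):
    """Worst-case gap between a moving target and the image projected
    onto it: the target travels speed * latency before the next frame
    lands. Multiplying m/s by ms conveniently yields millimetres."""
    return speed_m_per_s * latency_ms

# a hand sweeping at 1 m/s (illustrative speed)
slow = projection_mismatch_mm(1.0, 200.0)  # 200 ms end-to-end pipeline -> 200 mm off
fast = projection_mismatch_mm(1.0, 1.0)    # 1000 fps-class sensing, ~1 ms -> ~1 mm off
```

At everyday hand speeds, only millisecond-scale sensing keeps the projected layer visually glued to the surface, which is the case the slides make for laser sensing over conventional camera-projector loops.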
Zero-delay, zero-mismatch for MSAR

The importance of real time in HCI:
  • 1000 fps / 30 ms: UI
  • 30 fps / 200 ms: computer or game
  • OmniTouch (Microsoft Research)

Enabling technology: Smart Laser Sensing [2003~]
  • No delay, no misalignment
  • Projection on mobile, deformable surfaces
  • Real-time sensing: vision-based vs. smart sensing (what? how?)
  • Skin Games vs. the Laser Sensing Display

Many applications demonstrated:
  • AR surveying (distances, angles, depth...)
  • Ubiquitous display
  • Medical imaging (IR, polarization...)
  • Image enhancement (contrast compensation, color...)

The camera-less active tracking principle
  • Markerless laser tracking (an I/O interface) [2003]
  • 2004: artificial synesthesia, real-time interaction, new interfaces for musical expression

scoreLight: a human-sized pick-up head [2009] (in collaboration with Daito Manabe)

Skin Games
  • The body as a controller (Kinect) and the body as a display surface
  • A camera-less interactive display (no calibration needed)

Laser GUI: a minimal interface [2013~]
  • Text projection mapping at around 200 fps
  • Minimal, interactive displays vs. pixelated projection mapping?

Workshop at CHI 2013: Steimle J., Benko H., Cassinelli A., Ishii H., Leithinger D., Maes P., Poupyrev I.: "Displays Take New Shape: An Agenda for Future Interactive Surfaces." CHI '13 Extended Abstracts on Human Factors in Computing Systems, ACM Press, 2013.

Minimal displays by no means imply small: Laserinne [2009]
  • Real-world special effects: a real-world shader?

Saccade Display + laser sensing [2012~]

Why laser?
  • A stronger persistence-of-vision effect
  • Very long distances (on a car, etc.)
  • An example of a context-aware display [+DIC] (video)

The light pet
  • Can sense and measure; draw and print; signal (danger) and indicate; play
  • A robot made of light; an extension of the self; a minimal interactive display

III. Mediated Self / Augmented Perception

What is an interface? An extension of the self?

Haptic Radar for extended spatial awareness [2006]
  • Optical antennae for humans, not for devices
  • A new sensorial modality (this is not TVSS)
  • An extension of the body, 360 degrees
  • Haptic Radar & HaptiKar (video excerpts)

Ongoing experiments (50 blind people in Brazil), in collaboration with Professor Eliana Sampaio (CNAM, Paris)
  • Qualitative results were extraordinary (semi-structured interviews & ANOVA analysis of anxiety trait/state)
  • Presently: quantitative measures (using a simulator, a calibrated magnetic compass, and a virtual-reality environment)
  • Production? Ongoing work on wearable computing

Laser Aura: externalizing emotions [2011]
  • An example of a minimal display inspired by manga graphical conventions

Summarizing the vision: the Physical Cloud (computing)
  • Real space as an opportunity to organize data (vs. the cloud)
  • Real objects as handles that trigger computing functions
  • Minimal, context-aware displays (less is more!)
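The Haptic Radar described above turns each range sensor worn around the head into a vibro-tactile "antenna": nothing in range stays silent, and a closer obstacle buzzes harder. A minimal sketch of one plausible distance-to-vibration mapping (the 150 cm range and the linear scaling are assumptions for illustration, not the device's actual transfer function):

```python
def vibration_intensity(distance_cm, max_range_cm=150.0):
    """Map a rangefinder reading to a motor drive level in [0, 1]:
    obstacles beyond max_range_cm give no vibration, and intensity
    grows linearly as the obstacle approaches the wearer."""
    if distance_cm >= max_range_cm:
        return 0.0
    return 1.0 - distance_cm / max_range_cm

# modules arranged around the head, each driving its own vibrator
readings = [200.0, 75.0, 10.0]                       # cm, one per sensor
levels = [vibration_intensity(d) for d in readings]  # far -> 0, near -> strong
```

A real unit would also smooth the readings over time to avoid jitter; the mapping itself is the part the slides' "extension of the body" idea rests on.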
Summarizing (cont.):
  • Engineered intuitive physics (a generalization of tangible interfaces)
  • The intentional stance (live agents instead of control knobs)
  • Real-time, zero-delay, zero-mismatch spatial AR will produce a paradigm shift, making the digital analog again

Enabling technology? A possible instantiation: the light pet vs. pixelated screens
  • An alternative vision to the pixelated screen (including HMDs)
  • An avatar: a robot made of light
  • Communication enhancement (Laser Aura)
  • Enhanced spatial awareness (signaling, etc.)

Thanks! For more: www.alvarocassinelli.com