
3DDI Visualization MURI

UC Berkeley and MIT

UC-MIT

3DDI: Overview

Project pipeline:

3D capture → Modeling, simulation → Rendering → 3D display

Applications: Tele-surgery, Training, Collaboration

UC-MIT

3DDI: Goals

• Direct interaction: no gloves or glasses.

• Animated content: interaction in real time.

• Content is real-world: 3D models from live capture and modeling.

(Diagram: laser scanner, 3D display, virtual object)

UC-MIT

Task: 3D capture using range scanner

• To build a solid-state, high-accuracy electronic range-finding scanner.

• The system should serve as a replacement for mechanical scanners and motion-capture devices and be usable indoors and outdoors.

• Desired performance: outdoors, sub-meter accuracy at hundreds of meters, with scans in less than a second; indoors, millimeter accuracy at several meters, with scans at 20-60 frames/sec.
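As a rough illustration of the range-finding principle involved (our assumption, not a description of the actual UCB hardware), an amplitude-modulated time-of-flight scanner recovers distance from the phase shift of the returned signal:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Estimate target distance from the phase shift of an
    amplitude-modulated time-of-flight return.

    The light travels out and back, so the round-trip delay is
    phase / (2*pi*f) and the one-way distance is half of c * delay.
    The result is unambiguous only up to half the modulation wavelength.
    """
    round_trip = phase_shift_rad / (2.0 * math.pi * mod_freq_hz)
    return 0.5 * C * round_trip

# Hypothetical numbers: a 10 MHz modulation and a 90-degree phase shift
# correspond to a target roughly 3.75 m away.
print(distance_from_phase(math.pi / 2.0, 10e6))
```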

UC-MIT

Subtasks

• Fabrication, testing and improvement of high-power, flip-bonded VCSEL arrays.

• Integration of scanner components. Design of custom elements (modulator, amplifier and power supplies).

• Purchase and integration of coupling optics.

• Illumination demo with VCSEL source, photomultiplier and CCD.

• Wrote code for image-sequence processing and calibration.

• Static scan example; integration with dynamic authoring tool (Steve Chenney).

UC-MIT

3D Imaging System, U.C.B.

(System diagram: Fuji lens, MCP, imaging optics, CCD, VCSEL array, power supply, HF signal, IR light; portable platform!)

UC-MIT

First Scanned Image

• Bottle image: depth range ~1.2 m, accuracy ~0.3 cm

UC-MIT

Task: Model Capture Using Pose Cameras

• Urban geometry

• Textures/BRDFs for re-illumination

• How can we import 3D scene data quickly and automatically? A starting point for visualization, design, simulation, teaching.

• Develop effective sensors and automated and semi-automated software tools for rapid environment capture

• Synergistic efforts at UCB: 3D scanner, illumination capture

UC-MIT

Goals of integrated effort

• Acquire geo-referenced digital imagery of the MIT campus from ground and air

• Extract building exteriors from imagery, using fully automatic techniques

• Model building interiors semi-automatically from existing 2D building floorplans

• Attach dense interior phototextures to geometry, semi-automatically

• Integrate photometrics, interaction, and dynamic simulation from UCB

UC-MIT

Acquisition of geo-referenced imagery

• The Argus platform performs sensor fusion of imagery and navigation information

UC-MIT

Geo-referencing of multiple nodes

• Currently a semi-automated process requiring less than one person-second per image

UC-MIT

Texture extraction

• Estimation based on weighted medians
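As a sketch of how a weighted median can combine texture observations of the same surface point from several views (an illustrative guess at the estimator, not the project's code), each candidate value gets a weight, e.g. favouring frontal, high-resolution views, and the median resists outliers such as specular highlights:

```python
def weighted_median(values, weights):
    """Return the weighted median: the smallest value at which the
    cumulative weight reaches half of the total weight."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= total / 2.0:
            return v
    return pairs[-1][0]

# Hypothetical per-view intensity samples for one texel, with weights
# favouring more frontal, higher-resolution observations.
samples = [112, 118, 240, 115, 120]   # 240 might be a specular outlier
weights = [0.9, 0.8, 0.2, 1.0, 0.7]
print(weighted_median(samples, weights))  # robust to the outlier
```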

UC-MIT

Progress year 3

• Acquire hemispherical interior imagery

• Merge ground, aerial geo-referenced imagery

• Extension to temporal modeling: continuous modeling of a changing site. Test: Building 20 demolition, construction

UC-MIT

Task: Capturing Geometry and Reflectance from Photographs

• Input from Cameras, Pose Cameras, Laser Scanners

• Output to Conventional and 3D Displays

UC-MIT

Progress year 1

• Extend Facade to Parametrized Curved Objects

• Visibility Processing and Real-time Rendering

• Campanile Movie

• High Dynamic Range Photography
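A minimal sketch of merging multiple exposures into a relative radiance map, assuming an already-linearized camera response (the actual high-dynamic-range work also recovers the response curve, which is omitted here):

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge linearized exposures (float arrays in [0, 1]) into a
    relative radiance map, weighting mid-range pixels most heavily."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust pixels far from under-/over-exposure.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)
        den += w
    return num / np.maximum(den, 1e-8)

# Hypothetical three-exposure stack of a 2x2 scene.
imgs = [np.array([[0.02, 0.10], [0.40, 0.90]]),
        np.array([[0.08, 0.35], [0.95, 1.00]]),
        np.array([[0.30, 0.80], [1.00, 1.00]])]
times = [1/250, 1/60, 1/15]
print(merge_exposures(imgs, times))
```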

UC-MIT

Research Highlights Year 1

• Façade: extended to circularly symmetric objects.

• Façade: Accelerated using α-blending.

Campanile Movie shown at SIGGRAPH’97

UC-MIT

Progress year 2

• Photometric Properties of Architectural Scenes

• Capturing and Using Complex Natural Illumination

• Video Motion Capture

UC-MIT

Research Highlights Year 2

• Calculation of radiance with known (outdoor) illumination

• Re-rendering under novel lighting
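As a toy illustration of re-rendering under novel lighting once reflectance has been recovered (a Lambertian-only assumption on our part, with hypothetical values):

```python
import numpy as np

def relight_lambertian(albedo, normals, light_dir, light_irradiance):
    """Re-render a diffuse surface under a new directional light.

    albedo: (N,) recovered diffuse reflectance per point.
    normals: (N, 3) unit surface normals.
    Returns outgoing radiance per point.
    """
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    cos_term = np.clip(normals @ l, 0.0, None)
    return (albedo / np.pi) * light_irradiance * cos_term

normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.707, 0.707]])
albedo = np.array([0.6, 0.3])
print(relight_lambertian(albedo, normals, [0.0, 0.0, 1.0], 1000.0))
```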

UC-MIT

Research Highlights Year 2

• Rendering synthetic objects into real scenes using HDR photography (real + synthetic objects)

UC-MIT

Research Highlights Year 2

• Acquisition of motion data from video using kinematic models:

UC-MIT

Progress year 3

• Reflectance Recovery from MIT Pose Camera Data

• Inverse Global Illumination

UC-MIT

A Synthetic Sunrise Sequence

(Frames at 5:00, 5:30, 6:00, 6:30, 7:00, 8:00, 9:00 and 10:00 am: one day at the end of March)

UC-MIT

Inverse Global Illumination Algorithm Developed

Inputs: radiance maps, geometry, light sources → Output: reflectance properties
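To give the flavour of the inversion (a toy diffuse-radiosity sketch, not the published algorithm): given observed radiosities, known emission, and form factors, each patch's albedo follows from B = E + rho * (F B):

```python
import numpy as np

def inverse_diffuse_gi(observed_radiosity, emission, form_factors):
    """Toy inverse global illumination for diffuse patches.

    Forward model: B = E + rho * (F @ B).  With B observed and E, F
    known, the albedo of each patch is rho = (B - E) / (F @ B).
    """
    B = np.asarray(observed_radiosity, dtype=float)
    E = np.asarray(emission, dtype=float)
    F = np.asarray(form_factors, dtype=float)
    gathered = F @ B                       # light arriving from other patches
    return np.clip((B - E) / np.maximum(gathered, 1e-9), 0.0, 1.0)

# Hypothetical 3-patch scene in which patch 0 is a light source.
F = np.array([[0.0, 0.3, 0.3],
              [0.3, 0.0, 0.2],
              [0.3, 0.2, 0.0]])
E = np.array([10.0, 0.0, 0.0])
B = np.array([10.6, 3.5, 2.2])             # measured radiosities
print(inverse_diffuse_gi(B, E, F))          # recovered albedos
```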

UC-MIT

Real vs. Synthetic for Original Lighting

UC-MIT

Real vs. Synthetic for Novel Lighting

UC-MIT

Progress Year 4

• Input: multiple range scans of a scene; multiple photographs of the same scene

• Output: geometric meshes of each object in the scene; registered texture maps for the objects

UC-MIT

Overview

Range images → (registration) → point cloud → (segmentation) → point groups → (reconstruction) → meshes → simplified meshes

Radiance images → (pose estimation) → calibrated images → (texture-map synthesis) → texture maps, which combine with the meshes into textured objects

UC-MIT

Segmentation Results

UC-MIT

Camera Pose Results

• Accuracy: consistently within 2 pixels

• Correctness: correct pose for 58 out of 62 images
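A minimal sketch of the kind of check behind such numbers (hypothetical pinhole setup and values): project known 3D points with the recovered pose and measure the pixel error against their observed image positions:

```python
import numpy as np

def reprojection_errors(points_3d, points_2d, R, t, K):
    """Pixel reprojection error of a recovered pose (R, t) with
    intrinsics K, for corresponding 3D points and 2D observations."""
    cam = (R @ points_3d.T).T + t          # world -> camera
    proj = (K @ cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]      # perspective divide
    return np.linalg.norm(proj - points_2d, axis=1)

# Hypothetical example: identity pose, simple intrinsics.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
X = np.array([[0.0, 0.0, 4.0], [0.5, -0.2, 5.0]])
x = np.array([[320.0, 240.0], [400.0, 208.0]])
err = reprojection_errors(X, x, np.eye(3), np.zeros(3), K)
print(err)          # a correct pose keeps these within a couple of pixels
```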

UC-MIT

Texture-Mapping and Object Manipulation

UC-MIT

Image-based Modeling and Rendering

• 3rd generation: vary spatial configurations in addition to viewpoint and lighting

(Images: novel viewpoint; novel viewpoint & configuration)

UC-MIT

Texture-Mapping and Object Manipulation

UC-MIT

Task: Authoring Huge, Dynamic Visual Simulations

• Efficiency: too much time is spent computing needless dynamic state, and dynamic authoring is not integrated with geometric design.

• Control: physics doesn't do what an author wants.

• Success is measured through speedups and the control of example scenarios.

UC-MIT

How it relates to MURI

Pipeline: 3D capture → Modeling, simulation → Rendering → 3D display

• Take models from measured data, e.g. architecture

• Author scenarios and simulate the dynamics, e.g. a traffic accident

• Provide dynamic models for efficient rendering

• Integration example: simulating with a scanned bottle

UC-MIT

Year 1: Culling with consistency

• Exploit viewer uncertainty to achieve efficient dynamics culling

• Significant speedups demonstrated: around 5x for test environments; arbitrarily large depending on the world.

• Tools released for VRML authoring.

• Papers in I3D, VRML98 and CGA.

UC-MIT

Year 2 and 3: Directing Scenarios

• Use physical sources of randomness (e.g. rough surfaces, variable initial conditions) to direct physical simulations

• Year 2: Directing a single body

• Year 3: Directing multiple interacting bodies

• Along the way: Fast multi-body simulation techniques

UC-MIT

Integration Example: Details

• Captured data and 3D rendering must be linked by an authoring phase.

• Extract radius information from 3D bottle scan, plus estimate of variance.

• Simulate using MCMC to achieve a goal (sketched below): balls are deflected by bottles to land in the right place.

• Render on autostereoscopic display.
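A minimal sketch of goal-directed sampling over simulation parameters with Metropolis-Hastings (an illustrative stand-in for the authoring tool, using a hypothetical one-parameter simulator): initial conditions are perturbed and kept when they move the simulated outcome toward the goal.

```python
import math, random

def simulate(launch_angle):
    """Stand-in for a physics run: returns the landing position
    produced by the given initial condition (hypothetical model)."""
    return 10.0 * math.cos(launch_angle)

def direct_simulation(goal_pos, iters=2000, sigma=0.1, temperature=0.5):
    """Metropolis-Hastings over the launch angle so that the simulated
    landing position ends up near goal_pos."""
    random.seed(0)
    angle = 0.3
    cost = abs(simulate(angle) - goal_pos)
    for _ in range(iters):
        proposal = angle + random.gauss(0.0, sigma)
        new_cost = abs(simulate(proposal) - goal_pos)
        # Accept better proposals always, worse ones with Boltzmann probability.
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / temperature):
            angle, cost = proposal, new_cost
    return angle, cost

angle, err = direct_simulation(goal_pos=4.0)
print(f"launch angle {angle:.3f} rad lands {err:.3f} m from the goal")
```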

UC-MIT

Task: Integration of Modeling and Simulation

• Incorporate data from multiple sources: geodetic capture (MIT); floorplan extrusion and instancing (UCB)

• Geometry compilation for responsiveness: scalable, persistent proximity/visibility database (UCB, MIT)

• Natural, extensible constraint-based interaction: object associations framework (UCB)

• Physically-based kinematics: fire simulation (UCB; shown in '98), impulse-response simulation (UCB)

UC-MIT

Several generations of system components:

• 1990-93: WalkThrough system (UCB). Rapid visualization of complex models.

• 1993-94: Radiosity integration (Princeton). Diffuse illumination throughout the model.

• 1994-95: Object associations (UCB). Natural object instancing and placement.

• 1994-97: FireWalk, Impulse (UCB). Physically-based fire and kinematic simulations.

• 1996-99: Façade, Skymaps (UCB). High-fidelity photo-assisted modeling.

• 1996-99: City Scanning (MIT). Acquisition of extended urban models.

UC-MIT

Dataset Integration: Geo-referencing

• Argus data is geodetically registered

UC-MIT

Dataset Integration: Exterior structure

• Exteriors in the UCB FireWalk framework

UC-MIT

Integration of UCB object associations

• Infrastructure supports editing at any scale

UC-MIT

Exterior to interior transition

• Seamless transition to Tech Square interior

UC-MIT

Transition: building approach

• Gravity association keeps us to local ground
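A small sketch of what such a gravity association might do (hypothetical helper names): each frame, the viewpoint's height is pinned a fixed eye height above the ground surface directly beneath it.

```python
def apply_gravity_association(position, ground_height_at, eye_height=1.7):
    """Keep the viewer attached to the local ground: query the ground
    elevation under (x, y) and pin the eye point eye_height above it."""
    x, y, _ = position
    return (x, y, ground_height_at(x, y) + eye_height)

# Hypothetical terrain: a gentle ramp toward the building entrance.
ramp = lambda x, y: 0.05 * x
print(apply_gravity_association((10.0, 2.0, 99.0), ramp))  # z becomes 2.2
```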

UC-MIT

Visibility modifications: exterior, interior

• Cell-portal visibility applies throughout
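As a sketch of how cell-portal visibility operates (hypothetical data layout, not the Walkthru code): starting from the viewer's cell, recursively follow portals that are visible in the current view, collecting every cell reached.

```python
def visible_cells(cells, start, frustum, portal_visible):
    """Collect cells reachable through portals visible in the frustum.

    cells: dict mapping cell id -> list of (portal, neighbour_cell_id).
    portal_visible(portal, frustum): True if the portal opening
    intersects the (possibly narrowed) view frustum.
    """
    seen = {start}
    stack = [(start, frustum)]
    while stack:
        cell, fr = stack.pop()
        for portal, neighbour in cells[cell]:
            if neighbour not in seen and portal_visible(portal, fr):
                seen.add(neighbour)
                # A full system would narrow the frustum to the portal here.
                stack.append((neighbour, fr))
    return seen

# Toy layout: room A sees B through one door; B sees C; C's door to D is
# outside the frustum, so D is culled.
layout = {"A": [("door_ab", "B")],
          "B": [("door_ab", "A"), ("door_bc", "C")],
          "C": [("door_bc", "B"), ("door_cd", "D")],
          "D": [("door_cd", "C")]}
vis = lambda portal, fr: portal != "door_cd"
print(visible_cells(layout, "A", None, vis))   # {'A', 'B', 'C'}
```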

UC-MIT

Door passages using object associations

• Opening doors to allow passage

UC-MIT

Integration of UCB FloorSketch, FireWalk

• Tech Square interiors modeled by procedural floorplan extrusion, furniture instancing

UC-MIT

Integration of UCB Impulse-Response

• Automated generation of RBL objects (requires specification as a union of convex parts)

• Initial integration: population, visualization

UC-MIT

Extension to Impulse: sleeping objects

• Added a "sleep state" for objects coming to rest
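A sketch of the usual criterion for such a sleep state (our assumption about the mechanics, with hypothetical thresholds): a body whose kinetic energy stays below a small bound for long enough is frozen until something interacts with it.

```python
class SleepMonitor:
    """Put a rigid body to sleep once its kinetic energy has stayed
    below a threshold for a minimum interval; wake it on contact."""

    def __init__(self, energy_eps=1e-4, settle_time=0.5):
        self.energy_eps = energy_eps
        self.settle_time = settle_time
        self.quiet_time = 0.0
        self.asleep = False

    def update(self, kinetic_energy, dt):
        if self.asleep:
            return True
        if kinetic_energy < self.energy_eps:
            self.quiet_time += dt
            if self.quiet_time >= self.settle_time:
                self.asleep = True        # stop integrating this body
        else:
            self.quiet_time = 0.0
        return self.asleep

    def wake(self):
        """Called when another body collides with or pushes this one."""
        self.asleep = False
        self.quiet_time = 0.0
```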

UC-MIT

Extension to Impulse: interaction

• Added interactive application of forces

UC-MIT

Task: Novel 3D Displays

• Re-design the MIT holographic-video display for heightened utility.

• Design a new autostereoscopic video display for multiple viewers.

UC-MIT

Relationship to the rest of the field:

• The holographic video display is the first of its kind, and is unique in its size (75mm x 125 mm) and its capability for rapid interaction.

• The autostereoscopic display is unique in its ability to provide binocular stereo video to multiple viewers in arbitrary locations, without the use of viewing aids such as spectacles.

UC-MIT

Interactive Holographic Video

UC-MIT

Autostereoscopic Display

• Multiple viewers (three, so far)

• Micropolarizer-based spatial multiplexing

UC-MIT

Viewer Tracking in Progress

• A recognizer finds the left eye(s)

• The video signal drives the viewer-tracking LCD

UC-MIT

Task: Telesurgery

To integrate elements of the MURI pipeline for visualization in the performance and training of surgery:

• Capture of anatomical data

• Modeling of deformable objects

• Haptic interaction with models

• 3D display of models

UC-MIT

Progress year 1

Developed virtual environment for surgical training:

• Organ models from Visible Human data

• Simple deformable modeling, using 2D meshes of masses, springs, and dampers (a minimal update step is sketched below)

• Basic instrument interactions, without force feedback: grasping, cutting, stapling, electrocautery

• Commercial laparoscopic interface without force feedback (Immersion Corp.)
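A minimal sketch of the update step such a mass-spring-damper mesh performs (hypothetical parameter values, semi-implicit Euler integration):

```python
import numpy as np

def step_mass_spring(pos, vel, springs, rest_len, k=200.0, c=2.0,
                     mass=0.05, dt=1e-3):
    """Advance a mass-spring-damper mesh one time step.

    pos, vel: (N, 2) arrays of node positions and velocities.
    springs: list of (i, j) index pairs; rest_len: matching rest lengths.
    """
    force = np.zeros_like(pos)
    for (i, j), L0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d) + 1e-12
        dirn = d / length
        rel_vel = np.dot(vel[j] - vel[i], dirn)
        f = (k * (length - L0) + c * rel_vel) * dirn   # spring + damper
        force[i] += f
        force[j] -= f
    vel = vel + dt * force / mass       # semi-implicit Euler
    pos = pos + dt * vel
    return pos, vel

# Two nodes joined by one stretched spring relax toward the rest length.
p = np.array([[0.0, 0.0], [1.2, 0.0]])
v = np.zeros_like(p)
for _ in range(500):
    p, v = step_mass_spring(p, v, [(0, 1)], [1.0])
print(np.linalg.norm(p[1] - p[0]))   # approaches 1.0
```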

UC-MIT

Progress years 2 & 3

• Added haptic capability to surgical simulation:

Custom 4-degree-of-freedom laparoscopic interface, based on a commercial 3-DOF device (Sensable Tech Phantom)

• Non-linear, graded finite-element modeling for real-time performance and good accuracy & scalability.

• Tested environment in surgical training course at UCSF

UC-MIT

Current Simulation: Gallbladder Removal

Removal of soft tissue using electrocautery tool

UC-MIT

3DDI: Overview

Project pipeline:

3D capture → Modeling, simulation → Rendering → 3D display

Applications: Tele-surgery, Training, Collaboration

UC-MIT

Programmatic Evaluation I

• Research on components and system integration was successful

• One example of the complete pipeline, from scanning to display, was shown (the bottle)

• Multiple examples of integration of two or more modules: walkthrough, outdoor reflectance modeling, simulation

UC-MIT

Programmatic Evaluation II

• Research on components has not connected well with original applications. The virtual surgery work does not make much use of the technologies developed in the project.

• Propose shift of primary motivating application to urban model capture, visualization and simulation

UC-MIT

Budget Adjustments

• MIT: phase out research on the holographic display; continue autostereoscopic development; increase funding of urban modeling

• UCSF: phase out virtual surgery completely

• UC Berkeley: increase funding for modeling from laser-scanner data

UC-MIT

3DDI: Overview

Project pipeline:

3D capture → Modeling, simulation → Rendering → 3D display

Application: Exterior and interior urban environments

UC-MIT

Scenario: Rapid Capture of, and Training in, Urban Environments

• Acquire high-fidelity geometric and photometric models of real environments

• Provide ability to simulate, visualize and physically interact with this environment

• Enhance photorealism with 3D displays

UC-MIT

An Example Sequence of Interactions

• The images in the following sequence obviously appear synthetic; we want to achieve this functionality while maintaining photorealism.

UC-MIT

Flyby of Model of Real Urban Environment

• Can be modified by adding virtual buildings

UC-MIT

Seamless Exterior to Interior Transition

• Incorporate geometric and photometric detail to increase photorealism and immersion

UC-MIT

Compiled Proximity, Visibility Information

• Increases interactivity and decreases network traffic among multiple users of the model

UC-MIT

Interact with Physical Objects

• Increased ability for natural interaction with a running physical simulation

UC-MIT

Directing Behaviors

• Construct problem-solving contexts for training

UC-MIT

Research and Engineering Aspects

• Instrumentation

• Exterior capture

• Interior capture

• Real-time Interaction

• Directable Dynamics

• 3D Display

• System Integration

• Representation!

UC-MIT

Progress over the last 12 months

• VCSEL array scanner

• Modeling from range and image data: Berkeley

• Modeling from pose cameras: MIT

• Authoring with dynamics

• Real-time Simulation of Physically Realistic Global Deformations

• System integration in Walkthru framework

UC-MIT

Progress on VCSEL array scanner

• Design of a row-addressable VCSEL array to provide a scanning source.

• Fabrication of chip prototypes with bonding to silicon.

• Testing and characterization.

UC-MIT

Progress on Modeling: MIT

• Acquisition sensor improvements

• Faster, more accurate spherical imagery

• Improved sub-pixel edge detection

• Automated rotational alignment

• Improved texture, occlusion estimation

• Off-planar relief estimation (Fua, Leclerc)

• Symbolic window extraction (Wang)

• Framework for indoor/outdoor visibility

UC-MIT

Progress on Authoring with Dynamics

• Combine the visibility structure of a model with a model of object dynamics: objects guarantee where they will not be, so their dynamics can be culled safely (sketched below)

• Objective: frame rate depends on the number of objects in view

• Demonstration: a complex world where frame rate (largely) depends on the number of objects in view
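A rough sketch of the culling idea referenced above (our simplified reading, with hypothetical helper names): each out-of-view object keeps a conservative bound on where it could possibly be, and its dynamics are skipped until that bound might reach the view.

```python
import math

class CulledBody:
    """A dynamic object whose simulation is skipped while a conservative
    bound on its position stays outside the region of interest."""

    def __init__(self, pos, max_speed):
        self.pos = pos              # last fully simulated position (x, y)
        self.max_speed = max_speed  # upper bound on speed: the guarantee
        self.asleep_since = None    # time at which simulation was suspended

    def bound_radius(self, now):
        """How far the object could possibly have moved while culled."""
        if self.asleep_since is None:
            return 0.0
        return self.max_speed * (now - self.asleep_since)

    def maybe_simulate(self, now, view_center, view_radius, step):
        dist = math.hypot(self.pos[0] - view_center[0],
                          self.pos[1] - view_center[1])
        if dist - self.bound_radius(now) > view_radius:
            # Provably not visible yet: cull the dynamics for this frame.
            if self.asleep_since is None:
                self.asleep_since = now
            return False
        # The bound may intersect the view: a real system would now roll
        # the object forward over the skipped interval, then resume.
        self.asleep_since = None
        step(self, now)
        return True
```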

UC-MIT

Progress on Real-time Simulation of Physically Realistic Global Deformations

• Combines the best features of several models: FEM accuracy (theory of elasticity); no distortion (due to the nonlinear strain); a diagonalized mass matrix (similar to a particle system), as sketched below; graded mesh of size O(n^2) (comparable to BEM)
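A sketch of why the diagonalized (lumped) mass matrix matters for real-time use (our illustration, not the project's solver): with M diagonal, solving M a = f is an elementwise division rather than a sparse linear solve per frame.

```python
import numpy as np

def explicit_step(pos, vel, lumped_mass, internal_forces, dt=1e-3):
    """One explicit time step of a deformable model with a lumped mass matrix.

    pos, vel: (N, 3) node positions and velocities.
    lumped_mass: (N,) diagonal of M, so M^{-1} f is an elementwise divide.
    internal_forces(pos, vel): (N, 3) elastic plus damping forces per node.
    """
    acc = internal_forces(pos, vel) / lumped_mass[:, None]
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel
```

With a consistent (non-diagonal) mass matrix, the same update would require a sparse linear solve every frame.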

UC-MIT

System Integration in Walkthru Framework

• Create an object-oriented, extensible database in which various types of models can be stored.

• Develop rendering paradigms by which this DB can be explored by many users simultaneously.

• Create the hooks for the attachment of simulators which may allow “work-on-demand” control.

• Allow all interactions to happen over the Internet between different types of computers and operating systems.

UC-MIT

Walkthru Framework

• Environment to model complex dynamic worlds with user interaction: cell-based visibility culling; pre-loading of scene parts based on expected demand, derived from user motion (sketched below)

• Generic simulation interface; integrated: CFAST (NIST's fire simulator), IMPULSE (rigid-body dynamics)
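A sketch of demand-driven pre-loading (our simplified reading, with hypothetical names): extrapolate the user's motion over a short lookahead window and request any cell the user could plausibly need in that time.

```python
def cells_to_prefetch(user_pos, user_vel, cell_centers, lookahead=2.0,
                      slack=10.0):
    """Return ids of cells the user might need within `lookahead` seconds.

    user_pos, user_vel: (x, y) tuples; cell_centers: dict id -> (x, y).
    A cell is requested if it lies within `slack` metres of the
    extrapolated position.
    """
    px = user_pos[0] + user_vel[0] * lookahead
    py = user_pos[1] + user_vel[1] * lookahead
    wanted = []
    for cell_id, (cx, cy) in cell_centers.items():
        if (cx - px) ** 2 + (cy - py) ** 2 <= slack ** 2:
            wanted.append(cell_id)
    return wanted

# The user is walking east; cells ahead of them are fetched in advance.
centers = {"lobby": (0, 0), "corridor_e": (8, 0), "atrium": (20, 0)}
print(cells_to_prefetch((0, 0), (4, 0), centers))   # lobby and corridor_e
```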

UC-MIT

Progress Summary

• Built a shared, object-oriented database.

• Extensions beyond just geometry (SYLIF).

• Tools for model generation from floorplans.

• Used in joint model developments with MIT.

• Integration of scene data from Malik’s group.
