
Parameterized Environment Maps


Ziyad Hakura, Stanford University

John Snyder, Microsoft Research

Jed Lengyel, Microsoft Research

Static Environment Maps (EMs)

Generated using standard techniques:
•Photograph a physical sphere in an environment
•Render six faces of a cube from object center (see the sketch below)
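As a rough illustration of the cube-face approach, the sketch below enumerates the six standard cube-map camera orientations; render_face is a hypothetical hook into whatever renderer is available, and the 90° field of view and resolution are assumptions, not details taken from the slides.

```python
# Sketch only: capture a static EM by rendering the six cube faces from the
# object's center. render_face() is a hypothetical renderer hook.
import numpy as np

CUBE_FACES = {            # face name -> (look direction, up vector)
    "+x": (np.array([ 1.0, 0.0, 0.0]), np.array([0.0, -1.0,  0.0])),
    "-x": (np.array([-1.0, 0.0, 0.0]), np.array([0.0, -1.0,  0.0])),
    "+y": (np.array([ 0.0, 1.0, 0.0]), np.array([0.0,  0.0,  1.0])),
    "-y": (np.array([ 0.0,-1.0, 0.0]), np.array([0.0,  0.0, -1.0])),
    "+z": (np.array([ 0.0, 0.0, 1.0]), np.array([0.0, -1.0,  0.0])),
    "-z": (np.array([ 0.0, 0.0,-1.0]), np.array([0.0, -1.0,  0.0])),
}

def capture_static_em(center, render_face, size=256):
    """Render the six cube faces from the object's center with a 90-degree FOV."""
    return {name: render_face(eye=center, look=look, up=up,
                              fov_degrees=90.0, resolution=size)
            for name, (look, up) in CUBE_FACES.items()}
```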

Ray-Traced vs. Static EM

Self-reflections are missing

Parameterized Environment Maps (PEM)

[Figure: environment maps EM1 … EM8 at the sampled viewpoints]

3-Step Process

1) Preprocess: Ray-trace images at each viewpoint
2) Preprocess: Infer environment maps (EMs)
3) Run-time: Blend between 2 nearest EMs (sketched below)

[Figure: environment maps EM1 … EM8 inferred at the sampled viewpoints]
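A minimal sketch of the pipeline, assuming hypothetical helpers ray_trace_view (the offline ray tracer), infer_em (the inference described later), and lookup_em (the per-pixel EM fetch), and assuming the view parameter directly indexes the sampled viewpoints:

```python
# Sketch of the 3-step PEM pipeline with hypothetical helpers.

def preprocess(viewpoints, scene, ray_trace_view, infer_em):
    """Steps 1-2: ray-trace each sampled viewpoint, then infer one EM per view."""
    ems = []
    for view in viewpoints:
        image = ray_trace_view(scene, view)   # step 1: gold-standard image
        ems.append(infer_em(image, view))     # step 2: EM that best reproduces it
    return ems

def render_runtime(view_param, ems, pixel_dirs, lookup_em):
    """Step 3: blend between the two EMs nearest to the current view parameter."""
    i = max(0, min(len(ems) - 2, int(view_param)))   # left neighbor index
    w = view_param - i                               # blend weight toward neighbor i+1
    return (1.0 - w) * lookup_em(ems[i], pixel_dirs) + w * lookup_em(ems[i + 1], pixel_dirs)
```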

Environment Map Geometry

[Figure: the reflection ray from the eye, about surface normal N, is intersected with the EM geometry and taken through the EM texture mapping to texel coordinates (u,v) in the EM texture]
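For the simplest EM geometry (a sphere at infinity), the lookup reduces to reflecting the view ray about the surface normal and indexing the map by the reflected direction; a minimal sketch, with finite EM geometries handled later:

```python
import numpy as np

def reflect(view_dir, normal):
    """Mirror the (unit) view direction about the (unit) surface normal."""
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def em_lookup_direction(eye, point, normal):
    """Reflection direction used to index the EM; with a sphere-at-infinity
    geometry this direction alone determines (u,v) under the chosen mapping."""
    view_dir = point - eye
    view_dir = view_dir / np.linalg.norm(view_dir)
    r = reflect(view_dir, normal)
    return r / np.linalg.norm(r)
```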

Why Parameterized Environment Maps?

•Captures view-dependent shading in environment
•Accounts for geometric error due to approximation of environment with simple geometry

How to Parameterize the Space?

•Experimental setup
  •1D view space
  •1° separation between views
  •100 sampled viewpoints (see the sketch below)
•In general, author specifies parameters
  •Space can be 1D, 2D or more
  •Viewpoint, light changes, object motions
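For the 1D experimental setup above, a small sketch of how a continuous view angle picks its two bracketing viewpoint samples and a blend weight; clamping at the ends of the range is an assumption:

```python
def nearest_view_samples(view_angle_deg, num_views=100, separation_deg=1.0):
    """Return the two bracketing sample indices and the blend weight toward the
    second one, for a 1D view space sampled every `separation_deg` degrees."""
    t = view_angle_deg / separation_deg
    i = max(0, min(num_views - 2, int(t)))     # clamp so i + 1 is a valid sample
    w = min(max(t - i, 0.0), 1.0)
    return i, i + 1, w
```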

Ray-Traced vs. PEM

Closely match local reflections like self-reflections

Movement Away from Viewpoint Samples

[Figure: side-by-side comparison, Ray-Traced vs. PEM]

Previous Work

•Reflections on Planar Surfaces [Diefenbach96]
•Reflections on Curved Surfaces [Ofek98]
•Image-Based Rendering Methods
  •Light Field, Lumigraph, Surface Light Field, LDIs
•Decoupling of Geometry and Illumination
  •Cabral99, Heidrich99
•Parameterized Texture Maps [Hakura00]

Surface Light Fields [Miller98, Wood00]

Surface Light Field: dense sampling over surface points of low-resolution lumispheres

PEM: sparse sampling over viewpoints of high-resolution EMs

Parameterized Texture Maps [Hakura00]

[Figure: textures at samples p1 and p2 with texture coordinates (U,V), parameterized over light and view]

Captures realistic pre-rendered shading effects

Comparison with Parameterized Texture Maps

•Parameterized Texture Maps [Hakura00]
  •Static texture coordinates
  •Pasted-on look away from sampled views
•Parameterized Environment Maps
  •Bounce rays off the reflector and intersect simple geometry
  •Layered maps for local and distant environment
  •Better quality away from sampled views

EM Representations

•EM Geometry
  •How reflected environment is approximated
  •Examples: sphere at infinity; finite cubes, spheres, and ellipsoids
•EM Mapping
  •How geometry is represented in a 2D map
  •Examples: gazing ball (OpenGL) mapping; cubic mapping (see the sketch below)
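Both example mappings can be written down directly; the sketch below uses the standard OpenGL gazing-ball formula and the usual cube-face selection (the face-internal (u,v) orientation follows one common convention and may differ from a particular API's):

```python
import numpy as np

def sphere_map_uv(r):
    """OpenGL 'gazing ball' mapping: eye-space reflection vector r -> (u, v)."""
    rx, ry, rz = r
    m = 2.0 * np.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

def cube_map_face_uv(r):
    """Cubic mapping: choose the dominant axis, then project onto that face
    (face-internal orientation follows the OpenGL cube-map convention)."""
    rx, ry, rz = r
    ax, ay, az = abs(rx), abs(ry), abs(rz)
    if ax >= ay and ax >= az:                       # +/- X face
        face = "+x" if rx > 0 else "-x"
        u, v = (-rz if rx > 0 else rz) / ax, -ry / ax
    elif ay >= az:                                  # +/- Y face
        face = "+y" if ry > 0 else "-y"
        u, v = rx / ay, (rz if ry > 0 else -rz) / ay
    else:                                           # +/- Z face
        face = "+z" if rz > 0 else "-z"
        u, v = (rx if rz > 0 else -rx) / az, -ry / az
    return face, 0.5 * (u + 1.0), 0.5 * (v + 1.0)
```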

Layered EMs

[Figure: reflector surrounded by a local EM shell and a distant EM shell]

Segment environment into local and distant maps
•Allows different EM geometries in each layer
•Supports parallax between layers (see the sketch below)
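One way to see where the parallax comes from: with a finite shell as the local EM geometry, the reflection ray is intersected with the shell and the map is indexed by the hit point rather than by the raw reflection direction. The sketch below uses a finite sphere; its center and radius are assumptions for illustration.

```python
import numpy as np

def finite_sphere_em_direction(point, refl_dir, center, radius):
    """Intersect the reflection ray (unit refl_dir) with a finite sphere used as
    the local EM geometry and return the direction from the sphere center to the
    hit point; indexing the EM with this direction, instead of refl_dir itself,
    makes the local layer exhibit parallax as the surface point moves."""
    oc = point - center
    b = np.dot(refl_dir, oc)
    disc = b * b - (np.dot(oc, oc) - radius * radius)
    if disc < 0.0:
        return refl_dir                    # no intersection: fall back to infinity
    t = -b + np.sqrt(disc)                 # far root: where the ray exits the shell
    hit = point + t * refl_dir
    d = hit - center
    return d / np.linalg.norm(d)
```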

Segmented, Ray-Traced Images

[Figure: ray-traced image segmented into Distant, Local Color, Local Alpha, and Fresnel components]

EMs are inferred for each layer separately

Distant Layer

[Figure: reflection ray R from the eye off the reflector (surface normal N) toward the distant EM; three cases are shown]

•Ray directly reaches the distant environment
•Ray bounces additional times off the reflector before reaching the distant environment
•Ray is propagated through the reflector

Local Layer

[Figure: Local Color and Local Alpha images]

Fresnel Layer

Fresnel modulation is generated at run-time (see the sketch below)
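The slides only state that F is generated at run time per vertex and fetched through a 1D map; the sketch below uses Schlick's approximation as a representative stand-in, with f0 (reflectance at normal incidence) as an assumed material parameter.

```python
import numpy as np

def schlick_fresnel(cos_theta, f0=0.05):
    """Schlick's approximation to Fresnel reflectance as a stand-in for the
    per-vertex modulation F (the actual polynomial is not given in the slides)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def build_fresnel_1d_map(size=256, f0=0.05):
    """Tabulate F over cos(theta) so it can be fetched from a 1D texture at
    run time (the '1D Fresnel map' mentioned on the Run-Time slide)."""
    cos_theta = np.linspace(0.0, 1.0, size)
    return schlick_fresnel(cos_theta, f0)
```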

EM Inference

A x = b, where A holds the hardware filter coefficients, x is the vector of unknown EM texels, and b is the ray-traced image.

[Figure: the EM texture is rendered through the graphics hardware to the screen, which is matched against the ray-traced image]
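A minimal sketch of the solve, assuming the filter matrix A has already been measured (see the texel impulse response slides near the end); it simply uses NumPy's least-squares solver, whereas the real system is applied per layer and per viewpoint:

```python
import numpy as np

def infer_em_texels(A, b, clamp=True):
    """Least-squares solve of A x = b for the unknown EM texels x, where each
    row of A holds the hardware filter coefficients mapping EM texels to one
    ray-traced screen pixel in b."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    if clamp:
        x = np.clip(x, 0.0, 1.0)           # keep texels in displayable range
    return x
```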

Inferred EMs per Viewpoint

[Figure: inferred Distant, Local Color, and Local Alpha EMs for a single viewpoint]

Run-Time

•“Over” blending mode to composite local/distant layers
•Fresnel modulation, F, generated on-the-fly per vertex
•Blend between neighboring viewpoint EMs
•Teapot object requires 5 texture map accesses:
  2 EMs (local/distant layers) at each of 2 viewpoints (for smooth interpolation),
  and 1 1D Fresnel map (for better polynomial interpolation)

rgb = F · ( L_rgb · L_α + (1 − L_α) · D_rgb )
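In array form, and assuming the reading of the formula above (F modulating the local-over-distant composite), a sketch of the per-pixel run-time combination:

```python
def composite_pem(L_rgb, L_alpha, D_rgb, F):
    """'Over' composite of the local layer on the distant layer, modulated by
    the Fresnel term F; inputs are per-pixel arrays with broadcast-compatible
    shapes (e.g. alpha and F carrying a trailing singleton channel axis)."""
    return F * (L_alpha * L_rgb + (1.0 - L_alpha) * D_rgb)

def blend_viewpoints(color_a, color_b, w):
    """Linear blend of the composited colors from the two nearest viewpoint EMs."""
    return (1.0 - w) * color_a + w * color_b
```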

Video Results

•Experimental setup
  •1D view space
  •1° separation between views
  •100 sampled viewpoints

Layered PEM vs. Infinite Sphere PEM

Real-time Demo

Summary

•Parameterized Environment Maps
  •Layered
  •Parameterized by viewpoint
  •Inferred to match ray-traced imagery
•Accounts for environment’s
  •Geometry
  •View-dependent shading
•Mirror-like, local reflections
•Hardware-accelerated display

Future Work

•Placement/partitioning of multiple environment shells
•Automatic selection of EM geometry
•Incomplete imaging of environment “off the manifold”
•Refractive objects
•Glossy surfaces

Questions

Timing Results

                    On the Manifold    Off the Manifold
#geometry passes    2                  3
texgen time         35ms               35ms
frame time          45ms               57ms
FPS                 22                 17.5

Texel Impulse Response

To measure the hardware impulse response, render with a single texel set to 1.

[Figure: a texture is rendered through the graphics hardware to the screen]

Single Texel Response

[Equation: the screen image produced by rendering a unit impulse at a single texel gives that texel's hardware filter coefficients, i.e. one column of A]

Model for Single Texel

A x = b, with one column of A per texel and one row per screen pixel
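A sketch of how the impulse-response measurement assembles A one column at a time; render_with_texture is a hypothetical hook that pushes a texture through the hardware path and returns the screen image. A brute-force loop like this is only meant to show the structure of A, not a practical procedure at real EM resolutions.

```python
import numpy as np

def measure_filter_matrix(texture_shape, render_with_texture):
    """Set a single texel to 1, render through the hardware path, and record the
    flattened screen image as that texel's column of A."""
    num_texels = int(np.prod(texture_shape))
    columns = []
    for t in range(num_texels):
        tex = np.zeros(texture_shape)
        tex.flat[t] = 1.0                          # unit impulse at texel t
        columns.append(np.asarray(render_with_texture(tex)).ravel())
    return np.stack(columns, axis=1)               # one row per screen pixel, one column per texel
```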

Model for MIPMAPs

[Equation: the unknowns x stack the texels of every MIPMAP level l, and A stacks the corresponding filter coefficients for each level, indexed by texel (u,v) within level l and screen pixel (m,n), so the same system A x = b covers all levels]

Conclusion

PEMs provide:
•faithful approximation to ray-traced images at pre-rendered viewpoint samples
•plausible movement away from those samples using real-time graphics hardware

PEM vs. Static EM
