SIGGRAPH 2014 Course on Computational Cameras and Displays (Part 4)

Computational Light Transport Part 3:


DESCRIPTION

Recent advances in both computational photography and displays have given rise to a new generation of computational devices. Computational cameras and displays provide a visual experience that goes beyond the capabilities of traditional systems by adding computational power to optics, lights, and sensors. These devices are breaking new ground in the consumer market, including lightfield cameras that redefine our understanding of pictures (Lytro), displays for visualizing 3D/4D content without special eyewear (Nintendo 3DS), motion-sensing devices that use light coded in space or time to detect motion and position (Kinect, Leap Motion), and a movement toward ubiquitous computing with wearable cameras and displays (Google Glass). This short (1.5 hour) course serves as an introduction to the key ideas and an overview of the latest work in computational cameras, displays, and light transport.

TRANSCRIPT

• 1. Computational Light Transport Part 3:
• 2. three domains of computation: computational cameras, computational displays, and the physical world, tied together by computational light transport
• 3. capturing light transport: image-based rendering & relighting [Debevec et al., SIG 2000; Ng et al., 2005]
• 4-6. manipulating light transport: revealing the invisible
• 7. analyzing light transport: inferring scene properties
• 8. computational light transport: using controllable light sources & cameras to sample, acquire, or analyze a scene's transport function from incoming to outgoing light
• 9. the transport function covers surface scattering, volume scattering & absorption, mutual illumination, shadows, caustics, refraction, mirror reflection, and defocus blur
• 10. outline: i. the light transport matrix; ii. example transport matrices; iii. transport matrix calculations; iv. optical computing
• 11. i. the light transport matrix
• 12. the superposition principle: a photo taken under two light sources equals the sum of the photos taken under each source individually
• 13. unknown 3D scene: irradiance measurements vs. independent illumination degrees of freedom
• 14-18. the light transport matrix: rows index irradiance measurements (e.g. pixels, possibly across cameras), columns index independent illumination degrees of freedom (one point source, one projected pixel, projector pixels across projectors) [Sloan et al. 02, Ng et al. 03, Seitz et al. 05, Sen et al. 05]
• 19. the transport matrix represents the set of photos under all possible (controllable) lighting conditions (sketched in code below)
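The superposition principle and the transport matrix of slides 11-19 say that a photo is a linear function of the illumination vector: each column of T is the photo produced by one illumination degree of freedom. A minimal numpy sketch of that relationship; the random matrix and the variable names are illustrative stand-ins, not anything measured in the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy transport matrix: rows = camera pixels (irradiance measurements),
# columns = illumination degrees of freedom (e.g. projector pixels).
# A real T is measured from a scene, not drawn at random.
num_pixels, num_lights = 64, 16
T = rng.random((num_pixels, num_lights))

# Superposition: a photo under lights 1 & 2 equals the sum of the photos
# taken under each light individually.
light1 = np.zeros(num_lights); light1[0] = 1.0
light2 = np.zeros(num_lights); light2[1] = 1.0
assert np.allclose(T @ (light1 + light2), T @ light1 + T @ light2)

# Column sampling by impulse imaging: turning on a single projector pixel
# photographs one column of T.
assert np.allclose(T @ light1, T[:, 0])
```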
• 20. ii. example transport matrices
• 21-22. capturing the transport matrix: the light transport matrix shown alongside the camera view of the scene
• 23. ii.a. direct transport
• 24. convex scene, diffuse reflectance, projector (1 epipolar line): with 1 projector pixel turned on, the matrix (projector pixel # vs. camera pixel #, plus an ambient-illumination column) is sparse and high rank; acquired by structured-light 3D scanning (stereo disparity map)
• 25. convex scene, diffuse reflectance, point sources (1 point source turned on, for 1 image row): point source # vs. camera pixel #; analyzed by photometric stereo, with no shadows [Shashua, PhD 92] or attached shadows [Basri & Jacobs, PAMI 01]
• 26. convex scene, specular reflectance, point sources (for 1 image row): specular reflectance can make the matrix full rank [Ramamoorthi & Hanrahan, SIG 01]; analyzed by shape-from-specularities [Sanderson et al., PAMI 89]
• 27. direct transport matrices: the angular vs. spatial frequency content of the illumination and the angular frequency content of the reflectance determine whether the matrix is dense and low rank or sparse and high rank
• 28. ii.b. general transport
• 29. convex scene, diffuse reflectance, projector (1 epipolar line): projector pixel # vs. camera pixel #, plus ambient illumination
• 30-31. convex scene, translucency, projector (1 epipolar line): with 1 projector pixel turned on, the matrix is less sparse and usually high rank; recovered by structured-light 3D scanning
• 32. general scene, projector and camera: not symmetric
• 33. general scene, coaxial projector & camera (beam splitter): always symmetric
• 34. iii. transport matrix calculations
• 35. what operations can we perform on T? column sampling, matrix-vector product, row-space transformation, reconstruction, max-finding along rows, multiplication with the transpose, inversion
• 36. approximating T: Hadamard multiplexing [Schechner et al. 03] (no scene priors; optimal SNR for read-noise-limited imaging; affected by saturation & Poisson noise; brute-force method), Kernel Nyström [Wang et al., SIG 09] (low-rank matrix prior; no optimality guarantees; uses impulse imaging), and compressive sensing [Peers et al. 09, Sen & Darabi 09] (assumes compressibility, e.g. sparse high-rank settings; computationally very intensive for large datasets); a toy multiplexing simulation appears in code after slide 44
• 37. matrix-vector product: relighting the scene under a new illumination pattern
• 38. max-finding along rows (goal: for each row j, find the column containing the largest element): laser-stripe 3D scanning does an exhaustive search along the epipolar line by impulse imaging; this is structured-light 3D scanning
• 39. pattern-based methods (Gray codes, min-SW codes, De Bruijn, Fourier, ...): localization is guaranteed only for rows that are impulse functions [Gupta & Nayar, CVPR 2012]
• 40. recent focus: assume the rows' indirect component has no high frequencies and use a high-frequency pattern ensemble, e.g. micro phase shifting [Gupta & Nayar, CVPR 12] and XOR codes [Gupta et al., IJCV 12]
• 41-43. multiplication with T's transpose: by reciprocity, the projector emitting while the camera receives p is equivalent to the camera emitting p while the projector receives; this synthesizes a photo of the scene from the projector's viewpoint under the camera's illumination [Sen et al., SIG 2005]
• 44. inversion of T (goal: given a photo p, find the illumination that produces it): demonstrated with a camera, a defocused projector, and a projection surface [Zhang & Nayar, SIG 2006]
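Slide 36's Hadamard multiplexing can be simulated in a few lines: illuminate with the columns of a Hadamard matrix, record one photo per pattern, and demultiplex to recover T. This is a toy, noise-free sketch; it ignores the fact that a real projector cannot emit negative light (practical systems use shifted 0/1 S-matrix patterns), and every name in it is illustrative:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(1)
n = 16                      # illumination degrees of freedom (power of two)
T = rng.random((64, n))     # stand-in for the unknown transport matrix

# Illuminate with Hadamard patterns; each captured photo is one column.
H = hadamard(n).astype(float)
photos = T @ H

# Demultiplex: H is orthogonal up to a factor of n, so H^-1 = H.T / n.
T_est = photos @ H.T / n
assert np.allclose(T_est, T)
```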
• 45. iv. optical computing
• 46-49. computing with light: numerical algorithms implemented directly in optics; the numerical objects (transport matrix, illumination vector, photo) map to the optical domain as 1. illuminate with the pattern, 2. capture the photo
• 50. project & capture (projector, camera, beam splitter): find an illumination pattern that, when projected onto the scene, returns the same photo (multiplied by a scalar)
• 51. computing transport eigenvectors: such a pattern is an eigenvector of the square matrix T; numerical goal: find l such that T l = λ l and λ is maximal
• 52. optical power iteration (goal: find the principal eigenvector of T; observation: it is a fixed point of the sequence x ← T x / ||T x||); properties in the numerical domain: linear convergence [Trefethen and Bau 1997], the eigenvalues must be distinct, and the initialization cannot be orthogonal to the principal eigenvector
• 53-61. optical power iteration in the optical domain: initialize, then repeatedly project, capture, and normalize (projector, camera, beam splitter); the result is an (approximate) principal eigenvector, simulated numerically in the sketch below
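Slides 52-61 run power iteration with the matrix-vector product carried out optically (project a pattern, capture the photo, normalize). The numerical core of that loop is simulated here with a random symmetric matrix standing in for the transport matrix of a coaxial projector-camera rig; names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 32

# Symmetric stand-in for T (the slides note that a coaxial projector-camera
# arrangement always yields a symmetric transport matrix).
A = rng.random((n, n))
T = A @ A.T

# Power iteration: each step is "project x, capture T @ x, normalize".
x = rng.random(n)
for _ in range(100):
    y = T @ x          # in the optical version this product is a photo
    x = y / np.linalg.norm(y)

# Compare against a direct eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(T)
principal = eigvecs[:, -1]
assert abs(x @ principal) > 0.999   # same direction up to sign
```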
• 62. inverting light transport (numerical goal: given a photo p, find the illumination l that minimizes ||T l - p||); remarks: T may be low rank or high rank, it is unknown and never acquired, and the illumination sequence is specific to the input photo (projector, camera, beam splitter)
• 63-68. inverting a low-rank nonsymmetric T: the setup adds a flashlight, a diffuser, and the scene; given the input photo, which illumination produced it?
• 69. optical GMRES for inverting a low-rank T
• 70. conventional photography: its degrees of freedom
• 71-73. the transport matrix probing equation: the captured photo is determined by a probing matrix combined entrywise with T, so the degrees of freedom are those of a full matrix rather than a single illumination vector
• 74-75. why is probing possible?
• 76-86. photography by primal-dual coding (animation build): step 1, open the shutter; step 2, illuminate the scene with a vector (the primal code); step 3, attenuate the image with a vector (the dual code); step 4, repeat; step 5, close the shutter; a numerical sketch of the probing equation follows the transcript
• 87-88. photography by primal-dual coding: visit our E-Tech booth on Visualizing Light Transport
• 89. three domains of computation: computational cameras, computational displays, physical world, computational light transport; see more this week at SIGGRAPH: Displays (Tues. 10:45-12:15, East Building, Hall A), Comp. Sensing and Displays (Tues. 3:45-5:15, East Building, Hall A), Emerging Technologies (West Building, Hall A)
• 90. Matthew O'Toole, University of Toronto, www.dgp.toronto.edu/~motoole; Gordon Wetzstein, MIT Media Lab / Stanford University, http://web.media.mit.edu/~gordonw; Computational Cameras and Displays: www.dgp.toronto.edu/~motoole/computationalcamerasanddisplays.html
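Slides 70-86 describe probing the transport matrix by primal-dual coding: while the shutter is open, each primal illumination code is paired with a dual code that masks the sensor, and the sensor accumulates the sum of the masked images. The sketch below assumes the probing equation takes its usual form, with the photo equal to the entrywise product of a probing matrix and T applied to all-ones illumination; the array names and sizes are illustrative, not from the course code:

```python
import numpy as np

rng = np.random.default_rng(3)
n_cam, n_proj, K = 48, 32, 10    # camera pixels, projector pixels, code pairs

T = rng.random((n_cam, n_proj))  # stand-in for the scene's transport matrix

# One primal (illumination) and one dual (sensor mask) code per step.
primal = rng.random((K, n_proj))
dual = rng.random((K, n_cam))

# Primal-dual coded exposure: illuminate with a primal code, attenuate the
# incoming image with the matching dual code, and accumulate on the sensor.
photo = sum(dual[k] * (T @ primal[k]) for k in range(K))

# Probing equation: the same photo equals (Pi * T) @ 1, where * is the
# entrywise product and Pi is the probing matrix built from the code pairs.
Pi = sum(np.outer(dual[k], primal[k]) for k in range(K))
assert np.allclose(photo, (Pi * T) @ np.ones(n_proj))
```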