Motion Capture History, Technologies and Applications
ADVANCED COMPUTING CENTER FOR THE ARTS AND DESIGN
OHIO STATE UNIVERSITY
VITA BEREZINA-BLACKBURN, ANIMATION AND MOCAP SPECIALIST
©2003-2016, THE OHIO STATE UNIVERSITY
Motion Capture
• motion capture (mocap) is sampling and recording the motion of humans, animals and inanimate objects as 3D data for analysis, playback and remapping
• performance capture is acting with motion capture in film and games
• motion tracking is real-time processing of motion capture data
History of Motion Capture
• Eadweard Muybridge (1830-1904)
• Etienne-Jules Marey (1830-1904)
• Nikolai Bernstein (1896-1966)
• Harold Edgerton (1903-1990)
• Gunnar Johansson (1911- 1998)
Eadweard Muybridge
• photographer
• the flying horse, UK-USA, 1872
• 20,000 photos of animal and human locomotion
© Kingston Museum
Etienne-Jules Marey
• physiologist, researcher of movement and human anatomy
• first person to analyze human and animal motion with film
• introduced black suit and reflective markers to track movement of joints
• France, 1880s
Etienne-Jules Marey
• created the chronophotographic gun
• fixed plate camera
Modern Art
• Futurism (Boccioni, Balla and others)
• Marcel Duchamp
Nikolai Bernstein
• General Biomechanics – 1924, Central Institute of Labor, Moscow
• physiology of sport and labor activities, foundations of ergonomics
• cyclography
• concepts of degrees of freedom and hierarchical structure of motion control
• https://www.youtube.com/watch?v=sKWofhlZcmA
Harold Edgerton
• electronic stroboscope and flash
• exposures of 1/1,000 to 1/1,000,000 sec
• MIT, 1930s-1960s
© Palm Press Inc.
GUNNAR JOHANSSON
• visual perception of biological motion, experimental psychology, 1970s, University of Uppsala, Sweden
• body motion can be recognized from patterns showing only moving dots at the joint positions of moving humans
• retro-reflective patches on joints
• video recording instead of film; a searchlight mounted close to the camera lens reflects off the patches back into the lens
• computer modeling of motion variations
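Johansson's stimulus is easy to sketch: reduce the body to dots at the joint positions and project them to 2D, which is what the retro-reflective patches and video camera effectively did. A minimal illustration (the joint names and coordinates are invented for the example, not from the lecture):

```python
# Sketch of a Johansson-style point-light display: the body is reduced
# to dots at the joint positions; the 2D motion of those dots alone is
# enough for viewers to recognize biological motion.

def project(points_3d, focal=1.0):
    """Perspective-project 3D joint positions (x, y, z) to 2D dots."""
    return [(focal * x / z, focal * y / z) for x, y, z in points_3d]

# One frame of a hypothetical walker: joints in camera space (z = depth).
frame = {
    "head":     (0.0,  1.7, 4.0),
    "shoulder": (0.2,  1.4, 4.0),
    "hip":      (0.1,  1.0, 4.0),
    "knee":     (0.15, 0.5, 4.1),
    "ankle":    (0.1,  0.05, 4.2),
}
dots = project(frame.values())  # the only thing the viewer sees
```

Animating `dots` over successive frames reproduces the experiment: observers see a walking figure from the moving dots alone.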
Rotoscoping aka Manual Motion Remapping
© Walt Disney
• allowed animators to trace cartoon characters over photographed frames of live performances
• invented in 1915 by Max Fleischer
• Koko the Clown
• Snow White
• Pinocchio
1980s Computer Graphics
• military
• medical research
• first production use
o Brilliance by Robert Abel, brute-force animation technique (1985 Super Bowl ad)
o Waldo C. Graphic (1988), PDI for Jim Henson tour
o Mike the Talking Head (SIGGRAPH 88)
o Don't Touch Me (1989)
Types of Mocap Technologies
ACTIVE
• electromechanical
• optical fiber
• optical: strobing LEDs (visible and infrared, marker-based)
• acoustic
• inertial
• optical markerless based on structured light
• optical markerless based on video
• computer vision markers
PASSIVE
• optical: retroreflective markers
• acoustic/echolocation
• infrared, markerless
Marker-based optical motion capture systems
• lightweight, variable-size, retro-reflective markers
• requires at least two cameras, or a model assumption if only one camera
• VGA to 16-megapixel cameras with strobing LEDs digitize different views of the performance
• up to 5000 fps
• under 1 mm accuracy
• marker occlusion
• capture volume limits
VICON, NATURAL POINT, MOTION ANALYSIS, QUALISYS
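The "at least two cameras" requirement exists because a single 2D view cannot recover depth: each camera only constrains the marker to a ray, and the 3D position is where the rays from two views (nearly) meet. A minimal sketch of midpoint triangulation, assuming calibrated cameras; the function name and toy geometry are illustrative, not any vendor's API:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Closest-point ('midpoint') triangulation of one marker from two rays.

    c1, c2: camera centers; d1, d2: unit ray directions toward the marker
    (obtained from each camera's calibrated view). Solves for the ray
    parameters t1, t2 minimizing |(c1 + t1*d1) - (c2 + t2*d2)|.
    """
    c1, d1 = np.asarray(c1, float), np.asarray(d1, float)
    c2, d2 = np.asarray(c2, float), np.asarray(d2, float)
    # Normal equations from setting both partial derivatives to zero.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    # With noisy data the rays don't intersect; return the midpoint.
    return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))
```

Production systems solve this jointly over many cameras per marker, which is also what makes them robust to occlusion of any single view.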
© Capture Lab. 18,000 sq ft capture volume with 132 cameras at The Capture Lab (EA)
Strobing LED marker system
• red or infrared LEDs
• unique strobing frequency for each marker
• no marker swapping: faster data processing
• limited volume
• marker occlusion
• limited capture time due to LED battery life
• wires running up and down the capture subject
PHASESPACE
Marker based optical motion capture systems
• computer vision markers
• unlimited tracking volume
• location agnostic
• unique marker IDs
• proprietary at ILM and Manhattan Mocap
Electromechanical Capture
• linked structures
• potentiometers determine the degree of rotation for each link
• no occlusion
• no magnetic or electrical interference
• unlimited capture volume
• low cost
• no global translation
• restricted movement
• fixed configuration of sensors
• low sampling rate
• inaccurate joints
GYPSY MOCAP SYSTEM
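The potentiometer-per-link idea can be sketched briefly: each pot reading maps to one hinge angle, and chaining the links (forward kinematics) gives the pose. All constants here (ADC range, pot travel, link lengths) are invented for illustration, not Gypsy specifications:

```python
import math

def pot_to_angle(adc_counts, adc_max=1023, range_deg=300.0, offset_deg=-150.0):
    """Map a raw potentiometer ADC reading to a joint angle in degrees.

    Assumes a 300-degree-travel pot read by a 10-bit ADC, offset so the
    calibration pose reads near 0 degrees (values are illustrative).
    """
    return offset_deg + range_deg * adc_counts / adc_max

def fk_2link(theta1_deg, theta2_deg, l1=0.3, l2=0.25):
    """Planar forward kinematics of a 2-link chain (e.g. upper arm + forearm).

    Each potentiometer supplies one hinge angle; positions follow by
    chaining the links from the root.
    """
    t1 = math.radians(theta1_deg)
    t12 = t1 + math.radians(theta2_deg)
    elbow = (l1 * math.cos(t1), l1 * math.sin(t1))
    wrist = (elbow[0] + l2 * math.cos(t12), elbow[1] + l2 * math.sin(t12))
    return elbow, wrist
```

Note the chain is rooted at the suit itself, which is exactly why such systems capture pose but "no global translation" of the performer through the room.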
Inertial systems
• inertial trackers placed on joints
• measure orientation and position with accelerometers, gyroscopes and magnetometers on each segment
• unlimited capture volume
• UWB RF for position tracking
• no occlusion, multiple subjects
• low cost
• positional drift
• translational data needs to be collected separately
• battery packs and wires on the performer's body
XSENS MOCAP SYSTEM, PERCEPTION NEURON
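The drift problem above comes from integrating noisy gyroscope rates over time. A standard mitigation (shown here for one tilt angle; the exact filtering in commercial suits is proprietary, so this is a generic sketch) is a complementary filter that blends the smooth-but-drifting gyro estimate with the noisy-but-drift-free gravity direction from the accelerometer:

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """One step of a complementary filter for a single tilt angle (pitch).

    gyro_rate: angular rate in rad/s; accel: (ax, ay, az) in m/s^2;
    alpha near 1 trusts the gyro short-term while the accelerometer
    slowly corrects accumulated drift. Values are illustrative.
    """
    ax, ay, az = accel
    # Drift-free (but noisy) pitch from the measured gravity direction.
    pitch_accel = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Smooth (but drifting) pitch from integrating the gyro rate.
    pitch_gyro = pitch_prev + gyro_rate * dt
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel
```

Orientation can be stabilized this way, but position still requires double-integrating acceleration, which drifts quickly; hence the slide's note that translation is tracked separately (e.g. via UWB RF).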
Electromagnetic systems
• electromagnetic sensors placed on joints or other critical points
• measures orientation and position of sensor relative to electromagnetic field generated by the transmitter
• no sight-line requirements
• no occlusion, multiple subjects
• electromagnetic interference; small volume if body translation tracking is needed
ASCENSION-TECH, NORTHERN DIGITAL, POLHEMUS
Optical fiber system
• fiber-optic sensors
• bend and twist sensors measure transmitted light
• no occlusion
• flexible capture volume
• adjustment to individual proportions is limited
• less accurate data
CYBERGLOVE, DATAGLOVE
Acoustic/ultrasonic systems
• set of transducers/transceivers generates and evaluates high-frequency sound waves
• other sounds in frequency range can disrupt capture
• accuracy not as high as other systems
INTERSENSE, nexonar
Markerless Motion Capture
Max Planck Institute for Intelligent and Perceiving Systems research (3D scanner + silhouette analysis from video)
• realtime 3D model reconstruction and background extraction
• realtime remapping
Markerless Motion Capture: Full Body
ORGANIC MOTION, CAPTURY, KINECT
Markerless Motion Capture: Hands
Leap Sensor
Markerless Motion Capture: Face
FACS/Paul Ekman
Original R&D: Digital Emily Project, Faceware
Medusa (Disney Zurich)
RGB-D based: Faceshift
Typical Motion Capture Pipeline
• planning (performers and actions, props, space requirements)
• recording point data (Vicon Blade)
• data processing, realtime or post; standard skeletal solving (Vicon Blade, MotionBuilder, IKinema)
• custom skeleton creation inside the target 3D model (3D animation software)
• remapping standard skeletal motion to customized characters (MotionBuilder or IKinema)
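The remapping step has a well-known first-order rule worth making concrete: joint rotations usually transfer directly, but root translation must be rescaled for a character with different proportions, or the feet slide. A minimal sketch (the function and the leg-length heuristic are a simplification for illustration, not MotionBuilder's or IKinema's actual retargeting):

```python
def retarget_root_translation(src_positions, src_leg_len, tgt_leg_len):
    """Scale captured root (hip) translations for a different-sized character.

    Joint rotations copy across unchanged, but the root path is scaled
    by the ratio of leg lengths so stride length matches the new body.
    Real solvers add per-frame foot-contact (IK) fixes on top of this.
    """
    s = tgt_leg_len / src_leg_len
    return [(s * x, s * y, s * z) for x, y, z in src_positions]
```

This is also why the planning slides below call out "gross proportional differences for retargeting": the larger the ratio, the more the simple scaling breaks down and the more contact cleanup is needed.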
Planning a Capture Session
• shot list
• performance space dimensions
• interactions in shot
• shots to be blended or looped
• length of shots
• size and location of props
• gross proportional differences for retargeting
• camera motion
More Planning
• Character/Prop setup
o target skeleton/character topology
o ready stance considerations (T-pose)
• Marker setup (optical marker-based systems)
o marker redundancy
o three markers per segment
o place markers close to bone
o asymmetry
o recognizable configuration
o anticipation of occlusion
• output format
• file naming conventions
• frame rate
• target software platform
• database management
• potential technical issues, e.g. charged batteries
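The "three markers per segment" guideline has a geometric reason: three non-collinear points fully determine a rigid segment's position and orientation. A minimal sketch of recovering a segment frame from its markers (the axis convention is an arbitrary choice for illustration):

```python
import numpy as np

def segment_frame(m0, m1, m2):
    """Build a rigid segment's pose from three non-collinear markers.

    Returns a 3x3 rotation matrix (columns are the segment's axes) and
    the origin marker. One or two markers leave orientation ambiguous,
    which is why marker setups want three per segment, plus redundancy
    against occlusion.
    """
    m0, m1, m2 = (np.asarray(m, float) for m in (m0, m1, m2))
    x = m1 - m0
    x /= np.linalg.norm(x)            # first axis along a marker pair
    z = np.cross(x, m2 - m0)
    z /= np.linalg.norm(z)            # normal to the marker plane
    y = np.cross(z, x)                # completes a right-handed frame
    return np.column_stack([x, y, z]), m0
```

Asymmetric, recognizable marker configurations (as the list advises) also let the solver tell left from right and re-identify segments after occlusion.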
Virtual Production
• Pioneered for the production of James Cameron’s “Avatar”
• virtual camera
• simulcam
Virtual Reality
• trackable head-mounted display
• mobile device with gyroscope, accelerometer and GPS
• trackable hands and more
Feature Films, Games and VR applications
• Avatar
• Dawn of the Planet of the Apes
• The Force Awakens
• The Curious Case of Benjamin Button
• Hellblade, making of realistic facial performance
• EA Sports football capture session
• EA Sports soccer capture session
• ILMxLab
• PrioVR
• Sixense
• VR news
Applications
• BIOMEDICAL AND PHYSICAL THERAPY
o Mixed Reality Rehabilitation
o ACCAD Students at Nationwide Children's Hospital
o Tongue Capture for Speech Therapy
o VR and Robotics for Paraplegics
Applications
• HISTORICAL MOVEMENT PRESERVATION
o Native American Performance
• ARTS
o Open Ended Group
o Virtual Actors in Chinese Opera
o ACCAD Motion Lab
o Deakin University Motion Lab Projects
o Robotic Camera Choreography via Mocap
• LIFE SCIENCES
• ENGINEERING
• MILITARY AND LAW ENFORCEMENT
o VR weapon training with acoustic tracking system
o Virtual Crime Scene Simulation
• SPORTS
o Golf Training Simulator
o GPSports: GPS sensors and other motion-tracking tech for coaches
Motion Capture Research Areas
Research Areas
o crowd capture
o capture in crowds and complex environments
o markerless mocap
o hand/finger capture
o VR mocap integration
Selected Motion Technology and Integration Researchers
• Max Planck Institute for Intelligent and Perceiving Systems
• University of Southern California (Paul Debevec)
• Chris Bregler (NYU, Stanford, Google)
• Carnegie Mellon (Jessica Hodgins)
• Max Planck Center (Christian Theobalt)
• Stanford (Vladlen Koltun)
• Synlab at Georgia Tech (Ali Mazalek)
Mocap and VR Studios
Giant Studios
Capture Lab
House of Moves
Imaginarium Studios
Jim Henson Digital Studio
ILMxLab
Oculus Story Studio
SteamVR
Ninja Theory
References
1. Menache, Alberto. "Understanding Motion Capture for Computer Animation and Video Games"
2. http://www.kingston.ac.uk/Muybridge/
3. http://www.anotherscene.com/cinema/firsts/marey.html
4. http://cmp1.ucr.edu/exhibitions/edgerton/edgerton.html
5. http://www.metamotion.com
6. http://gvv.mpi-inf.mpg.de/files/pami2013/jgall_motioncapture_multiple_pami13.pdf
7. http://dl.acm.org/citation.cfm?id=2614176
8. http://www.utdallas.edu/~xxg061000/tongue.pdf
9. http://en.wikipedia.org/wiki/Nikolai_Bernstein
10. http://masgutovamethod.com/content/overlays/nikolai-bernstein.html
11. http://www.theartstory.org/movement-futurism.htm
12. Reinhard Klette and Garry Tee. "Understanding Human Motion: A Historic Review"
13. Johansson, Gunnar. "Visual perception of biological motion and a model for its analysis", Perception & Psychophysics, 1973, Vol. 14, No. 2, 201-211
14. Salazar Sutil, Nicolas. "Motion and Representation: The Language of Human Movement", MIT Press
15. http://www.clinicalgaitanalysis.com/history/enlightenment.html
16. http://manhattanmocap.com/olympics2012
17. http://www.capturelab.com/