Next Gen Motion Capture: From the Silver Screen to the Stadium and the Streets

Chris Bregler

(New York University)


Date: May 28, 2014

Description:

Mermaids and pirates, the Hulk and Iron Man! This talk will describe the behind-the-scenes technology of our match-moving and 3D capture system used to create the latest 3D visual effects in recent movies, including The Avengers, Pirates of the Caribbean, Avatar, Star Trek, and The Lone Ranger. It will also show how we have used similar technology for New York Times infographics to demonstrate the body language of presidential debates, the motions of a New York Philharmonic conductor, New York Yankee Mariano Rivera’s pitch style, and Olympic swimmer Dana Vollmer’s famous butterfly stroke that won her four gold medals.

While motion capture is the predominant technology in these domains, we have moved beyond such studio-based setups to do special effects, movement visualization, and recognition without markers and without multiple high-speed IR cameras. Instead, many projects are shot on-site, outdoors, and in challenging environments, with the benefit of new interactive computer vision techniques as well as new crowd-sourced and deep learning techniques.
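As an illustrative aside (not the speaker's actual pipeline): at its simplest, markerless 2D tracking for match-moving reduces to locating a feature patch in each new frame. The sketch below, with hypothetical names and a synthetic image, shows the core idea using normalized cross-correlation in NumPy.

```python
import numpy as np

def track_template(frame, template):
    """Return the (row, col) offset where `template` best matches `frame`,
    scored by normalized cross-correlation -- the basic operation behind
    simple markerless 2D feature tracking."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch: correlation undefined, skip it
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic test: plant a bright 3x3 blob in noise and recover its location.
rng = np.random.default_rng(0)
frame = rng.random((20, 20)) * 0.1
frame[12:15, 5:8] += 1.0
template = frame[12:15, 5:8].copy()
print(track_template(frame, template))  # (12, 5)
```

Production systems replace this brute-force scan with learned detectors and multi-view geometry, but the matching principle is the same.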

Further Information:

Chris Bregler is a Professor of Computer Science at NYU’s Courant Institute, director of the NYU Movement Lab, and C.E.O. of ManhattanMocap, LLC. He received his M.S. and Ph.D. in Computer Science from U.C. Berkeley and his Diplom from Karlsruhe University. Prior to NYU he was on the faculty at Stanford University and worked for several companies, including Hewlett-Packard, Interval, Disney Feature Animation, and Lucasfilm’s ILM. His motion capture research and commercial projects in science and entertainment have resulted in numerous publications, patents, and awards from the National Science Foundation, Sloan Foundation, Packard Foundation, Electronic Arts, Microsoft, Google, U.S. Navy, U.S. Air Force, and other sources. He has been named Stanford Joyce Faculty Fellow, Terman Fellow, and Sloan Research Fellow. He received the Olympus Prize for achievements in computer vision and pattern recognition and was awarded the IEEE Longuet-Higgins Prize for “Fundamental Contributions in Computer Vision that have withstood the test of time”. His non-academic achievements include serving as executive producer of Squidball.net, which required building the world’s largest real-time motion capture volume, and a massive multiplayer motion game that holds several world records with the Motion Capture Society. He chaired the SIGGRAPH Electronic Theater and Animation Festival. He has been active in the visual effects industry, for example as the lead developer of ILM’s Multitrack system, which has been used in many feature film productions. His work has also been featured in mainstream media such as the New York Times, Los Angeles Times, Scientific American, National Geographic, WIRED, Business Week, Variety, Hollywood Reporter, ABC, CBS, NBC, CNN, Discovery/Science Channel, and many other outlets.
