New York University

NYU’s Bregler Receives $1.47 Million Grant to Enhance Motion Capture Tools

May 19, 2009
N-472, 2008-09

New York University Computer Science Professor Chris Bregler has received a $1.47 million grant from the U.S. Office of Naval Research (ONR) to enhance his laboratory’s previous work on motion capture and computer vision.

The goal of the project is to train a computer to recognize a person based on his or her motions and to identify, through motion capture and computer vision, an individual’s emotional state, cultural background, and other attributes.

Motion capture records the movements of individuals, who wear suits that reflect light to enable the recording of their actions. It then translates these movements into digital models for 3D animation, often used in video games and in movies such as “The Polar Express” and “Iron Man.” The more sophisticated computer-vision technology, by contrast, tracks and records these movements directly from video, without the use of motion capture suits.

Bregler and his colleagues at NYU’s Courant Institute of Mathematical Sciences have already developed a method to identify and compare the body language of different speakers, a trait they call “body signatures.” Titled “GreenDot,” the project employs motion capture, pattern recognition, and “Intrinsic Biometrics” techniques. Last fall, their results showed that actress Tina Fey, who was widely praised for imitating Republican Vice-Presidential nominee Sarah Palin’s voice and appearance, also effectively channeled the Alaska governor’s body language.

The research team, the NYU Movement Group, has also recently developed computer vision techniques that enable the researchers to capture and analyze large numbers of YouTube videos and television broadcasts. In collaboration with Peggy Hackney, a movement expert and a faculty member in the Department of Theater, Dance, and Performance Studies at the University of California, Berkeley, the research group has designed a new system that can automatically classify different motion styles and find similarities and dissimilarities among body signatures.
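The release does not describe how the system compares body signatures internally. As a purely illustrative sketch, and not a description of the NYU Movement Group’s actual method, one could imagine each speaker’s signature reduced to a feature vector (the vectors and labels below are hypothetical) and compared by cosine similarity:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest_signature(query, signatures):
    """Return the label whose stored signature is most similar to `query`."""
    return max(signatures, key=lambda label: cosine_similarity(query, signatures[label]))

# Hypothetical per-speaker body signatures (e.g., summary statistics of joint motion).
signatures = {
    "speaker_a": [0.9, 0.1, 0.4],
    "speaker_b": [0.2, 0.8, 0.5],
}
print(nearest_signature([0.85, 0.15, 0.45], signatures))  # prints "speaker_a"
```

This nearest-neighbor matching is only one simple way such similarity search might be framed; the actual system would also need the computer-vision front end that extracts motion features from video.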

Under the ONR grant, Bregler and his team will seek to bolster their previous work in two areas. They will develop multi-modal sensors to capture subtle facial movements, full-body motion, and multi-person interactions, and they will create a computer infrastructure capable of housing a database that allows researchers to data-mine, discover, and model the wide variety of human activities and styles.

For more about Bregler’s motion capture work, go to

This Press Release is in the following Topics:
Graduate School of Arts and Science, Research

Type: Press Release

NYU researchers in motion capture suits.
