May 19, 2009
New York University Computer Science Professor Chris Bregler has received a $1.47 million grant from the U.S. Office of Naval Research (ONR) to enhance his laboratory's previous work on motion capture and computer vision.
The goal of the project is to train a computer to recognize a person based on his or her motions and to identify, through motion capture and computer vision, an individual's emotional state, cultural background, and other attributes.
Motion capture records the movements of individuals, who wear suits that reflect light to enable the recording of their actions. It then translates these movements into digital models for 3D animation, often used in video games and movies such as The Polar Express and Iron Man. The more sophisticated computer-vision technology, by contrast, allows these movements to be tracked and recorded directly from video, without the use of motion-capture suits.
Bregler and his colleagues at NYU's Courant Institute of Mathematical Sciences have already developed a method to identify and compare the body language of different speakers, a trait they call body signatures. The project, titled GreenDot, employs motion capture, pattern recognition, and intrinsic biometrics techniques. Last fall, their results showed that actress Tina Fey, who was widely praised for imitating Republican Vice-Presidential nominee Sarah Palin's voice and appearance, also effectively channeled the Alaska governor's body language.
The research team, the NYU Movement Group, has also recently developed computer vision techniques that enable the researchers to capture and analyze large numbers of YouTube videos and television broadcasts. In collaboration with Peggy Hackney, a movement expert and a faculty member in the Department of Theater, Dance, and Performance Studies at the University of California, Berkeley, the research group has designed a new system that can automatically classify different motion style categories and find similarities and dissimilarities among body signatures.
Under the ONR grant, Bregler and his team will seek to bolster their previous work in two areas. They will develop multi-modal sensors to capture subtle facial movements, full-body motion, and multi-person interactions, and they will create a computer infrastructure with the capacity to house a database allowing researchers to data-mine, discover, and model the wide variety of complex human activities and styles.
For more about Bregler's motion capture work, go to http://movement.nyu.edu/experiments/.
Type: Press Release