New York University’s Movement Laboratory has reconstructed the motions of Alan Gilbert, music director of the New York Philharmonic, to offer an animated, three-dimensional look at the actions of a conductor.
The video is part of an online feature, “Demystifying Conducting: A Connection Between Gesture and the Music,” at NYTimes.com, produced by Graham Roberts and Xaquín G.V., graphics editors at The New York Times. It accompanies an article, “The Maestro’s Mojo,” by The Times’ Daniel Wakin, which unpacks the nature of conducting. The article will appear in the April 8 “Arts & Leisure” section.
The work by NYU’s Movement Laboratory isolates Gilbert’s motions, free from the sounds of the orchestra, to give viewers a sense of the complexity of a conductor’s actions while leading musicians during a rehearsal.
The Movement Laboratory, part of NYU’s Courant Institute of Mathematical Sciences, recorded Gilbert during the orchestra’s rehearsals, then the lab’s computer scientists reconstructed the images to offer an animated, three-dimensional view of his motions.
“Demystifying Conducting” is the latest in a series of motion-capture projects by the Movement Laboratory, which is headed by Computer Science Professor Chris Bregler. In 2010, the lab recreated Yankees closer Mariano Rivera’s pitching motion to offer a three-dimensional look at how he appears to hitters.
Motion capture records the movements of individuals wearing suits fitted with light-reflecting markers. It then translates these movements into digital models for 3D animation, a technique often used in video games and in movies such as “The Polar Express” and “Iron Man.” More sophisticated computer-vision technology, by contrast, can track and record these movements directly from video, without the use of motion-capture suits.
Bregler and his colleagues have previously developed a method to identify and compare the body language of different speakers—a trait they call “body signatures.” Titled “GreenDot,” the project employs motion capture, pattern recognition, and “Intrinsic Biometrics” techniques. In 2008, their results showed that actress Tina Fey, who was widely praised for imitating Republican vice-presidential nominee Sarah Palin’s voice and appearance, also effectively channeled the Alaska governor’s body language.
The research team—the NYU Movement Group—has also developed computer-vision techniques that enable the researchers to capture and analyze large volumes of YouTube videos and television broadcasts. In collaboration with Peggy Hackney, a movement expert and a faculty member in the Department of Theater, Dance, and Performance Studies at the University of California, Berkeley, the group has designed a new system that can automatically classify different motion-style categories and find similarities and dissimilarities among body signatures.