NYU’s Movement Lab Reconstructs Mariano Rivera’s Pitching Motion for Animated 3D Look at His Delivery
NYU’s Movement Laboratory has reconstructed Yankee closer Mariano Rivera’s pitching motion to offer an animated three-dimensional look at how he appears before hitters. The video is part of an online feature, “Mariano Rivera, King of Closers,” at NYTimes.com. Image courtesy of The New York Times.

New York University’s Movement Laboratory has reconstructed Yankee closer Mariano Rivera’s pitching motion to offer an animated three-dimensional look at how he appears before hitters. The video is part of an online feature, “Mariano Rivera, King of Closers,” at NYTimes.com. The Yankee pitcher is also featured on the cover of this Sunday’s New York Times Magazine.

The Movement Laboratory, part of NYU’s Courant Institute of Mathematical Sciences, was given videos of Rivera pitching by The Times for analysis. The lab’s computer scientists then reconstructed his motion from the footage to produce an animated three-dimensional view of his delivery.

The work on the Rivera video is the latest in a series of motion-capture projects conducted by the Movement Laboratory, which is headed by Computer Science Professor Chris Bregler.

Motion capture records the movements of individuals who wear light-reflecting suits and translates those movements into digital models for the 3D animation often used in video games and movies such as “The Polar Express” and “Iron Man.” The lab’s more sophisticated computer-vision technology, by contrast, can track and record these movements directly from video, without the use of motion-capture suits.
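To give a concrete sense of what tracking movement straight from video can involve, the short Python sketch below follows a set of points from frame to frame using the OpenCV library. It is only an illustration of the general idea, not the Movement Laboratory’s actual software; the input file name and the particular tracking method (Lucas-Kanade optical flow) are assumptions made for the example.

    # Illustrative sketch of markerless motion tracking from video.
    # NOT the Movement Laboratory's pipeline; it only shows the general idea
    # of following moving points across frames without motion-capture suits.
    # "pitch.mp4" is a hypothetical input file.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("pitch.mp4")
    ok, prev_frame = cap.read()
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)

    # Pick corner-like points to follow (e.g., on the pitcher's body or uniform).
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=7)

    trajectories = []  # one (x, y) position array per frame for the tracked points

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Lucas-Kanade optical flow estimates where each point moved this frame.
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                         points, None)
        good_new = new_points[status.flatten() == 1]
        trajectories.append(good_new.reshape(-1, 2))

        prev_gray, points = gray, good_new.reshape(-1, 1, 2)

    cap.release()
    # The stacked trajectories give a rough 2D record of the motion, which more
    # sophisticated systems can lift into full 3D body models.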

Bregler and his colleagues have already developed a method to identify and compare the body language of different speakers—a trait they call “body signatures.” Titled “GreenDot,” the project employs motion capture, pattern recognition, and “Intrinsic Biometrics” techniques. In 2008, their results showed that actress Tina Fey, who was widely praised for imitating Republican Vice-Presidential nominee Sarah Palin’s voice and appearance, also effectively channeled the Alaska governor’s body language.

The research team—the NYU Movement Group—has also recently developed computer-vision techniques that enable the researchers to capture and analyze large numbers of YouTube videos and television broadcasts. In collaboration with Peggy Hackney, a movement expert and faculty member in the Department of Theater, Dance, and Performance Studies at the University of California, Berkeley, the group has designed a new system that can automatically classify different motion-style categories and find similarities and dissimilarities among body signatures.
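The Python sketch below illustrates, in simplified form, how such a classification might work: each clip is reduced to a small “signature” of summary statistics, and a nearest-neighbor classifier assigns new clips to the most similar style. The features, the two style labels, and the synthetic data are all assumptions made for the example; they are not the NYU system or Hackney’s actual categories.

    # Illustrative sketch (not the NYU system) of grouping motion clips by style:
    # each clip is reduced to a small feature vector and compared with a
    # nearest-neighbor classifier.  The features (mean speed and acceleration
    # per tracked point) and the style labels are stand-in assumptions.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def clip_features(trajectory):
        """trajectory: array of shape (frames, points, 2) with 2D positions."""
        velocity = np.diff(trajectory, axis=0)      # frame-to-frame motion
        speed = np.linalg.norm(velocity, axis=-1)   # motion magnitude per point
        accel = np.abs(np.diff(speed, axis=0))      # change in speed
        return np.array([speed.mean(), speed.std(), accel.mean(), accel.std()])

    # Hypothetical labelled clips: 0 = "smooth/sustained" style, 1 = "sharp/sudden".
    rng = np.random.default_rng(0)
    smooth = [np.cumsum(rng.normal(0, 0.5, (60, 15, 2)), axis=0) for _ in range(10)]
    sharp = [np.cumsum(rng.normal(0, 2.0, (60, 15, 2)), axis=0) for _ in range(10)]

    X = np.array([clip_features(c) for c in smooth + sharp])
    y = np.array([0] * 10 + [1] * 10)

    classifier = KNeighborsClassifier(n_neighbors=3).fit(X, y)

    # A new clip is assigned to whichever style its "body signature" most resembles.
    new_clip = np.cumsum(rng.normal(0, 1.8, (60, 15, 2)), axis=0)
    print(classifier.predict([clip_features(new_clip)]))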

For more about the Movement Lab’s motion capture work, click here.

 

Press Contact

James Devitt
(212) 998-6808