NYU Holodeck: One Step Closer to Star Trek Tech

By Victoria Lubas | November 16, 2020

abstract rendering of motion tracked musicians

NYU IT's Robert Pahle and Stratos Efstathiadis Discuss the University's Ambitious Holodeck Project

With technology’s rapidly evolving pace, we’re getting closer and closer to a future—or at least a piece of it—envisioned by Star Trek: The Next Generation, with digital assistants like Siri and Cortana and telehealth doctor’s appointments from the comfort of our living rooms. Long before the coronavirus pandemic, and supported by an NSF Major Research Instrumentation grant (Award 1626098), the NYU Holodeck team was hard at work innovating, with the goal of creating “an instrument and educational environment that facilitates enticing research and creates an experiential supercomputing infrastructure…” that further changes our understanding of place and time.

This interdisciplinary project combines the research strengths of several NYU faculty—including Ken Perlin of the Courant Institute of Mathematical Sciences’ Future Reality Lab (FRL) as lead, Luke DuBois of NYU’s Media and Games Network (MAGNET), Claudio Silva of the Visualization Imaging and Data Analysis Center (VIDA) in Brooklyn, Jan Plass of Steinhardt’s Consortium for Research and Evaluation of Advanced Technologies in Education (CREATE), and Agnieszka Roginska of the Music and Audio Research Laboratory (MARL)¹—as well as Rob Pahle, Stratos Efstathiadis, and Jeremy Rowe of NYU IT Research Technology, and partners at the University of Arizona. Robert Pahle, NYU IT Research Technology Senior Research Scientist, explained that “the new High-Speed Research Network is a crucial component for the Holodeck team to succeed,” and that “all [these] components together produce an instrument [that] can mix and match components for research.”

Technology That Advances Artistic Pursuits

Pahle’s primary focus for this project is infrastructure-based, as he emphasizes the importance of combining seemingly unrelated fields by advancing software and NYU infrastructure. One potential application of the Holodeck that Pahle described focuses on overcoming distance as a barrier by “combining distributed nodes and locations” to create concerts in which all the performers play together live from many different geographic locations. The challenge is to overcome a latency issue akin to buffering: when the musicians play, differences in connectivity and speed result in mistimed notes that call attention to the distance between the performers and ruin the sense of synchronicity. The Holodeck team has developed software that addresses these issues and creates a real-time connection that functions like a more precise version of existing video chat technologies, with integrated audio, video, and motion tracking capabilities.

From there, the Holodeck infrastructure creates a high-definition, surrounding sense of sound to realistically simulate the experience of performers being in the room with you. To illustrate the applications of this technology, Pahle described a project that Agnieszka Roginska’s group developed: a person wearing a VR headset sits in a room with three musicians and an empty chair. The headset displays four virtual musicians—the three in the room and a fourth, remote musician playing from a different location. The observer wears open-ear headphones and can hear the three local musicians directly, while the remote musician’s sound is superimposed via the headset. The result is a harmonious performance by all four musicians in which the remote player cannot be distinguished from the other three. Just as the sound of a real musician would get louder if the headset wearer leaned closer to them, the sound of the virtual musician gets louder if one leans toward the empty chair. This is possible because the headset senses the wearer’s motion and adjusts the virtual source’s position and volume accordingly. The goal is for this sound technology to be so precise that if an audience member were already wearing the VR headset when the musicians filed in, they wouldn’t be able to tell the in-person musicians from the remote one.
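The distance-sensitive volume behavior described above can be sketched with a simple inverse-distance gain model, a common approximation in spatial audio. This is an illustrative sketch, not the Holodeck team's actual implementation; the function names and reference distance are assumptions:

```python
import math

def distance_gain(listener_pos, source_pos, reference_distance=1.0):
    """Inverse-distance attenuation: full volume (1.0) inside the
    reference distance, falling off as the listener moves away."""
    dx, dy, dz = (l - s for l, s in zip(listener_pos, source_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return reference_distance / max(distance, reference_distance)

# Leaning toward the empty chair halves the distance and raises the gain.
upright = distance_gain((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))  # 0.5
leaning = distance_gain((1.0, 0.0, 0.0), (2.0, 0.0, 0.0))  # 1.0
```

A head-tracked headset would feed the wearer's position into such a function every frame, scaling the remote musician's signal accordingly.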

Pahle explained that “you can combine sound, AR, and VR, to be able to do experiments with Holodeck.” This sound technology that overcomes distance is also applicable to dance: the same latency issue the Holodeck has to correct in musical performances applies when creating virtual dance performances at a distance. To traverse distance in dance, the performers wear suits made of special fabrics and covered in motion tracking points. These points define a computerized skeleton that is then matched to the corresponding limbs of an avatar of the dancer. A scalable plug-in uses real-time computing software to create the avatar and match its movements to the actual dancer’s.
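The marker-to-skeleton step can be sketched as a simple retargeting table. Real systems track many more markers and solve for joint rotations rather than raw positions; the marker and bone names below are hypothetical:

```python
# Hypothetical marker-to-bone table; a real suit carries dozens of markers.
MARKER_TO_BONE = {
    "hip_marker": "Hips",
    "head_marker": "Head",
    "l_wrist_marker": "LeftHand",
    "r_wrist_marker": "RightHand",
}

def retarget(frame):
    """Map one frame of tracked marker positions onto the avatar's bones,
    dropping markers the infrared cameras lost sight of (None)."""
    return {
        MARKER_TO_BONE[marker]: position
        for marker, position in frame.items()
        if marker in MARKER_TO_BONE and position is not None
    }

frame = {
    "hip_marker": (0.0, 1.0, 0.0),
    "head_marker": (0.0, 1.7, 0.1),
    "l_wrist_marker": None,  # occluded this frame
}
avatar_pose = retarget(frame)  # only the visible markers drive the avatar
```

In a real pipeline this mapping would run per frame inside the Unity plug-in, driving the avatar's rig in real time.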

These music and dance tracking technologies have been combined to create a simultaneous performance across several continents, allowing musicians in Argentina, Norway, and New York to play together as if on the same stage. Where possible, the chosen music incorporates longer notes to help mask the delay in the music’s transmission, and a digital metronome synchronized via GPS mediates the signals to account for that delay. The musicians match their timing to the metronome, rather than to the music they’re hearing, resulting in a seamless performance in New York. Dancers in a New York studio perform in motion capture suits that are customized to each dancer’s body and tracked by several infrared cameras. The tracked points are converted to avatars using a development platform in Unity and displayed on a screen behind the in-person dancers, resulting in an integrated virtual performance created by performers in four different places across three continents.
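The GPS-synchronized metronome idea can be sketched as follows: because every site reads the same shared clock, each can compute an identical click schedule locally, so no timing signal ever has to cross the network. This is an illustrative sketch under that assumption, not the production software:

```python
import math

def next_click(shared_time_s, bpm=120.0):
    """Time (in shared GPS-clock seconds) of the next metronome click.
    Every site computes the same schedule from the common clock, so
    network latency between sites never shifts the beat."""
    period = 60.0 / bpm  # seconds per beat
    return math.ceil(shared_time_s / period) * period

# Two sites reading the same GPS time agree on the next click,
# regardless of the network latency between them.
click_ny = next_click(123.4, bpm=120.0)      # 123.5
click_oslo = next_click(123.4, bpm=120.0)    # 123.5
```

Musicians playing to this locally computed click stay aligned even though the audio they hear from remote sites arrives late.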

The Holodeck Efforts in Learning and Education

In addition to its applications in artistic pursuits, this technology can be used to track how people learn in order to create new educational technologies. One approach is a game for high school students specifically engineered to help the creators better understand teenagers’ brain functions as they play and learn. Pahle described Plass’s example of this kind of game, which requires the player to react quickly to prompts, such as by quickly feeding red aliens cookies and blue aliens ice cream. The feeding rules switch randomly throughout the game, requiring the player to adapt, and Plass’s team studies how students learn to accommodate the changing rules.
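A minimal sketch of such a rule-switching game loop follows; the rule names, alien colors, and switch probability here are illustrative, not the actual game's design:

```python
import random

# Two feeding rules; the second flips the assignments of the first.
RULES = {
    "A": {"red": "cookie", "blue": "ice cream"},
    "B": {"red": "ice cream", "blue": "cookie"},
}

def score_round(rule, alien_color, chosen_food):
    """A round is correct only if the food matches the *current* rule."""
    return RULES[rule][alien_color] == chosen_food

def maybe_switch(rule, switch_probability=0.2, rng=random):
    """Randomly flip the feeding rule mid-game, forcing players to adapt."""
    if rng.random() < switch_probability:
        return "B" if rule == "A" else "A"
    return rule

# Feeding a red alien a cookie is right under rule A, wrong under rule B.
correct = score_round("A", "red", "cookie")
wrong = score_round("B", "red", "cookie")
```

Researchers can log each round's rule, response, and reaction time to study how quickly players adapt after a switch.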

The Role of Research Technology

The many potential uses of this developing Holodeck technology are made possible by the Research Technology department, which provides the underlying network, application layer, computation, and storage that form the central communication infrastructure for the Holodeck. Stratos Efstathiadis, NYU IT’s Director of Research Technology Services, explained that the project is supported financially by NYU and the National Science Foundation MRI Track 2 Development grant¹. The Holodeck project advances thanks to the significant amounts of time that faculty, staff, students, and external collaborators put into it. Pahle went on to explain that undergraduate and graduate student employees contribute to the programming-language libraries provided for the streaming hub, writing code in languages such as Java, Python, C++, and C#.

The Holodeck project and its potential applications embody a sense of collaboration by joining researchers and experts across fields to seamlessly combine a wide range of research together and create a new multi-modality experiential supercomputer at NYU.

Additional Resources