Not Your Father’s Electronica
The totally unexpected sounds of 21st Century computer-generated music from NYU’s New Interfaces for Musical Expression.
More than 40,000 years after Homo sapiens fashioned a crude flute out of vulture bones and mammoth ivory, the human impulse to generate fresh sounds in imaginative ways is alive and well at the Tisch School of the Arts’ Interactive Telecommunications Program (ITP). There, grad students design one-of-a-kind digital musical instruments as part of a second-year course called New Interfaces for Musical Expression (NIME). At the end of the term, an off-campus showcase answers the many questions surrounding the enigmatic projects that have been occupying the classroom at 721 Broadway. “We’re pretty cramped up on the 4th floor and the whole semester you see what everybody’s building but you’re not exactly sure what [they] will do,” says Matt Romein (TSOA, ’16), a NIME alum who stayed on for a year to help out as resident researcher. “It becomes this kind of buildup and you want to go to the show and find out.”
The dynamic results are a far cry from the sedentary electronica concerts of yore (or at least the ’70s and ’80s). “Performers basically sat at a table looking into a laptop computer and typing on their keyboard, and that can [be] powerful,” says adjunct professor and NIME instructor Greg Shakar (GAL, ’94; TSOA, ’01). “[But] one main motivation behind the course was to explore ways we can use the tools of computers and modern technology and electronics like musical instruments [and put on] really compelling performances.”
Among the revelatory moments shared by the enthusiastic crowd at Brooklyn art space Littlefield last December was Yuli Cai’s “waterFalse.” The installation/performance reimagined an ancient Chinese chime and percussion instrument. By pulling strings attached to repurposed salad spinners, the artist triggered four digitally recorded sounds: wind, singing monks, matouqin (a two-stringed Mongolian instrument), and female opera vocals, while simultaneously playing acoustic audio of swirling water. Each string also released fog into the air above the stage, onto which video of rippling water was projected.
Such resourcefulness is all the more impressive given that most of the performers have never taken the stage or a music lesson before (although they all have at least one year of ITP study under their belts, including Introduction to Physical Computing and Introduction to Computational Media). “The program really values having students with a wide range of backgrounds,” says Shakar. Which means there’s a lot to cover. “We talk very quickly about the elements of music,” he says. “What is rhythm, what is timbre, what is melody, structuring a piece, music theory, and composition. It’s a crash course in everything.”
So what components go into making a fascinating and engaging instrument? “One is the interface, the physical gestures the performer uses to play music,” says Shakar. “Another is the sound it will make. And sometimes those elements can be connected. But ultimately, a successful project conveys a sense of amazement or fascination or even just a particular artistic idea to an audience,” he says. “And I have to say, the students amaze me every semester.”