Postcard

Virtualsity


Inventing the instruments of the future

Roger Linn with the Linnstrument © Roger Linn Design


One of the more unlikely solos in music history was recorded live on July 16, 1986, during a performance in Tokyo by the fusion-jazz supergroup Steps Ahead. The footage is worth watching, even if fusion isn’t your thing. Saxophonist Michael Brecker stands center stage, not with a sax but with an oboe-like instrument that looks as if it had been plucked from a Cubist painting. His fingers rest on silver buttons attached to a skinny black box with no discernible wind passage, and in his mouth is a stem wrapped in a rainbow sports band. Brecker performs a series of ambient tones — from chimes to otherworldly trumpets to squalls of white noise — that to modern ears harks back to the Nintendo era. But the instrument sings, the mood building to a tension that is released only when he triggers a drum machine and starts jamming over the beat.

Brecker is wailing on a prototype of an electronic wind instrument, or EWI (pronounced EE-wee), the first virtual instrument that allowed woodwind players to transfer their technical skills to the digital world. The EWI’s inventor, Nyle Steiner, had begun tinkering with a brass-style electronic synthesizer in the 1960s, unveiling in 1975 the world’s first playable “electric valve instrument,” the Steiner Horn. The EVI garnered some buzz, but it was the EWI, officially released in 1987, that turned heads. It paired a fingering system of sensors laid out like clarinet or saxophone keys with a highly sensitive mouthpiece that registered changes in wind pressure. With the EWI, a woodwind player could vastly expand his or her sonic palette without having to learn entirely new skills. Fusionists like Brecker began adopting it, and it seemed possible that many others would follow suit.

A few years before, however, a powerful new musical language for digital instruments had taken shape. MIDI (Musical Instrument Digital Interface) allowed notes to be programmed in advance and multiple synthetic instruments to communicate simultaneously. “Before MIDI,” writes Jaron Lanier in his curmudgeonly manifesto, You Are Not a Gadget, “a musical note was a bottomless idea that transcended absolute definition . . . after MIDI, a musical note was no longer just an idea but a rigid, mandatory structure.” Synthesized arpeggios and drum-machine sequencers could now be matched up perfectly, and the technical precision this allowed immediately birthed a musical genre, New Wave — without MIDI, there would have been no a-ha, Depeche Mode, or Human League. Over time, automated syncopation came to dominate popular music, separating the physicality of musicians from the music they played.
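Lanier’s complaint about rigidity is easy to make concrete. Under MIDI, a note really is reduced to a handful of fixed bytes; the short Python sketch below, offered purely as an illustration of the standard Note On message rather than anything described in the article, shows how little room the format leaves.

    # MIDI reduces a note to three bytes: a status byte, a pitch number, and a velocity.
    def note_on(channel, pitch, velocity):
        """Build a standard MIDI Note On message."""
        status = 0x90 | (channel & 0x0F)   # 0x90 means Note On; the low nibble is the channel
        return bytes([status, pitch & 0x7F, velocity & 0x7F])

    # Middle C (note 60), played moderately hard on channel 1:
    print(note_on(0, 60, 100).hex())       # prints "903c64" -- the entire note, as MIDI sees it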

Brecker’s solo seems novel today because it heralded a future of expressive electronic instruments that never arrived. But with digital culture now pervasive, instrument designers have been working to create a new market for “expressive controllers,” MIDI instruments capable of both dynamic and polyphonic playability. Among the most acclaimed of these creators is Roger Linn, a Grammy winner for “outstanding technical significance” who is beloved of musicians and producers, and who has been responsible for many of the most distinctive sounds in contemporary music.

You’re unlikely to have heard of Linn’s instruments, but you’ve certainly heard them. The famous drumbeat opening of “Billie Jean” comes from his LinnDrum, as do the scurrying sixteenth notes that lead Prince’s “I Would Die 4 U” and the four-on-the-floor thump of the Human League’s “Don’t You Want Me.” When New Wave bowed to the rise of glam metal and eventually to grunge (i.e., to real drummers), Linn’s company went out of business. But he was soon hired by the Japanese company Akai to develop its line of MIDI Production Centers (MPCs), machines that allow users to string together samples of their own creation. It’s barely an exaggeration to say that the MPC ushered in modern hip-hop.

In 2002, Linn returned to manufacturing his own products under the banner Roger Linn Design. He is now putting the finishing touches on his latest creation, the LinnStrument, an expressive controller modeled on the chromatic grid system of a guitar. With its rectangular silicone surface and wood borders, the LinnStrument looks like a tiny ice rink. The surface is divided into touch-sensitive finger-sized squares, with multicolored LED lights underneath that indicate the location of notes within a specific scale. As with a guitar, fingering patterns and chord shapes are the same regardless of key. Players bend notes or add vibrato by moving a finger from side to side, and alter the volume by applying greater or lesser pressure. Unlike a guitar, however, the LinnStrument has no frets, allowing for seamless movement between notes in the style of a violin or cello. And unlike an acoustic device, the LinnStrument is capable of pitching any note along a full spread of two octaves, via sounds patched into it from a computer.
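As a rough sketch of the idea only — a hypothetical mapping, not Linn’s actual firmware, grid tuning, or ranges — the gestures described above can be put into a few lines of Python: each pad carries a note laid out on a guitar-like grid, side-to-side drift becomes pitch bend, and pressure becomes volume.

    # Hypothetical sketch of an expressive controller's touch-to-sound mapping
    # (illustrative only; the base note and row tuning here are assumptions).
    BASE_NOTE = 40        # assume the lowest pad is E2 (MIDI note 40)
    ROW_INTERVAL = 5      # assume each row up is tuned a fourth higher, like a guitar

    def touch_to_expression(row, column, x_offset, pressure):
        """Map one finger's touch to (MIDI note, pitch bend in semitones, volume 0-127).
        x_offset is side-to-side drift within a pad (-0.5 to 0.5); pressure runs 0 to 1."""
        note = BASE_NOTE + row * ROW_INTERVAL + column   # each column to the right is one semitone
        bend = x_offset                                  # sliding sideways bends the pitch
        volume = int(pressure * 127)                     # pressing harder plays louder
        return note, bend, volume

    # A finger pressing firmly and rocking slightly, three pads over and two rows up:
    print(touch_to_expression(row=2, column=3, x_offset=0.1, pressure=0.8))   # (53, 0.1, 101)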

But although there are plenty of videos online showing demonstrations of various expressive controllers, actual live performances with them remain scarce, in large part because professional musicians have been hesitant to adopt them. “I don’t think competition from each other is our biggest enemy,” Linn told me, referring to other designers, “but rather the general resistance to new ideas in musical instruments.” To counter that resistance, he and his colleagues have been banding together to present their creations at conferences and TED Talks, positioning themselves as part of a growing movement to bring virtuosity and individual voice back to their field.

In April, I caught Linn’s demonstration of his controller at Moogfest, a week-long electronic-music conference and festival held in Asheville, North Carolina, in honor of one of that city’s adopted sons, Robert Moog, creator of the modern synthesizer. The town hosts Moog Music’s headquarters, a tiny two-floor factory where twenty-five technicians labor over handmade synthesizers. While the techies of the West Coast have all of Silicon Valley, the synthesists have a small town in the Blue Ridge Mountains.

Linn’s talk took place up the street from Moog, at Asheville’s Masonic Temple. Many instrument sellers had set up shop, and people were trading modular panels and wheeling and dealing vintage synths. When Linn went onstage, shushes quickly brought the room to silence. Now in his fifties, Linn talks and dresses like an academic, but within minutes of greeting the audience, he was strapping a glowing LinnStrument over his shoulder, looking as if he had suddenly donned an LED-lit sandwich board. The instrument connected via USB to a MacBook running Logic Pro X, Apple’s flagship music-production program.

Linn began with a basic synth from the sound library. “All this is is a pulse waveform with a low-pass filter,” he said. “There are no envelope generators and no LFOs.” He placed a middle finger on the board, and a warm tone filled the temple. He added two fingers and massaged the notes into a chord. When he lifted his hand, the sound dwindled away, and the room cheered.   

Then he adjusted a setting in Logic and placed his hands back on the LinnStrument. He played a few measures of baroque music on a virtual cello, vibrating each finger-square as if it were a taut string, forming a chord with his right hand, and then laying down another with his left. He changed the setting to add some effects, and the cello turned into a reverb-heavy electric guitar. I watched his hands pluck and wiggle bluesy notes as they danced across the board. Clicking a button, he made the LED lights divide into green and blue. He triggered a drumbeat from the computer and placed the fingers of his left hand on a chord of green lights. The propulsive sound of syncopated eighth notes emerged, which he accompanied by placing a single finger in the blue region. The electric guitar he had demonstrated earlier was now leading a trio. “I should also mention,” he said, “that if you don’t like blue and green, you can always change the colors. My niece prefers magenta.”

Linn concluded his demonstration, but rather than exit the stage he requested that the lights be dimmed. “I wanted to show people other types of multi-touch expressive controllers,” he said, cuing up YouTube. “As you’ll see, there are some pretty amazing ones out there.” Now began a parade of futuristic instruments. The Seaboard, with a continuous surface of pillowy silicone molded over touch-sensitive controls.

The Continuum, an instrument that resembles a ceremonial red carpet, with major and minor keys of equal length running across eight octaves.

Linn then asked that the lights be switched on, and he invited onstage a Belgian software engineer and musician named Geert Bevin, who would demonstrate something called the Eigenharp. This turned out to be a four-and-a-half-foot-long controller finished in ebony wood. Dubbed the “sci-fi bassoon” by the BBC, the instrument features 120 LED-lit keys arranged in five rows — besting a modern grand piano’s eighty-eight — coupled with a drum sequencer and a curvaceous blow pipe.

Bevin greeted the audience, strapped on the instrument, and performed a New Age–tinged tune. He tapped ambient percussive sounds on the keys while blowing out ocarina notes from the pipe, briefly transforming himself into a contemporary take on the busking one-man band.

I was impressed by the performance, but it was the clip of an arresting solo played on the Continuum by its co-developer Edmund Eagan that made me a believer. When I closed my eyes for a second, the music he played sounded exactly like a violin. His hands moved across the board without hesitation, knowing exactly where to find and apply vibrato to each note, bending pitches here and there. When it was over, the crowd stayed hushed. “Eagan’s my favorite violin player,” said Linn, “and he doesn’t even play violin.”

Each new electronic instrument is a statement about the inventor’s musical and aesthetic philosophy — a theory of how notes should be played, or of how an instrument’s body and layout might look when liberated from traditional physical constraints. One instrument, the Opal Gecko, looks like a beehive and uses an isomorphic key layout as its basis (think of transferring the octaves of a piano onto a grid system).

Another, the Madrona Soundplane, is made almost entirely of finely polished wood, its electronics all concealed gracefully within.

By far the most punk-rock of the bunch is the Samchillian Tip Tip Tip Cheeepeeee, a keyboard founded on the principle of relative pitch (no key is discordant with another), thereby thumbing its nose at the absolute pitch to which most instruments are tuned. An early model used recycled computer keys, built into a paint-splattered mold.

“In fifty years’ time,” Linn told me, “I think people will still be playing piano, guitars, and violins, but I also think they will be playing electronic instruments — actually performing virtuosic work upon them. What these will look or sound like, I don’t know.”

What music can we expect to hear from these new digital instruments? If a generation of people who grew up listening to plugged-in instruments and guitar solos has moved on to the computer, what happens when a generation of musicians who have learned to sample and manipulate automated audio clips is given the tools for expressive digital instrumentation? A new musical genre, perhaps, or new variations on the familiar. It’s a chicken-and-egg question, but it may take only one virtualoso to hatch the answer.

is an associate editor at New Directions and a drummer for the band Megafortress. He lives in Brooklyn.