Synthesizers are undoubtedly one of the most groundbreaking innovations in the evolution of modern music. When we think of synthesizers, it's hard to avoid certain images. We might recall the electronic bleeps of 8-bit video games or the infectious pop hits of the 1980s. We might picture a rich, digital orchestra flowing from a keyboard, or a tangle of knobs, dials, and cables. Perhaps we even imagine a software tool controlled by a computer's keyboard.
Whatever comes to mind, the nearly five decades since the first commercial synthesizer hit the market have seen its influence penetrate deeper than many of us can fathom. Synthesizer-generated sounds have become inseparable from the music we hear today—whether it’s in pop hits, hip hop, film scores, or rock music. But as Dr. Tom Rhea, a professor in the Electronic Production and Design Department at the Berklee College of Music, points out, synthesizers have reshaped music not only through the sounds they produce but also by altering how music is performed. Traditional instruments like guitars, cymbals, and clarinets are designed with specific physical constraints to create distinct sounds. “With electronic instruments — specifically synthesizers — those limitations disappear,” Rhea explains [source: Rhea].
In other words, a synthesizer is capable of producing a wide array of sounds. It can generate both recognizable and fantastical tones—such as the sound of a flute, a crashing wave, or even a Martian's ray gun—along with voices never before imagined. Synthesizers achieve this by altering and merging the core characteristics of sound to form entirely new creations. Despite common belief, the term "synthesizer" does not suggest that its sounds are artificial. Instead, it refers to the process of synthesis, which involves combining different fundamental elements of sound to produce a new, unified result.
While we're clearing up misconceptions, let’s tackle another: Synthesizers are not mystical devices that automatically generate music. At their core, synthesizers are just like any other musical instrument, requiring a skilled operator to create the music we hear.
What are the basic elements that make up a sound, and how do synthesizers alter them? Let's explore.
Breaking Down the Elements of Sound
When we say synthesizers manipulate the basic components of a sound, what exactly are we referring to?
To start, let's go over some basics. Sound is created by changes in air pressure as energy travels from the source of the sound to our ears. The human ear can detect sounds within a frequency range of 20 to 20,000 hertz. Every sound we hear has unique qualities, including pitch, timbre (or tonal quality), and loudness. Even when two instruments play the same note, their sounds differ in measurable characteristics like frequency (the number of waves per second), amplitude (volume or change in air pressure), wavelength (distance between waveform cycles), and period (time it takes for one complete wave cycle). Sounds also contain harmonics, additional frequencies that blend to create a fuller, richer sound. Additionally, the volume changes over time—this includes the phases of attack, decay, sustain, and release (ADSR).
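The ADSR envelope described above can be sketched in code. Here is a minimal illustration in Python of how a note's volume evolves through its four phases; the timing and level values are arbitrary assumptions chosen for clarity, not taken from any particular instrument:

```python
def adsr(t, attack=0.05, decay=0.1, sustain=0.7, release=0.2, note_off=1.0):
    """Return the amplitude multiplier (0.0 to 1.0) at time t seconds.

    Attack: ramp from silence up to full volume.
    Decay: fall from full volume down to the sustain level.
    Sustain: hold steady while the key is held (until note_off).
    Release: fade back to silence after the key is released.
    """
    if t < attack:                       # attack phase
        return t / attack
    if t < attack + decay:               # decay phase
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < note_off:                     # sustain phase
        return sustain
    if t < note_off + release:           # release phase
        return sustain * (1.0 - (t - note_off) / release)
    return 0.0                           # note has fully faded out
```

Multiplying a raw waveform by this envelope, sample by sample, is what gives a synthesized note its dynamic shape over time.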
Earlier, we discussed that the term synthesizer comes from "synthesis." There are many types of synthesis techniques, but let’s focus on one commonly used method known as subtractive synthesis. In this process, a musician starts with a waveform—a sound that includes all the characteristics mentioned earlier—and removes elements until the desired tone is formed. The musician can fine-tune the synthesizer’s settings to silence certain frequencies or boost others. As a result, subtractive synthesis can reshape the initial waveform into a completely different sound. Once processed through the synthesizer, it might resemble sounds like a trumpet, snare drum, atmospheric effects, or just about anything else. (However, unless you’re using a sampler—an electronic instrument that records and manipulates actual sound samples—synthesized versions of real-world instruments won’t be exact replicas.)
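The subtractive idea can be sketched in a few lines of Python. This is a toy model, not how any commercial synthesizer is built: we start with a sawtooth wave (a waveform rich in harmonics) and then run it through a simple one-pole low-pass filter that removes much of the high-frequency content, darkening the tone:

```python
SAMPLE_RATE = 44100  # samples per second, the common CD-audio rate

def sawtooth(freq, n_samples):
    """A harmonically rich waveform: ramps from -1 to 1 once per cycle."""
    return [2.0 * ((i * freq / SAMPLE_RATE) % 1.0) - 1.0
            for i in range(n_samples)]

def low_pass(samples, alpha=0.1):
    """One-pole low-pass filter: 'subtracts' high-frequency content.

    Smaller alpha removes more of the upper harmonics.
    """
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)   # each output leans toward the previous one
        out.append(y)
    return out

raw = sawtooth(440.0, 1024)    # a bright, buzzy A440
dark = low_pass(raw)           # the same note with the edge filtered off
```

The filter smooths out the sawtooth's abrupt jumps, which is exactly what removing high frequencies means in the time domain.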
Now that we understand how synthesizers manipulate sound, let's dive deeper and examine the internal components of a synthesizer.
In contrast to subtractive synthesis, additive synthesis works by layering tones on top of each other to build a more complex sound.
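The additive approach can be sketched the same way. The textbook example is summing sine waves at odd harmonics, each scaled down by its harmonic number, which converges toward a square wave (this sketch computes a single sample for illustration):

```python
import math

def additive_sample(t, freq, n_harmonics):
    """One sample of an additive tone at time t (seconds): odd harmonics
    of a sine, each weighted by 1/k, which approaches a square wave."""
    return sum(math.sin(2 * math.pi * freq * k * t) / k
               for k in range(1, 2 * n_harmonics, 2))
```

With enough harmonics stacked up, the value at the wave's peak approaches pi/4, the amplitude of the ideal square wave this series builds toward.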
Synthesizer Components
Although many synthesizers feature the familiar piano-like ebony-and-ivory keyboard, the rest of the instrument—complete with knobs, dials, and switches—looks more fitting for a workshop than a performance stage. Still, much like other musical instruments, synthesizers have two core components: a generator and a resonator. Take a violin, for example: the strings and bow act as the generator, while the body of the violin serves as the resonator [source: Rhea]. In the case of a synthesizer, the generator is the oscillator, and the resonator is the filter.
Let’s begin by looking at the basic components of a traditional analog synthesizer. (We’ll cover digital synthesizers in due time.) Analog synthesizers produce sounds by manipulating electric voltages. The oscillator shapes the voltage to generate a consistent pitch at a set frequency, which defines the basic waveform that will be further processed by other parts of the synthesizer. The oscillator can be controlled using keys similar to a piano’s keyboard, a rotating pitch wheel, or another interface tool. The signal produced by the oscillator is sent to the filter, where the musician adjusts the sound’s frequency parameters—such as emphasizing or removing specific frequencies as we discussed earlier. From there, the sound travels to the amplifier, which regulates its volume. The amplifier typically includes a set of envelope controls that determine the dynamic range of the note over time.
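The oscillator-to-filter-to-amplifier chain can be sketched as a pipeline of Python generators. This is only an analogy for the analog signal path, with assumed parameter values; each function stands in for one module, and composing them mirrors patching modules together with cables:

```python
import math

SAMPLE_RATE = 44100  # samples per second

def oscillator(freq):
    """Generator module: a steady sine wave at a set frequency."""
    for i in range(SAMPLE_RATE):          # one second of samples
        yield math.sin(2 * math.pi * freq * i / SAMPLE_RATE)

def filter_stage(samples, alpha=0.2):
    """Filter module: a one-pole low-pass that shapes the timbre."""
    y = 0.0
    for x in samples:
        y += alpha * (x - y)
        yield y

def amplifier(samples, gain=0.5):
    """Amplifier module: scales the signal, controlling its loudness."""
    for x in samples:
        yield x * gain

# Chaining the stages mirrors patching hardware modules in series.
signal = amplifier(filter_stage(oscillator(440.0)))
```

Swapping a module, or changing a parameter like `alpha` or `gain`, changes the final sound without touching the rest of the chain, which is the essence of the modular design described above.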
In an analog synthesizer, each function related to pitch, timbre, and volume is contained within a module, a specialized unit designed for a particular task. The first synthesizers featured individual modules encased in separate enclosures. Each module generates or processes a specific signal, and by connecting these modules together, the musician can layer, adjust, and transform the sound into something entirely new.
Now that we've explored how synthesizers operate, let's take a look at their historical development.
Early History of Synthesizers
When was the first synthesizer created? It depends on who you ask.
Some people point to the Telharmonium or the theremin, which were invented in the late 1890s and 1919, respectively. However, Rhea argues against their inclusion, saying that these instruments did not offer the operator complete control over the elements of sound [source: Rhea]. The first true synthesizer, according to Rhea, was a combination of piano and electronic technology developed in France in 1929 by Armand Givelet and Édouard Coupleux. This device used a paper tape reader and electronic circuits to manipulate sound, creating an orchestra of four voices [source: Rhea]. The first time the term "synthesizer" was used to describe an instrument was in 1956, with the release of the RCA Electronic Music Synthesizer Mark I, which used tuning forks and punched paper tape to produce sound through loudspeakers [source: Apple].
Robert Moog is often regarded as the father of the modern synthesizer. Moog, an American electrical engineer, initially worked on building electronic instruments like theremins. In the early 1960s, after meeting musician Herbert Deutsch, Moog began working on the first commercially available synthesizer. Released in 1964, Moog's 900 Series Modular Systems resembled large mainframe computers, with a network of cables used to 'patch' the various modules together to produce a full sound. These sounds could be both sequenced and performed in real time.
Originally marketed to academics and experimental musicians, synthesizers were met with resistance early on. "As a salesperson, I went into music stores where I was practically thrown out, and I was told that [the synthesizer] wouldn't be a musical instrument," says Rhea, who spent many years working with Moog in various roles [source: Rhea]. However, in 1968, the Grammy-winning album "Switched-On Bach" by Wendy Carlos demonstrated the musical potential of synthesizers to a wider audience. Over the years, bands like Parliament-Funkadelic, the Mahavishnu Orchestra, and Emerson, Lake, and Palmer began incorporating synthesizers into their music. The Minimoog, which packed elements of the larger modular systems into a more compact and affordable instrument, put more than 13,000 synthesizers into the hands of performing musicians. Even after the introduction of digital synthesizers, musicians continue to celebrate Moog and his creations at the annual Moogfest in Asheville, N.C. [source: Pareles].
Turn the page to learn why digital synthesizers largely replaced their analog predecessors.
Alongside Robert Moog, Don Buchla and Alan R. Pearlman were key figures in the development of the modern synthesizer. Buchla's 100 Series synthesizers, which were released around the same time as Moog's first models, used pressure-sensitive touch plates to control their modules. Pearlman's ARP 2500 and 2600, more compact takes on the modular synthesizer, came onto the scene in the early 1970s. While Buchla's instruments found favor with avant-garde musicians and academics, Pearlman's models became popular with rock bands. However, neither of these creations achieved the widespread popularity of Moog's synthesizers.
Going Digital
The affordability of digital synthesizers has made it possible for musicians around the world to explore and create unique sounds. It's no accident that most synthesizers available today are digital. It's also not because their technology is inherently superior to analog, as Rhea explains (though he concedes that digital synthesizers often offer a more stable pitch). Rather, it was simply an economic decision: Digital instruments could be produced and sold at a much lower cost. "The average musician playing at a Ramada Inn is not going to buy something that costs $15,000," Rhea says [source: Rhea].
The internal technology of a digital synthesizer marks a distinct departure from its analog predecessors. Digital synthesizers rely on processors and algorithms programmed into the devices, which interpret binary code and convert it into sound waves. Digital music research began as early as 1957, when Max Mathews of AT&T's Bell Laboratories wrote Music I, the first computer program capable of playing a musical piece [source: Schofield]. Commercial digital synthesizers first appeared in the 1980s, with the Yamaha DX7, released in 1983, becoming a popular early model. Over time, digital synthesizers became essential tools for producers and musicians creating hip-hop, pop, rock, and electronic music. Composers also use synthesizers to score films, whether crafting initial sketches to later be filled out with live instruments or creating complete soundscapes with synthesizers alone.
Over time, digital synthesizers have evolved into various forms – some are external devices that can be physically connected to desktop computers, while others are software programs that rely entirely on the computer’s hardware to function. (A virtual analog synthesizer, for instance, mimics the interface of an analog synthesizer, complete with knobs, dials, and a keyboard, but uses digital technology for its operations.) Soon, digital synthesizers were joined by other technologies: MIDI (musical instrument digital interface), introduced in 1983, allows synthesizers to connect with sequencers, samplers, digital audio workstations like Avid’s Pro Tools and Apple’s Logic, drum machines, and a variety of other electronic music tools and software.
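MIDI messages carry note numbers rather than frequencies; the receiving synthesizer converts them using the standard equal-temperament relationship, in which note 69 is concert A at 440 hertz and each semitone step multiplies the frequency by the twelfth root of two. A one-line sketch of that conversion:

```python
def midi_to_freq(note):
    """Convert a MIDI note number (0-127) to a frequency in hertz.
    Note 69 is concert A (440 Hz); each semitone scales by 2**(1/12)."""
    return 440.0 * 2 ** ((note - 69) / 12)
```

So note 60, middle C, comes out to roughly 261.63 hertz, and note 81 (an octave above concert A) to exactly 880.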
Synthesizers have made it possible for anyone with the desire to make music to do so. While this has reduced the barriers to creating music, it also means that even your neighbor, whose voice may sound like broken glass, can produce songs just as easily as a Juilliard-trained singer. And that's the essence of synthesizers, according to Rhea: "the democratization of music, with the concomitant horrors and wonders" [source: Rhea].
