Television has experienced remarkable and ongoing change since it first emerged in the mid-20th century.
You're comfortably watching your favorite show on a sleek 50-inch HD TV, but this wasn't always the case. TV has undergone dramatic shifts in its relatively brief history. It began small: in 1948, only 0.4 percent of U.S. households owned a TV. Yet radio listeners soon made the switch, and within seven years, TV ownership skyrocketed to 55 percent [source: Johnson]. By 2014, TVs were in 116 million homes [source: Nielsen]. And remember when TVs didn't even have color? (You probably don't.)
These transformations aren't limited to screens and boxes. From production methods and equipment to financing and writing, significant advancements have been made since the early days. Keep reading to discover how TV today is vastly different from what your parents watched.
10: The Emmy Awards
Industry awards have played a crucial role in determining the longevity of TV shows.
And the award goes to... well, the winner. In 2009, an Emmy Award trophy cost between $300 and $400 to produce, yet its true value lies far beyond its golden exterior. Emmy Awards hold significant power over the careers of the individuals, networks, and shows that earn them. This influence took hold early in TV's history: by 1950, the Emmys had expanded beyond California to honor productions from all over the U.S.
A show with modest ratings but critical acclaim from the Academy may receive renewed attention from its network. For example, when "30 Rock" won the Emmy for Outstanding Comedy Series in 2007, NBC began to recognize its potential for long-term success. The cast even got upgraded offices! Although the show didn't boast extraordinary ratings, its 103 nominations and 16 wins ensured it stayed on the air [source: The Hollywood Reporter].
An Emmy win can open doors for emerging or supporting actors. When Katherine Heigl received the award for her role in "Grey's Anatomy" in 2007, the film industry quickly took notice. The honor also carries significant prestige for smaller networks. In 2002, Michael Chiklis won the Outstanding Lead Actor award for his role in the gritty series "The Shield." That victory validated FX's reputation for quality programming and attracted more advertisers. Later, AMC proved itself competitive with the major networks when "Mad Men" and "Breaking Bad" consistently won Emmys.
9: The Nielsen Ratings
Nielsen's tracking of American viewing habits is the primary gauge of success in the industry.
You might have an incredible show, but is anyone watching it? No worries, Nielsen's got it covered. They can tell you who's tuning in, how long they're watching, and what snacks they're munching on (well, not yet!). The Nielsen Television Index began gathering data in 1950, using a meter attached to the TV to track what's being viewed and transmit the information to a central computer. Nielsen families are selected nationwide to match various demographic and racial breakdowns.
These TV ratings do more than just decide which shows get renewed; they're also crucial for advertising. Higher ratings mean advertisers are willing to spend more to get their commercials in front of a large audience. But it's not just about numbers; demographics matter. Want to advertise a luxury car? You don't want to waste your ad dollars on kids. Targeting the right audience is key. And advertising is everywhere – thanks to all those commercial breaks, an average "hour-long" drama contains only about 40 minutes of actual content [source: Jacob].
8: Commercials
There’s no such thing as "free" network TV – it’s all funded by advertisers eager to reach audiences.
The first television commercial, a 10-second Bulova watch ad, aired in 1941, but TV ads didn't truly make their mark until the 1950s [source: Jacob]. Would people really stop listening to radio – and its ads – and switch to television? Advertisers weren't sure. Should they simply add visuals to radio spots, or was there a different approach? As the decade progressed, Americans became enamored with television, and advertisers followed suit. Seeing truly was believing: Consumers showed far higher brand recognition for TV ads than for radio ones. Companies started sponsoring entire programs, like "Kraft Television Theatre" and "The Colgate Comedy Hour," which were expensive but showcased the sponsor prominently.
NBC introduced the "magazine concept" to solve the cost problem: commercials from various companies, each lasting one to two minutes, were scattered throughout a show. Sound familiar? By the 1960s, this became the norm. Producers soon had another reason to diversify advertisers: in the late 1950s, a scandal involving TV quiz shows emerged. Some single-sponsor shows, like "Twenty-One" and "The $64,000 Question," rigged their results. The public was outraged, and the era of single-sponsored shows came to a close. Looks like advertising ethics were never part of the quiz questions.
7: Television News
In the 1950s, news anchors Edward R. Murrow, Walter Cronkite, and Lowell Thomas became household names.
Early television took a serious tone. In the early 1950s, news broadcasts were brief and often accompanied by newsreel footage. Radio broadcasters transitioned to TV, becoming the face, not just the voice, of the news. These highly respected figures brought significant news stories into American homes. In 1954, for example, Edward R. Murrow began covering the McCarthy anti-communist hearings, helping bring the investigation to an end.
Evening news programs became a cornerstone in many American households. In 1963, Walter Cronkite anchored CBS's first 30-minute nightly newscast. NBC's "Huntley-Brinkley Report" expanded to match later that year, while ABC didn't join the half-hour race until 1967. Also in 1963, a historic special news broadcast occurred: the funeral of President John F. Kennedy. More than 90 percent of U.S. homes with televisions tuned in [source: Audio Engineering Society].
6: Film Moves In
Imogene Coca and Sid Caesar performing a skit on the live-broadcast "Your Show of Shows" in 1953.
"Live from New York, it's Saturday Night!" That iconic NBC late-night show was unconventional for the 1970s, but the early TV era thrived on live performances. Forget a line? Tough luck – the cameras were rolling and the audience was watching. Variety shows like "Your Show of Shows" with Sid Caesar and Imogene Coca embraced the unpredictable thrill of live broadcasting. And early TV dramas, though modestly staged, boasted real depth, with notable writers like Paddy Chayefsky and Rod Serling and actors such as Paul Newman and Angela Lansbury.
Despite the excitement of live television, filmed shows began to replace live broadcasts as the 1950s progressed. The ability to pre-record allowed new genres such as police, courtroom, hospital, and mystery dramas to flourish. Directors could experiment with various camera angles, and productions weren't confined to studio walls. It wasn't just about creativity; there was also a financial motive. Once a series was filmed, it became eligible for syndication, making it possible to catch missed episodes of shows like "Petticoat Junction" in reruns.
5: Saturday Cartoons
"The Flintstones" made history by airing in prime time, an unusual move for a show typically reserved for Saturday mornings.
TV's appeal isn't just for today's kids. By 1951, television already offered 27 hours of children's programming each week [source: Museum of Broadcast Communications]. As on radio before, action-adventure shows like "Lassie" and "Sky King" found success. Puppets also captured attention, with shows like "The Howdy Doody Show" and "Kukla, Fran and Ollie." Initially, children's programs were brief, typically airing in the late afternoon or evening. By the mid-1950s, Saturday mornings had become the ultimate "kid time."
However, during the 1960s, most live-action children's shows were replaced by animated series. The reason? Economics: animation was much cheaper to produce than live action. These cartoons were churned out using an assembly-line approach, with simple movements and limited animation. As a result, cartoons soon dominated the Saturday-morning lineup. In 1960, "The Flintstones," a cartoon about prehistoric families, debuted on Friday nights. But don't mistake it for a kids' show – this primetime animated series was designed for the whole family.
4: Pay TV
Viewers have become increasingly familiar with the idea of paying for "premium" content.
While broadcast television shows may not have been flawless, they were at least free. That all shifted in 1972, when HBO, the first pay network, launched. Before then, the only options for viewers were the big three networks, PBS, and local independent channels. Suddenly, uncut, commercial-free movies were available at home. Over time, boxing events, comedy specials, and original shows followed. HBO's reach grew slowly, though; it took years for the service to become available across the country. Atlanta's WTBS joined the cable revolution soon after, and by the end of the 1970s, cable had 16 million subscribers [source: California Cable].
But how could the new system expand further? Deregulation legislation paved the way. The 1984 Cable Act spurred investment in pay television. Growth took off, and by the end of the 1980s, 53 million homes had cable, according to the California Cable and Telecommunications Association. But that was just the beginning. Satellite networks expanded in the 1990s, and by the dawn of the 21st century, 70 percent of U.S. households were enjoying cable – and much more.
Other options also emerged. Consumers gained the ability to stream television from various platforms, such as Amazon Prime, Netflix, and Hulu. Non-broadcast shows began to earn accolades, with series like "Homeland," "Breaking Bad," "Boardwalk Empire," "House of Cards" and "The Newsroom" dominating the 2013 Emmys for drama [source: Hughes]. This was a groundbreaking achievement, especially considering non-broadcast shows were excluded from competing until the late 1980s. Once they were in the game, anything went.
3: Recording Devices
The Hopper is a satellite receiver with multiple tuners, offering high-definition programming and DVR services.
"I'm sorry, you'll have to miss your show. We have to attend your uncle's birthday party." How many times did special events, work commitments, or schedule conflicts cause people to miss broadcast TV shows? Sure, reruns were available, but they never had the same excitement as new episodes. With the advent of the VCR in the 1970s, viewing habits changed. By the mid-1980s, a third of all U.S. households owned a VCR [source: Gendel]. Taping actually favored broadcast TV: only 25 percent of recordings came from cable channels. HBO subscriptions dropped, while broadcast viewership grew by half a million households.
The TiVo digital video recorder emerged in 1999 [source: TiVo]. In general, DVRs led to more people watching scheduled shows – just not always at the scheduled time. Viewers enjoyed watching shows at their own convenience, though they had to be mindful of spoilers. The advertising world also faced new challenges. Since the cost of commercials depends on the number and demographics of viewers, how should DVR recordings be counted? When does watching "later" become too late?
2: Reality TV
The reality series "Millionaire Matchmaker" airs on Bravo, a cable channel.
Keeping up with the drama of "The Voice" and the antics of the "Real Housewives" in some sort of "Amazing Race"... or something like that. Reality TV has undeniably changed the landscape of television. The pioneer of the genre, MTV's "The Real World," first aired in 1992. While the cast used their own words, the show's creators shaped the narrative, offering the network's version of what "reality" looked like. To ensure maximum drama, producers often manipulated situations.
In 2000, competition-based shows like "Survivor" and "Big Brother" emerged. Other reality TV formats quickly followed, including dating shows like "The Bachelor" (2002), informational programs such as "The Dog Whisperer" (2004), makeover series like "Queer Eye for the Straight Guy" (2003), lifestyle shows like "The Biggest Loser" (2004), and talent competitions like "America's Got Talent" (2006).
Not only were reality shows wildly popular, but they were also inexpensive to produce. By 2009, the budget for a typical one-hour drama episode ranged from $1 million to $2 million, and could soar far higher for series with big-name casts or elaborate sets. In comparison, a reality show could be produced for a much more modest $100,000 to $500,000 per episode [source: Gornstein]. However, the tide may be turning. While the number of unscripted shows has risen, fewer are achieving lasting success [source: Hibberd]. That's the harsh truth of reality TV.
1: No TV Required
Even those who watch shows on their laptops or smartphones still call it "watching TV."
To create a TV show, you typically need a camera, cast, crew, and a set – but not necessarily a traditional TV screen. In this era of advancing technology, you don't have to rely on a TV to watch your favorite shows. With the advent of smartphones, tablets, and computers, television has become portable. Streaming services, including network apps, have made it even more accessible. Phones and computers are the preferred devices, while the popularity of tablets is on the decline [source: Technalysis]. Computers serve many functions, and phones are compact enough to carry anywhere.
For many viewers, cutting back on traditional TV can also lead to savings. Free streaming options like Hulu and Crackle are available, and while some services do require a subscription, they still cost far less than a cable bill. In 2011, the average cable bill was around $86 per month and was expected to rise. However, for that same amount, you could enjoy an entire year of Hulu Plus or Netflix [source: O'Connor]. Portable, cost-effective – what's not to love?
