Certain trends persist for years—or until we finally recognize the dangers they pose. From the recent anti-vaccination wave to the newer crazes like diet pills and vaping, we’ve made numerous errors that, in hindsight, were clearly unwise.
10. Avoiding Vaccinations

The anti-vaccination movement continues to gain traction despite its flawed arguments. Supporters highlight rare, temporary side effects like seizures and falsely claim a link between vaccines and autism, a theory based on a single discredited, retracted study. This movement does more harm than good, leading to preventable illnesses for countless individuals.
Measles was officially declared eliminated in the United States in 2000. However, the Centers for Disease Control and Prevention reported 288 cases of this potentially fatal illness across 18 states in 2014. Many cases arose when unvaccinated individuals traveled abroad, contracted the disease, and brought it back. The Amish community, exposed through unvaccinated missionaries, was among the hardest hit by that year's outbreak.
Vaccines are life-saving. Over the past two decades, they have averted an estimated 732,000 deaths.
9. Sporting Muslin

Muslin, a flexible cotton fabric originating in the Middle East, gained popularity in Europe during the 17th century. When it reached France, it introduced a lethal fashion craze.
Fashion choices were already constrained by sumptuary laws, which dictated what French citizens could wear. Women began donning lightweight, translucent muslin dresses inspired by ancient Greek attire. The aim was to emulate the appearance of Greek statues: pristine, white, and marble-like. These muslin dresses were often layered over tights and sometimes worn damp to accentuate the body's contours.
While it may have seemed like a harmless, albeit revealing, trend, it led to what became known as muslin disease. Women wore these thin, damp garments year-round, even during winter. When influenza struck Paris in 1803, it reportedly sickened tens of thousands, many of them women whose health had been undermined by their fashionable yet perilous clothing.
8. Asbestos

Many buildings still contain asbestos, but this fire-resistant material isn’t a modern invention. Its use dates back to 4000 B.C., when its slow-burning properties made it ideal for candle wicks. Ancient Egyptians used it to preserve their dead, while Greeks wrapped funeral pyres in asbestos cloth to keep human ashes separate from the fire’s remnants.
Across Europe, clay cooking pots were often lined with asbestos, and in ancient Rome, asbestos cloths could be cleaned simply by tossing them into flames. Charlemagne famously used asbestos tablecloths to prevent fires during his feasts, and Crusaders launched flaming tar wrapped in asbestos from their catapults. Once believed to be the wool of the mythical fire-dwelling salamander (a notion debunked by Marco Polo), asbestos was even favored by vendors of sacred relics for its ability to give wood an aged, weathered appearance, making it resemble fragments of the True Cross.
The hazards of asbestos have been recognized since ancient Greece, where miners wore masks crafted from animal membranes to avoid inhaling its fibers. However, it wasn’t until the 1970s, with the rise of mesothelioma cases, that the asbestos industry was finally reined in by bans and regulation.
7. E-Cigarettes

E-cigarettes deliver the same nicotine rush as traditional cigarettes but without the harmful tar. Even critics of vaping, who argue it might deter smokers from quitting nicotine entirely, concede it’s a safer alternative. Yet, users who believe they can inhale chemicals for a nicotine buzz without consequences are sorely mistaken.
The vapor from e-cigarettes may include chemicals such as formaldehyde and acetone, along with irritants for the eyes and respiratory system like propylene glycol. Although secondhand vapor contains lower levels of these substances compared to secondhand smoke, it still presents health hazards. These risks escalate when the e-cigarette is used at higher settings.
Beyond the dangers of inhaling vapor, e-cigarettes expose users to liquid nicotine, a highly concentrated toxin, and many have mishandled its storage or use. In a single month in 2014, poison control centers received over 200 calls after children accidentally ingested the liquid or got it on their skin. Pets are equally vulnerable; chewing on a single cartridge can cause seizures, cardiac arrest, or even death within 15 minutes, depending on the animal’s size. Users should store e-cigarette cartridges safely, away from children and pets, just as they would alcohol or other hazardous materials.
6. Dietary Supplements

Many people rely on their vitamin C and calcium supplements, believing them to be beneficial, but some of these so-called health aids can have the opposite effect.
In 2013, the Food and Drug Administration moved against several weight loss supplements after discovering they contained dimethylamylamine (DMAA), a stimulant linked to 86 reports of psychiatric problems, nervous system disorders, and deaths. Even after the risks were identified, one company continued producing the drug until the FDA intervened directly.
The FDA does not evaluate supplements before they are sold. Although proposals have been introduced in Congress to mandate such reviews, consumers currently have no choice but to trust manufacturers. Supplements may be ineffective at best or contain harmful substances at worst.
5. Radium Watches And Dials

Radium was first identified in 1898 by Marie and Pierre Curie. It forms naturally during the decay of uranium and, when purified, glows in the dark. In the early 1900s, this luminescence captivated the public, leading to its use in the earliest glow-in-the-dark watches. Soldiers in World War I trenches relied on these watches to tell time after dark, and radium was also applied to industrial dials, such as those on ship and airplane control panels, for visibility at night.
Most dial painters were young women tasked with painting around 250 watch dials daily. Many habitually shaped their paintbrush tips by licking them, while others used radium to streak their hair for a glowing effect. Over time, these women began to suffer severe health issues: teeth fell out, sores appeared, and the bones in their faces disintegrated.
In 1924, Harvard University and the US Radium Company conducted the first investigation into radium’s effects. Their study claimed the deaths of the factory’s young women were unrelated to radium. However, when the Consumers League of New Jersey intervened, aided by impartial doctors, they discovered radium was indeed hazardous, as were the factory’s working conditions. Turning off the lights revealed the women were coated in radium dust, glowing like the dials they painted. Their exposure was so extreme that they were exhaling radon gas with every breath.
4. Mercury

The expression “mad as a hatter” originates from the use of mercury in hat production during the 19th century. Initially, hatters used camel urine to separate fur from animal hides for felt, as the urea’s chemical properties helped detach the hairs. Eventually, manufacturers questioned the need for camel urine when human urine was more accessible, leading to a switch in materials.
Over time, it became evident that felt-makers with syphilis produced superior-quality felt. These workers treated their condition with mercury, which contaminated their urine. When this mercury-laced urine was applied to animal skins, it facilitated easier fur removal and caused less damage to the hides. This discovery prompted felt-makers to abandon urine entirely and adopt mercury nitrate.
The practice was outlawed in the United States in 1941—not due to its health risks, which had been recognized since 1874, but because mercury was required for weapons production during wartime, leading the government to reallocate it for military use.
3. Skin Care

Beauty may only be skin deep, but humanity has repeatedly gone to great lengths, often harmful, to achieve it. For centuries, people have prioritized beauty standards over safety, leading to dangerous practices.
The history of hazardous skin care dates back to feudal Japan, where lead- and mercury-based cosmetics were used to create the coveted pale complexion. These powders and paints remained in vogue until the 18th century, when concerns arose about their side effects, including lead poisoning and neurological damage.
Around the same time, Western beauty ideals shifted. Pale skin, once a symbol of an indoor, labor-free lifestyle, became associated with the working class confined to factories and mines. A tan, however, signified leisure and health, as it suggested time spent outdoors in the countryside.
Tanned skin grew increasingly desirable, especially after Coco Chanel popularized sunbathing. However, we now know that approximately 90 percent of skin cancer cases are caused by excessive sun exposure. Prolonged exposure also accelerates aging, weakens the immune system, and harms the eyes.
2. Carving Pumpkins

Halloween traces its roots to the Celtic festival of Samhain, celebrated on October 31 to mark the end of the harvest season. It was a time to prepare for winter, with bonfires lit to shield the living from wandering spirits. Over time, these large fires were replaced by smaller, safer versions contained within hollowed-out turnips.
When the tradition reached North America, turnips were scarce but pumpkins were plentiful, so pumpkins took over as the vessel for the flame. Carving these thick-skinned gourds introduced new risks: Consumer Reports notes that roughly one-third of Halloween injuries occur during pumpkin carving, ranging from minor cuts to severe tendon damage.
A research team from SUNY Upstate Medical University investigated the potential harm caused by various carving knives. Unable to test on live subjects for ethical reasons, they used cadaver hands and a hydraulic press to simulate carving. They measured the force required to cut a pumpkin versus the force needed to injure a hand. While knives marketed for pumpkin carving were generally safer than standard kitchen knives, caution is still essential to prevent turning the night into an unintended horror.
1. Cadmium Paint

Cadmium sulfide, a key ingredient in many yellow paints, was highly favored by artists such as Van Gogh, Monet, and Matisse. At the time, the long-term behavior of cadmium-based paints was unknown, and numerous works have since deteriorated as the cadmium sulfide breaks down. Even after it was discovered that the compound degrades and shifts color over time, its use continued.
While cadmium sulfide merely causes paintings to fade and discolor, cadmium itself is a known carcinogen that poses direct health risks. In 2010, McDonald’s faced backlash when cadmium was found in the painted designs on Shrek-themed drinking glasses, resulting in a recall of 12 million items.
