Include greens in your diet. Ensure you get eight hours of sleep each night. Stay physically active.
Numerous health adages are so ingrained in our minds that we rarely pause to verify their accuracy, and age-old tales handed down through generations often blur the line between fact and fiction.
Here’s a compilation of 10 widespread health myths. Surprisingly, none of them hold any truth.
10. The Myth That We Only Use 10 Percent of Our Brain Capacity

Tipping the scales at just over 1.4 kilograms (3 lb), the human brain houses roughly 86 billion neurons. These neurons communicate across an estimated 100 trillion or more synapses, facilitating the transfer of information.
The brain is divided into three main regions—the cerebrum, the cerebellum, and the brain stem. The cerebrum, making up about 85 percent of the brain, handles advanced cognitive functions that define human intelligence. Below it lies the cerebellum, which manages coordination and balance. Lastly, the brain stem, linked to the spinal cord, oversees involuntary processes like breathing and digestion.
Imagine if all this complex activity only utilized 10 percent of the brain’s capacity—how astonishing would that be?
Unfortunately, this notion is entirely false. The origin of the claim that we use only 10 percent of our brains is unclear, but it appears to have emerged during the late Victorian era. In the 1890s, Harvard psychologists William James and Boris Sidis pointed to Sidis’s son William, a child prodigy with a reported IQ approaching 300, as evidence that humans possess untapped intellectual potential, suggesting we simply need to strive harder.
Quite absurd, isn’t it?
Early 20th-century studies showed that rats with brain injuries could relearn specific tasks, a finding some took as weak evidence that the human brain harbors vast untapped reserves. The claim is baseless and unsupported by modern science: brain imaging shows activity throughout the brain, and simply reading this paragraph engages far more than 10 percent of yours. Oh well.
9. The Myth That Gum Takes Seven Years to Digest

Many of us recall the childhood warning that a swallowed piece of bubblegum would sit in your stomach for seven years before digesting. If you’re still waiting, rest assured: this “fact” is entirely false.
Though the origins of this myth remain unclear, it does highlight a truth about chewing gum: it’s indigestible. According to the Food and Drug Administration, gum is classified as a “nonnutritive masticatory substance,” meaning it’s not considered food.
While swallowing gum isn’t recommended, the process isn’t particularly dramatic. Sweeteners and other additives may be absorbed, but the main component, an elastomer, passes through your digestive system intact. Eventually, it exits your body through the excretory system, often unchanged.
For a foreign, inedible object to become lodged in your digestive tract, it typically needs to be larger than a U.S. quarter. Smaller items pass through effortlessly, exiting the body without issue.
8. The Myth That Chocolate Causes Acne

Navigating puberty, high school, and adolescence is challenging enough without being told that indulging in chocolate directly causes acne. It’s disheartening to think that the one treat making those years tolerable could also lead to unsightly breakouts.
Fortunately, this age-old belief is unfounded. Consuming chocolate does not trigger acne. However, diets high in fat and sugar can boost sebum production, leading to oilier skin. Additionally, such foods may increase skin inflammation.
But does chocolate, or any specific food, directly cause breakouts? Absolutely not. While a diet heavy in fatty, sugary foods can disrupt blood sugar levels and indirectly influence acne, no single food can be blamed for teenage pimples.
7. The Myth That Carrots Enhance Vision

The belief that carrots can sharpen your eyesight stems from a fascinating yet misleading piece of wartime propaganda. While carrots are rich in beta-carotene, which converts to vitamin A during digestion and supports eye health, their impact on vision is often overstated.
But do they truly enhance nighttime vision?
Not at all. During World War II, the British Ministry of Information spread the idea that Royal Air Force pilots consumed massive amounts of carrots, attributing their success in shooting down German planes at night to this diet. In reality, no amount of carrots can grant night vision.
The British were actually using advanced technology—airborne interception radar—to detect enemy bombers. It’s doubtful that German intelligence believed the British pilots’ prowess was due to a carrot-heavy diet.
Despite nearly a century passing, many in the Western world still cling to the belief that consuming large amounts of carrots will improve their eyesight. Unfortunately, no matter how many carrots you eat, night vision remains an unattainable dream.
6. The Myth That Humans Have Only Five Senses

This seems straightforward, doesn’t it? Think again. The idea that humans possess only five senses traces back to Aristotle, the Greek philosopher who first categorized them. You likely memorized them in school: sight, hearing, smell, touch, and taste.
While these are indeed five of your senses, they aren’t the only ones you possess.
To begin, let’s define what a “sense” is. Essentially, it’s a sensory mechanism that detects specific stimuli. Each sense is triggered by a distinct phenomenon.
In reality, the sense of touch encompasses far more than a single sensation. Neurologists often divide it into distinct perceptions, such as pressure, temperature, and pain.
Depending on the expert, humans may have up to 33 senses. These include sensations like blood pressure and balance, which you might not have considered as traditional “senses.” So, the next time someone mentions a sixth sense, you can confidently say you have 33. They might not understand, but you’ll know exactly what you mean!
5. The Myth That Tongue Rolling Is Genetically Determined

Many of us recall being taught in biology class that the ability to roll our tongues was purely genetic. While most people can roll their tongues, the belief that it’s a dominant genetic trait—passed down from parents—has been widely accepted.
However, the truth is more nuanced. Unlike many myths about the human body, this one has a clear origin. In 1940, geneticist Alfred Sturtevant published a study claiming that tongue-rolling ability was inherited through a dominant gene.
Sturtevant’s excitement about his discovery didn’t last long, though. It soon became clear that in some pairs of identical twins, one could roll their tongue and the other couldn’t. Since identical twins share the same genes, this quickly discredited Sturtevant’s findings, and he himself admitted the flaw in his study.
Yet, decades later, this myth continues to be taught in classrooms worldwide. Now that you know the truth, you can help stop this misinformation the next time someone brings up this quirky trick.
4. The Myth That Most Body Heat Escapes Through Your Head

From the myth that we only use 10 percent of our brains to the belief that we lose most of our body heat through our heads, it seems our skulls are unfairly targeted. One theory suggests this myth originated from 1950s studies where subjects exposed to cold temperatures lost significant heat through their heads.
The issue with this research is that participants were fully clothed except for their heads. Naturally, if only your head is exposed, you’ll lose more heat through it.
Recent studies, however, reveal that under normal conditions, your head doesn’t lose an excessive amount of heat. About 7 percent of your body heat escapes through your head, which aligns with the fact that your head makes up roughly 7 percent of your body’s surface area.
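The surface-area argument above is simple proportionality and can be sketched in a few lines of Python. The head’s 7 percent share comes from the figures above; the other body fractions are illustrative assumptions loosely based on the clinical “rule of nines,” not measured values.

```python
# Toy model: assume heat loss scales only with exposed skin area,
# so each body part's share of heat loss equals its share of surface area.
# The head's 0.07 fraction matches the ~7 percent cited above; the rest
# are illustrative assumptions, not clinical measurements.
body_area_fractions = {
    "head": 0.07,
    "torso": 0.36,
    "arms": 0.18,
    "legs": 0.36,
    "hands_feet": 0.03,
}

def heat_loss_share(exposed_parts, fractions=body_area_fractions):
    """Fraction of total heat lost through the exposed parts,
    under the assumption that loss is proportional to exposed area."""
    exposed = sum(fractions[part] for part in exposed_parts)
    total = sum(fractions.values())
    return exposed / total

print(round(heat_loss_share(["head"]), 2))  # bare head only: ~0.07
```

Under this toy model, an uncovered head loses about 7 percent of total body heat, exactly its share of surface area, which is why the head is no special heat chimney.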
So, treat your head just like any other part of your body. When it’s cold, cover it up, and you’ll be perfectly fine.
3. The Myth That Shaving Makes Hair Grow Back Darker

Almost everyone has heard this claim: shave your beard—or for women, the hair on your legs—and it will grow back thicker and darker. This advice, however, is entirely untrue.
This myth has been debunked for decades. One of the earliest modern studies on the topic dates back to 1928. Men in the study shaved uniformly using the same shaving cream, and their regrown hair was analyzed for changes in growth rate and thickness. The result: shaving had no measurable effect on either.
Much of this myth stems from perception. As hair regrows, our preconceived notions can influence how we see it. Additionally, shaving or waxing hair leaves behind a blunt tip, making it appear thicker and more noticeable, which can create the illusion of increased growth.
Any perceived changes in hair growth speed are likely due to hormonal shifts, not shaving. Otherwise, it’s purely a trick of the mind!
2. The Myth That Knuckle Cracking Causes Arthritis

Arthritis isn’t a single disease but an umbrella term for various conditions involving joint pain, swelling, and inflammation. It’s widespread, impacting over 50 million adults and 300,000 children in the U.S. Symptoms can range from mild discomfort to severe, chronic pain.
While avoiding activities linked to arthritis is wise, cracking your knuckles isn’t one of them. Despite common belief, knuckle cracking doesn’t contribute to arthritis.
But what exactly is knuckle “cracking”? The popping sound comes from gas bubbles rapidly forming and collapsing in the synovial fluid that lubricates your joints. Despite how it sounds, a comprehensive analysis by Harvard Medical School found no connection between knuckle cracking and arthritis.
That said, you might still want to break the habit. Frequent knuckle cracking has been linked to reduced grip strength, and let’s face it—it’s irritating to those around you.
1. The Myth That Hair and Fingernails Continue Growing After Death

This “fact” about the human body is undeniably creepy: the notion that hair and nails, both made of keratin, keep growing after death. It is, however, completely false.
After death, the body quickly dehydrates, causing the skin to shrink and wrinkle. This creates the illusion that hair and nails are still growing, when in reality, the body is simply contracting. To prevent this, morticians often apply moisturizers to keep the skin from shriveling.