We often perceive science as a field of undeniable truths, where something is either proven or it’s not. However, the reality is that the universe, when examined through the lens of science, is far more intricate and unpredictable. At times, facts themselves seem to defy consistency, with results from one study sometimes contradicting those of another. This article explores a few of those puzzling contradictions within the scientific realm.
This is an appropriate moment to remind everyone of the argumentum ad populum fallacy: the idea that something is true simply because the majority believes it. The term 'scientific consensus' is frequently used as evidence, but it doesn't guarantee truth. In fact, when someone resorts to the 'consensus' argument, it's worth scrutinizing their entire point, as relying on one logical fallacy might mean they're leaning on several others as well.
10. Beer: A Miracle Cure or a Dangerous Tonic?

Few things excite us more than learning that our guilty pleasures might actually have health benefits. What could be more captivating than headlines claiming that science has proven beer is good for us, encouraging us to drink more of it? Thankfully, this isn’t just clickbait. Real scientific studies have suggested actual health benefits tied to the world’s beloved hoppy beverage.
One study published in the International Journal of Endocrinology highlighted a potential link between silicon and bone health. The theory proposed was that silicon dioxide (SiO2) plays a role in the body’s ability to form bone tissue. Rats with a sufficient supply of silicon showed better calcium absorption in their bones compared to rats that were silicon-deficient. Silicon is found in various foods like grains, cereals, green beans, and, yes, beer. In short, beer may strengthen your bones.
Beyond silicon, beer contains other beneficial compounds. A paper in Mutation Research/Fundamental and Molecular Mechanisms of Mutagenesis discussed the impact of xanthohumol, a compound found in beer. This substance has been shown to protect the liver and colon from cancer-causing mutagens present in cooked food. In simple terms, beer might help in the fight against cancer.
Additional research on beer suggests that moderate consumption can reduce inflammation and help prevent kidney stones, while its silicon content may also offer protection against Alzheimer's disease. One might think beer is the ultimate health drink—except...
In 2018, an extensive global study was conducted to examine the health impacts of alcohol on individuals and populations. The research, involving 500 collaborators from 40 countries and 694 data sources, concluded that while beer may have some health benefits, alcohol still caused 3 million deaths worldwide in 2016. Alcohol was responsible for 12% of all deaths in men aged 15-49, and for the general population, it ranked as the 7th leading cause of death globally.
Dr. Emmanuela Gakidou, the senior author of the study, summarized the findings by stating, “The health risks associated with alcohol are massive. Our findings align with other recent research, showing clear and strong links between alcohol consumption and premature death, cancer, and cardiovascular issues.”
What is the safe amount of alcohol? Dr. Gakidou concluded, “Zero alcohol consumption minimizes the overall risk of health loss.” In other words, no amount of alcohol is entirely risk-free when it comes to premature death.
9. Coffee: A Double-Edged Sword for Glaucoma

A study published in the Journal of Agricultural and Food Chemistry examined the benefits of chlorogenic acid, a key compound found in raw coffee. The research showed that this acid can protect the eyes from retinal degeneration caused by glaucoma, aging, and diabetes. This protective effect can slow vision loss and even prevent blindness. In the study, mice exposed to nitric oxide (which induces retinal damage) were shielded by chlorogenic acid, whereas untreated mice suffered retinal damage.
Dr. Robert Bittel, chair of the American Osteopathic Association, remarked on the study, saying, “As with any study that presents commonly consumed foods as potential therapies, caution must be exercised so that the public understands both the positive and negative aspects of drinking coffee.”
While coffee may protect against glaucoma, it also has the unfortunate side effect of increasing the risk of developing the disease in some individuals. For the majority, this risk is not significant. However, a study in Graefe’s Archive for Clinical and Experimental Ophthalmology found that coffee worsens glaucoma in those who already have it. Other research showed that women with a family history of glaucoma, but who have not yet developed it, have an increased risk of the disease if they consume coffee regularly.
For some individuals, coffee serves as both the cure and the poison.
8. Stretching Before Exercise: Either a Hindrance or Irrelevant to Performance

For years, stretching before exercise was a standard practice, believed to enhance performance. It was so universally accepted that it became part of physical education curricula. However, when research finally started to examine its effects, the results were surprising. In one study, two groups of trained athletes ran one mile under different conditions: one group performed a series of six lower body static stretches, while the other group did not stretch. The outcome? The non-stretching group finished their mile significantly faster—about 30 seconds ahead of the stretching group. The study concluded, “Static stretching decreases performance in short endurance bouts… Coaches and athletes may face reduced performance after static stretching, so it should be avoided before short endurance activities.”
In a separate study published in Medicine & Science in Sports & Exercise, researchers sought to explore the broader effects of stretching beyond running. They had 20 participants perform a comprehensive set of stretches and warm-ups, targeting seven lower body areas and two upper body regions, alongside a control group. Afterward, participants underwent various tests, measuring flexibility, run times, vertical jumps, and agility. The study found that stretching had no measurable effect on athletic performance.
However, stretching did have a psychological effect. The participants who stretched believed it would improve their performance significantly more than those who didn't stretch believed of their routine. In reality, stretching boosted their confidence but produced no measurable improvement in performance. According to this study, stretching may help you feel better, but don't expect it to provide a competitive edge.
7. Nose Picking Is Harmful, But Eating Your Boogers Might Be Healthy

Despite being seen as a repulsive habit, nose picking is more common than we care to admit. A survey of 200 teenagers in India revealed that every one of them regularly engaged in rhinotillexomania (the medical term for nose picking). But it’s not just about social embarrassment—there are real health concerns. A study published in Infection Control & Hospital Epidemiology examined 238 healthy patients and 86 hospital staff members regarding their nose-picking habits. The results showed that frequent nose pickers had higher levels of the dangerous bacteria *Staphylococcus aureus* in their nasal passages.
Though around 30% of the population carry *Staph* bacteria without issue, it can become dangerous if it enters the body through a wound, potentially causing severe infections. This study underscores that nose picking increases the risk of introducing these harmful bacteria into the body, making it a health hazard.
But what if we didn’t just stop at picking our noses? What if we made use of what we find? One study titled “Salivary Mucins Protect Surfaces from Colonization by Cariogenic Bacteria” highlighted the protective role of mucins in the body. These mucus proteins help shield surfaces like teeth from harmful bacteria. Where can we find a rich supply of these protective mucins? In our dried nasal mucus—aka boogers. Not only do they help protect our teeth when consumed, but there’s evidence suggesting that eating them may prevent respiratory infections, stomach ulcers, and even HIV.
Friedrich Bischinger, an Austrian lung specialist, commented on this research, saying, “In terms of the immune system, the nose acts as a filter, collecting a great deal of bacteria, and when this mixture reaches the intestines, it functions just like medicine.”
Whether the benefits of eating boogers outweigh the risk of staph infections is ultimately a personal decision. However, the author of the study on mucus consumption also suggests that we could create a synthetic version of salivary mucus to achieve similar health benefits. In the future, we might be able to have our boogers and eat them too.
6. Chocolate: A Miracle Food That Might Be Ruining Your Health

Chocolate is one of the world’s favorite treats, with around 72 million metric tons consumed annually. Given its popularity, it's no surprise that chocolate is extensively studied, with countless reports about its potential health benefits. Scientific papers seem to never run out of ways chocolate could improve our well-being.
Some studies have found that chocolate can help prevent cardiometabolic disorders and cardiovascular diseases, enhance cognitive function in older adults, lower blood pressure, and even protect the skin from UV-induced erythema.
One study even demonstrated that chocolate can slow the progression of colon cancer in rats! In short, chocolate has a surprising number of health benefits to enjoy.
Despite the modest health benefits of chocolate, it's important to consider the significant drawbacks. Its high sugar and fat content can lead to obesity. One study found that for postmenopausal women, every 1 oz of chocolate consumed per week was associated with an additional 1 kg of weight gain over the course of 3 years. The more chocolate consumed, the greater the weight gain over time. This is concerning because obesity can lead to a host of health issues such as diabetes, hypertension, heart disease, cancer, cerebrovascular disease, and stroke.
While rats may benefit from chocolate in the fight against colon cancer, in humans, chocolate consumption appears to be linked to an increased risk of prostate cancer. As with many foods, chocolate has both positive and negative effects, depending on the context.
Alice H. Lichtenstein, a professor of nutrition science and policy at Tufts University, summed it up well when she said, “If you enjoy chocolate, the important thing is to choose the type you like the most and eat it in moderation because you enjoy it, not because you think it’s good for you.”
5. Self-Control: Depletable or Not?

The theory of ego depletion, widely tested in psychology, posits that self-control is a limited resource that can be exhausted. In the study that introduced this idea, students participated in several tasks requiring self-control. They were shown two types of food: radishes and chocolate cookies. The paper notes, “The chocolate chip cookies were baked in the room in a small oven, and as a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.” Some participants were told to eat only the radishes, some could only eat the cookies, and another group had no food at all. The radish-only group had to exert self-control to resist eating the cookies. Afterward, they were given an unsolvable puzzle, but they weren’t told it was impossible. They were provided with a bell to signal when they wanted to give up. Continuing with the puzzle required significant self-control, as the task offered no reward.
The study ultimately revealed that participants who first had to exercise self-control by eating only radishes and avoiding the tempting cookies were more likely to give up earlier on the impossible puzzle. This suggests that their self-control had been somewhat depleted by the initial task, leaving them with less to rely on during the second. This concept of ego depletion has been replicated across various conditions. Some studies showed that having people make purchasing decisions or navigate interracial interactions could deplete their self-control. Other labs even tested for ego depletion in dogs and found evidence of it.
In contrast, a more recent study aimed to definitively test ego depletion with a task that required self-control but wasn’t influenced by factors like personal preferences (such as someone disliking chocolate chip cookies) or cultural influences. The study involved 24 labs from countries like Australia, Belgium, Canada, France, Germany, Indonesia, the Netherlands, New Zealand, Sweden, Switzerland, and the United States. Instead of food, participants played digital games that required impulse control to find correct answers. This study concluded that there was no significant performance drop from task to task that could be linked to self-control depletion.
4. Red Meat: Unhealthy, But We’re Not Quite Sure Yet

From a summer barbecue rack of ribs to a hotdog at a baseball game, red meat is a dietary staple for many, with its rich flavor elevating meals to a new level. But despite its popularity, red meat has always been met with caution by the scientific community. Studies have shown that processed red meats, such as hotdogs, increase the risk of glioma, a brain and spinal cord tumor. Other research has linked red meat consumption to higher risks of colorectal cancer. Additionally, eating red meat raises levels of trimethylamine N-oxide, a substance tied to heart disease. These findings have led health organizations to recommend limiting red meat, particularly processed varieties.
However, a recent and controversial meta-analysis published in the *Annals of Internal Medicine* examined multiple studies on the topic and argued that there isn’t enough scientific evidence to justify the widespread advice to cut back on red meat. The authors concluded that the evidence for the potential harmful health effects of meat consumption was “low to very low” and that reducing red or processed meat by three servings a week resulted in “very small and often trivial” reductions in health risks. Their conclusion wasn’t that red meat is healthy, but rather that the proof is still insufficient to recommend cutting it for health reasons.
3. Eggs: Do They or Don’t They Contribute to Cardiovascular Disease?

Eggs are a dietary staple for much of the global population, with 73% of adults consuming them regularly. Given their widespread presence in our diets, the health effects of eggs have been extensively studied. However, the scientific findings regarding their impact remain inconclusive.
One of the most significant health concerns surrounding eggs is the 185 milligrams of cholesterol found in the yolk. Certain types of cholesterol are known to raise the risk of heart disease, and some studies support this worry. For instance, a 2019 study that followed participants over 17.5 years found that each additional half egg consumed daily was associated with a 6% higher risk of developing cardiovascular disease and an 8% higher risk of death from any cause.
Yet, the science is not always so clear-cut. In the same year, another study examining eggs and cardiovascular risk found no statistically significant connection. Maria Luz Fernandez, a professor of nutritional sciences at the University of Connecticut and an author of the study, explained that while eggs contain high levels of cholesterol, they are low in saturated fat. She remarked, 'While the cholesterol in eggs is much higher than in meat and other animal products, saturated fat increases blood cholesterol. This has been demonstrated by lots of studies for many years.'
In other words, the cholesterol found in eggs might not be the culprit we once believed it to be.
'There are systems in place so that, for most people, dietary cholesterol isn’t a problem,' said Elizabeth Johnson, research associate professor of nutritional sciences at Tufts University.
2. The Perks and Pitfalls of Waking Up Early

'The early bird gets the worm' is a familiar saying often spoken by those who rise before the sun, fully awake and productive. But there’s evidence supporting their habit. One study published in the Journal of Applied Social Psychology surveyed 367 university students about their sleep patterns and productivity. The questions included statements like, 'I spend time identifying long-range goals for myself' and 'I feel in charge of making things happen.'
The results showed that 'morning people' were indeed more proactive than those who favored late nights. Those who had small variations in their wake times between weekdays and weekends also displayed more proactive behaviors. The survey's author remarked, 'When it comes to business success, morning people hold the important cards. My earlier research showed that they tend to get better grades in school, which get them into better colleges, which then lead to better job opportunities. Morning people also anticipate problems and try to minimize them, my survey showed. They’re proactive. A number of studies have linked this trait, proactivity, with better job performance, greater career success, and higher wages.'
However, a contrasting study published in The Journal of Clinical Endocrinology & Metabolism, which included 447 men and women aged 30-54 who worked at least 25 hours a week outside their homes, found that their early rising didn't align with their natural circadian rhythms.
Patricia M. Wong, MS, from the University of Pittsburgh, commented on this study, saying, 'Social jetlag refers to the misalignment between an individual’s internal biological circadian rhythm and their externally imposed sleep schedule. Some previous studies have linked social jetlag with obesity and certain cardiovascular issues. However, this research is the first to build on that work, revealing that even in healthy, working adults with less extreme mismatches in sleep schedules, social jetlag can still lead to metabolic problems. These metabolic issues can ultimately contribute to obesity, diabetes, and cardiovascular diseases.'
1. Video Games: Boosting or Hindering Children’s Social Skills

The age-old frustration for gamers is hearing the saying, 'video games rot your brain.' This claim has persisted since the early days of video games, but it was rarely backed by any scientific evidence. Over time, however, researchers began focusing on studying the effects of video games on children. One study published in Social Psychiatry and Psychiatric Epidemiology focused on children aged 6 to 11. The research measured how much time each child spent playing video games daily and compared this to data from questionnaires completed by their parents, teachers, and the children themselves. The study also evaluated the children’s academic performance. After adjusting for certain factors, the study found that more video game play was associated with a 1.75 times greater likelihood of high intellectual functioning, 1.88 times higher odds of strong overall school performance, and fewer relationship issues with peers.
Katherine M. Keyes, PhD, assistant professor of Epidemiology at the Mailman School of Public Health, remarked on the findings, stating, 'Video game playing is often a collaborative leisure activity for school-aged children. These results suggest that children who play video games regularly may develop stronger social bonds with their peers and be better integrated into their school communities. However, we caution against over-interpreting these findings, as setting appropriate limits on screen time remains a crucial part of parental responsibility for ensuring overall student success.'
She was right to emphasize parental involvement in managing screen time, as another study examined the long-term impact of video games on children over six years, starting when the children were six years old. This study followed 873 Norwegian schoolchildren, with their parents reporting their gaming habits every two years. Teachers also evaluated the children's social skills based on factors like following instructions, behavior control, and confidence in social settings.
The findings revealed that children with weaker social skills tended to play more video games later on, but gaming itself didn't generally lead to lower social skills over time, with one exception. Ten-year-old girls who spent a significant amount of time playing video games were found to be less socially adept by the age of 12 compared to those who didn't play as much. While video games have proven beneficial for many, they don't provide universal benefits. For some individuals, they can enhance social skills, but for others, they may have a diminishing effect.
