
Some scientific concepts are quite intricate, and we typically encounter them in school only once we’re ready to comprehend the complexities. Other ideas appear simple—things our prehistoric ancestors may have easily accepted. However, many of these notions took an unexpectedly long time to fully understand—germs, vitamins, and continental drift are just a few examples. It wasn't until the last century that humanity truly grasped these ideas.
It wasn't until the 1960s that scientists reached a consensus on plate tectonics

The continents move slowly across the Earth's surface. The motion is gradual, but there's a wealth of evidence, much of it literally buried in the rock, that they do. When tectonic plates collide, mountains rise and earthquakes strike. When they pull apart, we're left with continents that look like scattered puzzle pieces, each with matching fossils in areas that were once connected. Yet the theory didn't convince scientists until the 1960s, when enough data finally emerged to support a credible explanation for how the plates move.
It took a long time for vitamins to be properly identified

In the 1700s, European sailors supposedly realized that carrying citrus fruits was key to avoiding scurvy. But you might be surprised to learn that the very idea of a vitamin wasn't established until the 20th century. Even though a well-known experiment (James Lind's 1747 shipboard trial) should have shown these fruits were essential, scientists of the time interpreted the results as proof that scurvy was due to an imbalance in the body's "humors," not a missing nutrient in the diet. It wasn't until 1912 that a paper by the biochemist Casimir Funk suggested some nutritional diseases, like scurvy, "can be prevented and cured by the addition of certain preventive substances," which he named "vitamines."
We understood heartbeats, but the concept of blood circulation came much later

This discovery is one of the older ones on our list, tracing back to the 1600s (at least in European medicine), which is still surprisingly late given that people had been dissecting animals and tending the wounds of injured soldiers for centuries before.
While the ancient Greeks and medieval Europeans recognized that the heart beats, they believed it was simply pumping blood from the liver, where it was supposedly created, to the body tissues, where it was consumed. It wasn't until 1628, when William Harvey published his book De Motu Cordis, that anyone proposed blood actually circulates back from the body tissues to the heart, a model that finally explained the roles of both arteries and veins.
Uranus, Neptune, and Pluto came much later on the scene

Before telescopes were invented, we could only observe five planets: Mercury, Venus, Mars, Jupiter, and Saturn. Uranus was first spotted by William Herschel in 1781, and Neptune followed in 1846. (Pluto wasn't discovered until 1930.)
We haven’t known about dinosaurs for very long

While dinosaur bones have been buried in the earth for far longer than humans have roamed the surface, it took us a while to figure out what they actually were. Throughout history, people have mistakenly thought they were the remains of extinct giant humans, Biblical creatures, or, in some Asian cultures, dragon bones. (The dragon bone theory? Actually not as far-fetched as it sounds.) The first scientific descriptions of dinosaurs, based on the skeletons of Megalosaurus and Iguanodon, were published in 1824 and 1825. The word “dinosaur” itself was coined in 1841.
Time zones are a relatively recent development

Before long-distance phone calls and international flights became possible, there was little reason to know the time in other parts of the world. Time is essentially local: The sun rises and sets here a little earlier or later than it does in the next town, and if there's a clock tower or church bells, there's no need for neighboring towns to sync their clocks with ours.
The idea of coordinating clocks and establishing time zones didn’t catch on in the U.S. until the 1880s. It wasn’t a nationwide standard at first, but more of a scheduling system adopted by the railroads. The U.S. government took control of time zones through the same 1918 law that created daylight saving time. Countries handle time zones differently; for instance, China uses a single time zone across the entire country.
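The point that time is essentially local comes down to simple geometry: the Earth rotates 360 degrees in 24 hours, so local solar time shifts by four minutes per degree of longitude. A minimal sketch of that arithmetic (the city longitudes below are approximate and purely illustrative):

```python
# Why pre-railroad "local time" differed from town to town: the Earth
# rotates 360 degrees in 24 hours, so each degree of longitude shifts
# local solar time by 24 * 60 / 360 = 4 minutes.

def solar_offset_minutes(lon_a: float, lon_b: float) -> float:
    """Difference in local solar time (minutes) between two longitudes
    (east positive, west negative)."""
    return (lon_a - lon_b) * 4.0

# Boston (about 71.06 W) vs. New York (about 74.01 W): solar noon differs
# by roughly 12 minutes, even though both now share one standard time zone.
print(round(solar_offset_minutes(-71.06, -74.01)))
```

Railroad standard time collapsed those countless local offsets into a handful of shared zones; China's single national zone is the same idea taken to the extreme.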
Germs were once mocked

The world is teeming with tiny organisms that can make us ill, and we now accept this as a fact of life. If your friend has COVID and coughs near you, there's a good chance you'll catch it. If you eat food that's been dropped on the floor (even if it's been more than five seconds), you might end up sick. And if you're having surgery, your surgeon had better wash their hands thoroughly.
But the idea that something invisible could cause illness took a long time to develop, let alone gain widespread acceptance. People understood that rotten, foul things could be bad for you, but that intuition never reached the core truth: invisible living agents can make you sick. Instead, theories held that you should avoid bad smells or inhale perfume-soaked rags to ward off disease. Even when death rates dropped dramatically after Ignaz Semmelweis introduced a hand-washing policy in a maternity ward in 1846, the practice didn't catch on. And Antonie van Leeuwenhoek, the pioneering microscopist who reported "animalcules" swimming around in semen and dental plaque, was laughed at.
Eventually, Louis Pasteur's work in the 1860s began to shift public opinion, leading to the development of germ theory. Diseases started being linked to specific bacteria, with one of the first breakthroughs being Robert Koch's identification of the anthrax bacillus in 1876.
Viruses were discovered much later

We typically think of "germs" as bacteria, and bacteria were indeed the first germs to be identified. Viruses were a different challenge, since most can't be seen with a light microscope at all. The word "virus" was first used in 1892 for a liquid extracted from sick tobacco plants that could infect healthy ones even after passing through a filter fine enough to trap bacteria. For decades, viruses were recognized by what they could do, while scientists debated what, if anything, they actually were. It wasn't until 1940 that the first electron-microscope images of viruses showed them to be solid, tangible objects with distinct parts.
The discovery of blood types occurred not too long ago

For centuries, blood has been recognized as essential to life—if you lose too much, it's fatal. But could you actually transfuse blood into someone? Initially, the idea seemed promising, but early attempts often failed, and in many cases patients tragically died. It wasn't until around 1900 that Karl Landsteiner discovered that some people's blood would clump (agglutinate) when mixed, while other combinations could be mixed without any issue. This breakthrough showed that blood transfusions must be compatible between the donor and recipient.
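The compatibility rule those physicians stumbled onto can be modeled as a simple subset check: a red-cell transfusion works when the donor's cells carry no antigen that the recipient's immune system treats as foreign. A minimal sketch of the modern ABO/Rh version of that rule (illustrative only, not medical guidance):

```python
# Red-cell compatibility as a subset check: a transfusion is compatible
# when the donor's antigen set is a subset of the recipient's, i.e. the
# donor's cells carry nothing the recipient's immune system would attack.

ANTIGENS = {
    "O-": set(),        "O+": {"D"},
    "A-": {"A"},        "A+": {"A", "D"},
    "B-": {"B"},        "B+": {"B", "D"},
    "AB-": {"A", "B"},  "AB+": {"A", "B", "D"},
}

def compatible(donor: str, recipient: str) -> bool:
    """True if red cells from `donor` can safely go to `recipient`."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]  # set subset test

print(compatible("O-", "AB+"))  # O- carries no antigens: universal donor
print(compatible("A+", "O-"))   # recipient lacks both A and D antigens
```

O- blood carries no ABO or Rh(D) antigens, which is why it serves as the universal red-cell donor, while AB+ recipients can accept from anyone.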
Transfusions only became standard practice once medical professionals learned how to store blood without it clotting. That innovation also meant donations could be collected and banked in advance, rather than simply connecting two people's veins and hoping for the best.
CPR didn’t become a formalized procedure until 1960

Cardiopulmonary resuscitation, the lifesaving method used when someone's heart has stopped beating effectively, wasn't established as a formal set of recognized techniques until 1960. Before that, mouth-to-mouth resuscitation was recommended for drowning victims, and chest compressions were sometimes used on their own.
Initially, CPR was often referred to as “closed-chest” compressions to distinguish it from the then-alternative, open cardiac massage, where surgeons would open the ribcage to directly massage the heart. Luckily, closed-chest CPR proved to be a more viable and effective method.
