The term 'law' can mean more than one thing in the contemporary world. One sense refers to rules enforced by authorities, which can be strict, long established, and contested (although, as some singers have suggested, fighting them rarely ends well). People set the consequences for breaking these rules themselves, but as long as those consequences are applied consistently, the rules endure.
Then there are scientific laws: principles that hold true regardless of human influence. However, the term 'law' is also applied to principles that merely appear to hold, even when they haven't been rigorously validated. These are often named after the people who first noticed them. Technologists like Bill Gates, authors such as Arthur C. Clarke, and many figures outside the strict boundaries of law and science have had their names attached to such so-called laws. Read on to discover what they observed and how people use these principles to forecast outcomes and make decisions.
10. Betteridge's Law of Headlines

The website Popular Science once posed the question, 'Do Pineapples Make Great iPhone Cases?' in a headline. We can all probably guess the answer (No, for those who were wondering). While this example is clear-cut, not all yes-or-no questions in headlines are so easy to answer. Or are they?
British technology journalist Ian Betteridge suggests that assuming the answer is 'no' to any question headline usually provides the correct response. This principle has come to be known as 'Betteridge’s Law of Headlines.' He argues that it works because such headlines are often used to publish stories that may lack factual accuracy or reliable sources.
Betteridge introduced this idea in a 2009 article criticizing a technology news site for spreading a false rumor under a question-based headline. He was not the first to notice the pattern, however: veteran British journalist Andrew Marr had already advised readers to try answering 'no' to question headlines back in 2004.
9. Clarke’s First Law

While Betteridge’s Law of Headlines serves as a helpful guideline, most people today are aware of the prevalence of misinformation and fake news. Even with skepticism toward journalists, it's still difficult not to trust a respected, senior scientist when they speak.
Arthur C. Clarke, the science fiction writer, argued that when it came to forecasting future scientific breakthroughs, scientists often got things wrong. They were notably consistent in their errors, as explained in the first of Clarke’s three laws: 'When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.'
This concept was introduced in a 1962 essay where it was argued that a lack of imagination led to poor predictions about the future. In 1977, Isaac Asimov, a well-known science fiction author, suggested a modification to Clarke’s law. He believed that when the public strongly backs an idea that older scientists reject, those scientists are more likely to be right.
8. Cunningham’s Law

Betteridge’s and Clarke’s laws guide us on when to trust information and when to be cautious. Cunningham’s law shares similarities with them but is more hands-on and solution-oriented. It can help individuals gather information they need from others on the internet, and some claim it works even better than directly asking.
Cunningham’s law posits that a false statement is more likely to be corrected than a direct question would be to receive an answer. Therefore, when seeking knowledge, the most effective method is to make a confident yet entirely false claim about a topic. Then, just wait and watch as the corrections come in.
This principle is named after Ward Cunningham, a key figure in developing user-edited ‘wiki’ websites. However, Cunningham was not the first to apply this strategy. The ancient Greek philosopher Socrates often began his famous dialogues with arguments that he knew to be wrong.
7. Andy and Bill’s Law

The rise of computers has led to the development of many so-called laws, in addition to Cunningham’s. Perhaps the most famous of these is Moore’s law, which predicts that the number of transistors on computer chips will approximately double every two years. In essence, this means computers will continue to grow more efficient—processing faster and at a lower cost—at an astonishing rate. Named after Gordon Moore, co-founder of Intel, this law is just one example of how Intel executives have shaped the industry.
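Moore's doubling rule is simple compound growth, which a few lines of code can make concrete. A minimal sketch, assuming the commonly cited 1971 starting point of the Intel 4004 at roughly 2,300 transistors:

```python
def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming one doubling per period."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Starting from ~2,300 transistors in 1971 (Intel 4004), fifty years of
# doubling every two years lands in the tens of billions, the right order
# of magnitude for flagship chips shipping around 2021.
print(f"{projected_transistors(2300, 1971, 2021):,.0f}")
```

The two-year period is itself a revision: Moore's original 1965 paper described components doubling roughly every year, a figure he relaxed to every two years in 1975.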
Andy Grove, the former CEO of Intel, shares a law with another tech visionary, Bill Gates, the founder of Microsoft. This is known as 'Andy and Bill’s Law,' and it states that the progress made by chip manufacturers like Intel will eventually be offset by software developers like Microsoft, who will expand their software to fully utilize the computing power available.
A popular joke at computer conferences goes, 'What Andy giveth, Bill taketh away.' In some versions, it is Gordon Moore who 'giveth,' but the humor underscores the remarkable influence that Andy and Bill’s law has had on the tech world. The continuous innovation from both sides is why today’s smartphones hold more computing power than the spacecraft that took humans to the Moon in 1969.
6. Eroom’s Law

Since its introduction in 1965, Moore's Law has generally held true. This principle has fostered remarkable advances in technology, which, in theory, should spur exponential progress in other fields too. Yet a 2012 study examining whether technological gains had accelerated the development of new drugs found the opposite trend: the number of new drugs approved per billion (inflation-adjusted) dollars of research spending has halved approximately every nine years since 1950.
As a result, the cost to develop a new drug has effectively doubled every nine years. The researchers dubbed this 'Eroom's Law': 'Moore' spelled backwards, because it is Moore's Law in reverse. One possible cause, they suggested, is what they termed the 'better than the Beatles problem': the catalogue of existing, effective drugs keeps growing, so every new drug must outperform the treatments that came before it. It is as if new songs could only succeed by being better than the Beatles, whose back catalogue never goes away.
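The same compound arithmetic, run in reverse, gives Eroom's law. In this sketch the 1950 baseline of 100 approvals per billion dollars is a made-up illustrative number, not a figure from the study:

```python
def approvals_per_billion(baseline, start_year, target_year, halving_period=9):
    """Approvals per billion R&D dollars, halving once per period (Eroom's law)."""
    halvings = (target_year - start_year) / halving_period
    return baseline * 0.5 ** halvings

# A hypothetical 100 approvals per billion dollars in 1950 shrinks fast:
# every 18 years (two halving periods) cuts the figure to a quarter.
for year in (1950, 1968, 1986, 2004):
    print(year, round(approvals_per_billion(100.0, 1950, year), 2))
```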
5. Goodhart’s Law

Goodhart’s Law is succinctly expressed as 'when a measure becomes a target, it ceases to be a good measure.' This rephrasing, which is more concise than the original, was offered by anthropologist Marilyn Strathern. The law is named after British economist Charles Goodhart, and one clear illustration of it comes from a rat control initiative in Vietnam during the early 20th century.
In Hanoi, rat catchers were tasked with eliminating the city’s rat population. To ensure they were doing their job, officials required the rat catchers to present the tails of the rats they had killed. The officials would count these tails as a way of measuring the number of rats removed. However, the situation changed when the authorities decided to pay the rat catchers based on the number of tails they submitted.
Soon after, the tail count became a goal in itself. The rat catchers stopped killing the rats and began simply cutting off their tails, releasing the animals to breed and ensure a steady supply of tails for the next day's work. As a result, counting tails no longer measured how effective the catchers were at controlling the rat population.
4. Segal’s Law

Some so-called laws are really more like sayings or pieces of wisdom. Segal’s Law is one such example. It states that “a man with one watch knows what time it is; a man with two is never sure.” On its surface, the law highlights the problems that arise when one has multiple sources of information. When those sources contradict each other, it becomes impossible to rely on any of them, leaving individuals unsure of which one is accurate. Therefore, having a single source of information is simpler, but that also leaves you vulnerable to not knowing when it’s wrong.
The first known appearance of the saying was in a 1930 issue of a San Diego newspaper, where it ran as light-hearted filler. The name comes from Lee Segall, a Texas radio host who was mistakenly credited with the quote in the well-known book *Murphy's Law*. The book's author also misspelled Segall's surname, which is why the law is known as 'Segal's' rather than 'Segall's'. The saying has also been attributed to figures like Mark Twain and Albert Einstein, but there is no evidence that either of them originated it.
3. Hofstadter’s Law

The reasons behind laws like Benford's Law of Controversy (the next entry on this list) may be subconscious, but once people recognize such a law, they can reflect on it and notice how their emotions might be filling in the gaps. Hofstadter's Law is different: by its own terms, knowing about it won't allow anyone to avoid its effects.
This self-referential law, proposed by cognitive scientist Douglas Hofstadter, states that planned tasks always take longer than expected, even when Hofstadter's Law is taken into account. People who know the law anticipate delays and pad their estimates to allow extra time; as Hofstadter points out, they will still be caught off guard by how long things actually take.
Perhaps the most notable example of this law in practice is the Sydney Opera House, which was expected to be finished in 1963 for around AU$7 million but opened a decade late, in 1973, at a cost of roughly AU$102 million. Similarly, London's Wembley Stadium was expected to open in 2003, 2005, and 2006 before it finally opened in 2007.
2. Benford’s Law of Controversy

Although it shares a name with a law coined by a physicist named Benford, the so-called 'Benford's Law of Controversy' is entirely unrelated to Benford's Law. The Benford behind this law is science fiction author and astrophysicist Gregory Benford, and his principle states that 'passion is inversely proportional to the amount of real information available.' When factual information is scarce, people tend to fill the void with theories or rumors, often selecting what they wish to be true or what aligns with their tribal identity.
This explains why people can become so passionate about topics they know little about. While this tendency affects some individuals more than others, it is something that anyone can experience. The root cause of this is that uncertainty is uncomfortable. Constructing a narrative, even if it's incorrect, provides a sense of relief and comfort.
1. Benford’s Law

Imagine you take a pile of newspapers and note the first digit of each number you find in them. You might assume that each possible first digit would appear approximately the same number of times. However, Benford’s law suggests that this is not the case. Lower digits actually appear far more frequently than higher ones.
The phenomenon was first noticed by astronomer Simon Newcomb in 1881. While using a shared book of logarithm tables, he observed that the pages toward the beginning of the book, covering numbers that start with 1, were much dirtier and more worn than those at the end. His colleagues, it seemed, looked up numbers beginning with 1 far more often than any other digit.
In 1938, American physicist Frank Benford expanded on this observation, testing it with thousands of data points. He confirmed that smaller first digits indeed occur more often, and since then, Benford’s law has been observed in various fields, from electricity bills and street addresses to stock prices and population statistics. Today, Benford’s law is even used as a tool for detecting fraud, as people attempting to fabricate numbers often try to distribute the digits evenly, causing their data to deviate from Benford’s Law.
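The distribution behind Benford's law is precise: the probability that a number's first digit is d equals log10(1 + 1/d), so a leading 1 appears about 30% of the time while a leading 9 appears under 5%. A minimal sketch comparing those expected frequencies against the first 1,000 powers of 2, a dataset known to track the law closely:

```python
import math
from collections import Counter

# Benford's law: the probability that a number's first digit is d
# is log10(1 + 1/d) -- about 30.1% for 1, falling to about 4.6% for 9.
expected = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# First digits of 2^1 through 2^1000, a classic Benford-conforming dataset.
first_digits = [int(str(2 ** n)[0]) for n in range(1, 1001)]
observed = Counter(first_digits)

for d in range(1, 10):
    print(f"{d}: expected {expected[d]:.1%}, observed {observed[d] / 1000:.1%}")
```

A fraud-detection check works the same way in reverse: tally the first digits of the suspect data and flag it when the observed frequencies stray far from these expected ones.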
