When people head to the polls, they often evaluate a candidate's moral values.

During election cycles, voters assess the moral viewpoints of candidates, wondering how their perspectives align with their own. Certain issues, such as stem cell research, abortion, and gay marriage, often dominate morality discussions. However, topics like foreign policy and war, though seemingly political, have moral implications too. For example, would a candidate authorize bombing an enemy city if their child were stationed there? Would they consider that every soldier is someone's loved one, someone's parent, sibling, or spouse?
In an extreme hypothetical, would it be acceptable to ask a candidate whether they'd smother a baby? This might sound horrendous, but consider this scenario: during wartime, a group hides in a basement, and the baby cries, risking their discovery. If the baby is silenced, the group survives; if not, everyone perishes, including the child. Would a candidate make such a choice to save the group? It's a question that pushes the limits of morality.
While it may seem rational to sacrifice the baby for the group's survival, could you personally bring yourself to take that action? Would you want a president capable of such decisions? It turns out that we may not have much choice in how we answer such moral questions. For centuries, morality was explored by philosophers, theologians, and others, but now neuroscientists are delving into the brain’s role in moral decision-making. As some experts suggest, the answer may lie within the brain, that vital organ zombies crave. What happens in our brain when faced with moral dilemmas, and can morality be pinpointed to a specific location in the brain?
How Do You Choose? Moral Dilemmas and the Brain
Would you take a life to save many? The brain reacts differently depending on who is carrying out the action.
In 2001, a research team led by philosopher and neuroscientist Joshua Greene published a paper exploring the use of functional MRI to examine the brains of individuals facing a moral dilemma. Greene and his colleagues aimed to determine if there was a clash between the emotional and reasoning centers of the brain.
Participants in the study were given a scenario involving the choice to kill one person to save a group, such as the dilemma involving the crying baby mentioned earlier. During this mental struggle, various parts of the brain activated, including two areas in the frontal lobe. The scans revealed activity in the emotional region of the frontal lobe, which processes our feelings towards others, and in the cognitive area responsible for tasks like reasoning [source: Pinker]. Moreover, the anterior cingulate cortex, which detects conflict in the brain, was also engaged. This indicates that individuals weighed the moral cost of taking an innocent life against the benefit of saving many.
Then the participants faced a version of the dilemma in which they wouldn't be personally involved in the killing. The same individual would die, but someone else would carry out the act, or a lever could be pulled to make it happen. In this case, only the rational part of the brain was active, suggesting that when emotional involvement was removed, people simply performed a utilitarian calculation to determine the best outcome for the group.
In a 2007 study, researchers from various universities sought to explore how different brain regions influence morality and what might occur if those regions were impaired. The study was relatively small, consisting of 12 individuals with no brain damage, 12 with brain damage in areas controlling emotions like fear, and 6 with damage to the ventromedial prefrontal cortex, believed to govern emotions such as shame, empathy, compassion, and guilt [source: Gellene]. The subjects were presented with 50 hypothetical scenarios, some involving moral decisions, others not.
There was a significant amount of overlap in the groups' responses to certain scenarios. For situations that didn’t require a moral choice, all groups gave the same answer. In cases where a moral decision wasn’t about harming others, like whether it was acceptable to list personal expenses as business expenses for a tax write-off, all groups showed some flexibility with the rules. However, when asked about more extreme moral dilemmas, such as whether it was acceptable to harm or kill someone for the greater good, a key difference emerged. Those with damage to the ventromedial prefrontal cortex were roughly two to three times more likely to endorse sacrificing one person for the benefit of many [source: Saletan].
This suggests that when the brain's emotional centers, like those involved with empathy and shame, are compromised, individuals tend to focus more on the cost-benefit analysis of the greater good, disregarding emotional factors. This raises concerns about the potential consequences of such findings. Could knowing that a person’s brain is damaged in this manner influence criminal trials? Could a defense based on "damage to the ventromedial prefrontal cortex" become a common plea in court?
That might seem unlikely, given that different cultures have varying definitions of what constitutes a crime. If morality is hardwired into the brain, why do we all hold such different moral views? Several leading theories attempt to answer that question.
The Moral Systems in Us All
Children from different cultures will develop varying moral systems.
We might quickly attribute differences in moral perception to cultural influences or religious upbringing. However, some scientists argue that morality is embedded in our brains and is simply shaped by external factors. One such scientist, Marc Hauser, draws on anthropology and linguistics to demonstrate that morality existed long before the advent of religion.
Anthropology comes into play when we observe that primates, like apes and monkeys, display behaviors that suggest moral tendencies, such as forgoing food to avoid harming another primate [source: Wade]. While we can't know for certain what motivates these actions, they may provide insight into the foundational role morality plays in communal living—a trait humans would eventually perfect. Hauser’s breakthrough was in connecting morality to language.
In the 1950s, linguist Noam Chomsky proposed that we are born with an inherent grasp of grammar, but within each language, there are unique rules and idiosyncrasies. Hauser believes that morality works in a similar way. We are born with core moral principles, such as 'do no harm,' but these principles are shaped by our environment. Hauser suggests that this unconscious wiring may be due to the time pressures we face. If we had to deliberate over every verb, noun, and sentence structure when we spoke, we’d never get anything done. Similarly, we don't have the luxury of debating moral dilemmas each time they arise. Just as we instinctively recognize a grammatical error, we subconsciously know when something feels right or wrong [source: Glausiusz].
Psychologist Jonathan Haidt has identified the moral systems that may be inherent within each individual:
- Prevention of harm to others
- Reciprocity and fairness
- Loyalty to one's group
- Respect for authority
- Sense of purity and sanctity
[source: Pinker]
These inherent moral systems might have provided an evolutionary advantage. For instance, the concept of purity might have developed when people had to decide which mates were the best fit and which foods were safest to consume. Furthermore, finding a group that shared your beliefs would improve personal survival, as that group would be more likely to support you in times of need. The last three principles, loyalty, respect for authority, and purity, also reinforced the survival of the group as a whole.
Different cultures can shape these moral systems, which explains how people can view the same situation but reach different conclusions. We all possess these five principles, but we emphasize certain ones more depending on how we were raised. Take honor killings, for example, where a woman is killed for committing adultery or even for speaking to a man who is not her husband in public. Some Middle Eastern cultures view these actions as clear violations in the areas of respect for authority and purity, while other Western cultures primarily see the woman's death as a wrongful harm to another person.
Haidt suggests that sometimes we aren't even aware of how our culture has ingrained these concepts in us [source: Wade]. This is because the more rational part of the brain, the one that activated in Joshua Greene's brain scans described earlier, may have evolved later than the emotional side that governs our sense of right and wrong. These two brain systems can come into conflict, with the rational side trying to understand the emotional side's reaction. When the rational side can't make sense of the emotional side's response, the result is what Haidt calls moral dumbfounding [source: Wade]. Sometimes, we simply know something is right or wrong without being able to explain why.
As you might anticipate, some philosophers are not happy with scientists stepping into their domain [source: Wade]. Both groups still need to sort out what these discoveries could mean for our brains and for society as a whole.
