"Brain's 'Cheat Sheet' Makes Moral Decisions Easier"
"Brain's 'Cheat Sheet' Makes Moral Decisions Easier"
by Stephanie Pappas
by Stephanie Pappas
How much would someone have to pay you to switch from drinking coffee every morning to drinking tea? How about to rescind the almost-universal belief that murder is wrong and then kill an innocent person? Most likely, your brain processed those two questions in very different ways, a new study finds. People weigh questions of sacred values - such as "don't murder" - in different brain regions than they do mundane preferences. These special brain regions seem to be those associated with recalling rules, suggesting that we don't weigh the costs and benefits when asked to do something against our most firmly held values. Instead, we fall back on a mental "cheat sheet" of right and wrong. "If you had to do cost-benefit calculations for everything you do in your daily life, you wouldn't be able to come to any decisions at all," said study researcher Gregory Berns, director of the Center for Neuropolicy at Emory University. "So rules actually have the benefit of making decision-making much easier... you just look up in your own personal 'rule table' how to act."
Making moral choices: Though the vast majority of people can agree that killing someone is bad, there are two main ways to come to that conclusion, Berns told LiveScience. You might take a utilitarian approach, figuring that whatever benefit would come from the murder would be outweighed by the costs in risk of punishment or pain to the victim's family. Alternatively, you might take a rule-based, or "deontological," approach. This is the "Ten Commandments" line of reasoning, Berns said: Murder is wrong, because it's wrong, and that's that.
Figuring out which approach people really take is tough, though. You can ask them in a survey, but they might respond with what they think you want to hear. It's not even easy to figure out which values people hold sacred; after all, you can't ask someone to kill an innocent person in a psychology lab and then wait to see whether they do it or not. So Berns and his colleagues got creative. Instead of measuring people's willingness to break their sacred values, they measured their willingness to take money to sign a document announcing that they believed the opposite of what they really believed. "The idea is, if you feel really strongly about something, there is no amount of money that will make you say otherwise," Berns said.
Selling out: First, the researchers placed 32 participants in a functional magnetic resonance imaging (fMRI) scanner, which measures blood flow in the brain, creating a picture of which brain regions are active at any given time. As the machine ran, the researchers read a series of value statements to the participants. Some were mundane, such as "You are a cat person." Others were meant to get at sacred values, such as "You believe in God," "You do not believe in God," "You would sell a child," or "You would have sex with a 4-year-old." Although read in random order, each statement was paired with an opposite statement. So participants heard "You are a cat person" as well as "You are a dog person." For some statements, the exact opposite was given: "You would sell a child" and "You would not sell a child." Other examples included: "All Jews should/should not have been killed in WWII," "I believe/don't believe in God," and "North Korea should/should not be nuked."
In the next task, the participants heard the statements again, with opposites presented one after another. This time, they had to pick which of each pair was true for them. Next, the researchers asked how much money, between $1 and $100, a person would take to rescind those statements in a signed document. They could also opt out of this auction completely. A cat person who said they'd take a dollar to call themselves a dog person obviously does not view that belief as sacred. In contrast, someone who insisted that no amount of money would make them say "I would sell a child" clearly holds that value dear. To make the stakes real, the participants got actual money for selling out their values. After they named their prices, they rolled a 10-sided die. If the number rolled came in higher than their price to rescind a particular value, they got paid. They then had to sign a personalized document saying what they'd sold out.
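The payment procedure described above works like a classic incentive-compatible auction: a random price is generated, and the participant sells only if that price meets or exceeds the amount they named. A minimal sketch of that logic, assuming two die rolls map to a random dollar amount from $1 to $100 and that the rolled amount is what gets paid (the article does not specify either detail):

```python
import random

def run_auction(stated_price, opt_out=False):
    """Sketch of the study's payout mechanism (details are assumptions).

    stated_price: dollars ($1-$100) the participant named to rescind a value.
    opt_out: True if the participant refused to auction the value (sacred).
    Returns the payout in dollars (0 if no sale occurs).
    """
    if opt_out:
        return 0  # sacred value: never put up for auction
    # Two rolls of a 10-sided die combined into a random price from 1 to 100
    # (a hypothetical mapping; the article only mentions a 10-sided die).
    roll = random.randint(0, 9) * 10 + random.randint(1, 10)
    if roll >= stated_price:
        return roll  # participant is paid and signs the document
    return 0  # rolled price was too low; no sale
```

The key property of such a mechanism is that participants have no incentive to misstate their price: naming a number higher than their true threshold only forfeits profitable sales, and naming a lower one risks selling for less than the value is worth to them.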
Making rules: There was a broad range in what people were willing to sell out: the firmest believer opted out of auctioning all but 8 percent of his or her beliefs, while some people named a price for everything on the list; on average, participants refused to auction about half. The values people refused to sell out were considered sacred. The researchers then went back to the brain scans. It turned out that the values later shown to be sacred were the ones that activated two particular brain regions: the left temporoparietal junction (TPJ) and the ventrolateral prefrontal cortex. The TPJ is the point where the temporal and parietal lobes of the brain meet on the side of the head, while the ventrolateral prefrontal cortex is on the underside of the frontal lobe. Both of these areas are associated with rule retrieval and beliefs about right and wrong.
"When people engage sacred values in their thought processes, they are by and large using rule-based systems in their heads," Berns said. "They're not using cost-benefit calculations." This makes sense, given how inefficient it would be to weigh the pros and cons of every moral decision, he said. "It's much easier just to fall back on well-worn rules that serve you well, and serve society well," Berns said. The downside to rules is that people loathe breaking them, even when the rules are based on faulty experiences or information. "Once a rule is in someone's head, it's going to be hard to change it, even if there is a mountain of evidence saying that it's not a good rule," Berns said.
Gray areas: Of course, not everyone's sacred values are the same. Almost no one considered a preference for coffee over tea to be sacred; likewise, pretty much everyone held that sexually assaulting a child is horribly wrong. But there are plenty of values that fall into gray areas. Some people held their belief in God or the belief that abortion is wrong as sacred values. Others held the opposite viewpoints as just as sacred, or just didn't feel that strongly either way.
Interestingly, the people who tended to hold their sacred values most strongly - those with the biggest differences in brain response between sacred and non-sacred processing - also tended to be those who participated in the most group activities, Berns said. The groups could be anything from religious organizations to sports teams to professional societies, he said. The researchers are now continuing studies to find out how group conformity might play a role in sacred values. "We don't know the direction of causality there, but if I had to speculate it would be that groups are the mechanisms that our culture uses to transmit and instill these rules," Berns said. "It stands to reason that the more involved you are with groups, the stronger the rules become." The researchers reported their findings this week in the journal Philosophical Transactions of the Royal Society.
- http://www.sott.net/