The Righteous Mind: Why Good People Are Divided by Politics and Religion, by Jonathan Haidt
See also Politics, Odors and Soap.
Here's a more challenging story: “A man goes to the supermarket once a week and buys a chicken. But before cooking the chicken, he has sexual intercourse with it. Then he cooks it and eats it.”
By using a framework that predefined morality as justice while denigrating authority, hierarchy and tradition, it was inevitable that the research would support worldviews that were secular, questioning and egalitarian.
One of my disrespect stories was: “A woman is cleaning out her closet, and she finds her old American flag. She doesn't want the flag anymore, so she cuts it up into pieces and uses the rags to clean her bathroom.”
I had trained my interviewers to correct people gently when they made claims that contradicted the text of the story. For example, if someone said, “It's wrong to cut up the flag because a neighbor might see her do it, and he might be offended,” the interviewer replied, “Well, it says here in the story that nobody saw her do it. So would you still say it was wrong for her to cut up the flag?” Yet even when subjects recognized that their victim claims were bogus, they still refused to say that the act was OK. Instead, they kept searching for another victim. They said things like “I know it's wrong, but I just can't think of a reason why.” They seemed to be morally dumbfounded—rendered speechless by their inability to explain verbally what they knew intuitively.
These subjects were reasoning. They were working quite hard at reasoning in support of their emotional reactions. It was reasoning as described by the philosopher David Hume, who wrote in 1739 that “reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.”
We're born to be righteous, but we have to learn what, exactly, people like us should be righteous about.
[Dale] Carnegie was in fact a brilliant moral psychologist who grasped one of the deepest truths about conflict. He used a quotation from Henry Ford to express it: “If there is any one secret of success it lies in the ability to get the other person's point of view and see things from their angle as well as your own.”
It's such an obvious point, yet few of us apply it in moral and political arguments because our righteous minds so readily shift into combat mode. The rider and elephant work together smoothly to fend off attacks and lob rhetorical grenades of our own. The performance may impress our friends and show allies that we are committed members of the team, but no matter how good our logic, it's not going to change the minds of our opponents if they are in combat mode too. […] Empathy is an antidote to righteousness, although it's very difficult to empathize across a moral divide.
The intuitive nature of political judgments is even more striking in the work of Alex Todorov. […] He collected photographs of the winners and runners-up in hundreds of elections for the U.S. Senate and House of Representatives. He showed people the pairs of photographs from each contest with no information about political party, and he asked them to pick which person seemed more competent. He found that the candidate people judged more competent was the one who actually won the race about two-thirds of the time. People's snap judgments of the candidates' physical attractiveness and overall likeability were not as good predictors of victory […] We can have multiple intuitions arising simultaneously, each one processing a different kind of information.
And strangely, when Todorov forced people to make their competence judgments after flashing the pair of pictures on the screen for just a tenth of a second—not long enough to let their eyes fixate on each image—their snap judgments of competence predicted the real outcomes just as well.
One way to reach the elephant is through its trunk. The olfactory nerve carries signals about odors to the insular cortex, a region along the bottom surface of the frontal part of the brain. This part of the brain used to be known as the “gustatory cortex” because in all mammals it processes information from the nose and from the tongue. It helps guide the animal toward the right foods and away from the wrong ones. But in humans, this ancient food-processing center has taken on new duties, and it now guides our taste in people. It gets more active when we see something morally fishy, particularly something disgusting, as well as garden-variety unfairness. […]
Alex Jordan, a grad student at Stanford, came up with the idea of asking people to make moral judgments while he secretly tripped their disgust alarms. [using fart spray] He stood at a pedestrian intersection on the Stanford campus and asked passersby to fill out a short survey. It asked people to make judgments about four controversial issues, such as marriage between first cousins […] Before half of the people walked up (and before they could see him), he sprayed the fart spray, which “perfumed” the whole intersection for a few minutes. […] Sure enough, people made harsher judgments when they were breathing in foul air.
[…] Chenbo Zhong at the University of Toronto has shown that subjects asked to wash their hands with soap before filling out questionnaires become more moralistic about issues related to moral purity (such as pornography and drug use). Once you're clean, you want to keep dirty things far away. […]
In other words, there's a two-way street between our bodies and our righteous minds. Immorality makes us feel physically dirty, and cleansing ourselves can sometimes make us more concerned about guarding our moral purity. In one of the most bizarre demonstrations of this effect, Eric Helzer and David Pizarro asked students at Cornell University to fill out surveys about their political attitudes while standing near (or far from) a hand sanitizer dispenser. Those told to stand near the sanitizer became temporarily more conservative.
Moral judgment is not a purely cerebral affair in which we weigh concerns about harm, rights and justice. It's a kind of rapid, automatic process more akin to the judgments animals make as they move through the world, feeling themselves drawn toward or away from various things. Moral judgment is done mostly by the elephant.
I'm not saying we should all stop reasoning and go with our gut feelings. Gut feelings are sometimes better guides than reasoning for making consumer choices and interpersonal judgments, but they are often disastrous as a basis for public policy, science, and law. Rather, what I'm saying is that we must be wary of any individual's ability to reason. We should see each individual as being limited, like a neuron. A neuron is really good at one thing: summing up the stimulation coming into its dendrites to “decide” whether to fire a pulse along its axon. A neuron by itself isn't very smart. But if you put neurons together in the right way you get a brain; you get an emergent system that is much smarter and more flexible than a single neuron.
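The neuron analogy above is simple enough to sketch in code. The following is a minimal, illustrative model (not from the book; the function name, weights, and threshold are my own assumptions): a single unit sums its weighted inputs and "decides" to fire only if the sum crosses a threshold.

```python
def fires(inputs, weights, threshold=1.0):
    """Toy neuron: sum the weighted stimulation arriving at the
    dendrites and fire a pulse iff the total meets the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return total >= threshold

# Two active inputs (0.6 + 0.6 = 1.2) cross the threshold; one (0.6) does not.
fires([1, 1, 0], [0.6, 0.6, 0.6])  # → True
fires([1, 0, 0], [0.6, 0.6, 0.6])  # → False
```

By itself the unit is trivial, which is the point of the passage: the interesting behavior only emerges when many such limited units are wired together.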
In the same way, each individual reasoner is really good at one thing: finding evidence to support the position he or she already holds, usually for intuitive reasons. We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system.
Emergence? Yes. Even so, every organization is less than the sum of its members, though its combined capacities are in important ways greater than any single individual's ability. This is how organizations can be both frustratingly stupid from the individual perspective, yet so powerful and productive at the social level.