Picture this sampling of families one might run into in the ER or clinic:
A single woman with 6 children in the exam room.
A couple, both lawyers from Whitefish Bay, with their daughter, who was injured in a figure skating competition.
A teen couple with a three-year-old, and the mother is pregnant.
You probably had an image of each of these families in your mind before meeting them. You made some assumptions about their race, education, and social status. And for most of you, I suspect your image didn’t match the reality: the woman with the six children is a Latina, a PhD social worker accompanying kids from two families who were involved in a car crash. The couple from Whitefish Bay are an Asian man and an African-American man. And the pregnant teen mother with the three-year-old is the white daughter of a professor at an Ivy League school.
We are constantly making assumptions about people based on little to no knowledge. Daniel Kahneman, in his book Thinking, Fast and Slow, describes heuristics, the mental shortcuts we employ for efficient mental processing. This is the “thinking fast.” When we look outside and see vapor coming from the heat vent, we assume it’s cold outside without first checking the thermometer. A barking dog with bared teeth elicits an immediate “I’m outta here” without waiting to see if he wants to play fetch. The basic mental mechanism is an adaptive response; early humans whose brains were wired to assume that saber-toothed cats were dangerous and to be avoided were more likely to live to reproduce.
But what is beneficial in one context can cause problems in another. Kahneman documents many examples where these heuristics lead to troublesome biases. That’s why we have also evolved other mental systems – reflective, analytic, “thinking slow” – to question the snap judgments we make based on heuristics. In the context of human interactions, this contributes to all kinds of biased thinking: racism, classism, sexism, and the like. As Kahneman shows, we often fail to call on our analytic side to question our biases. Even when we think we are being objective and rational, we are more prone to bias than we realize. While the specific biases are not hard-wired, the tendency toward them is. It’s simply difficult to overcome assumptions. But the cost of not doing so is high. As individuals, it can lead us to make foolish investment decisions, choose poorly in everything from clothing to careers, and take both too many and too few risks. For society, the costs are higher still.
I learned an important lesson about making assumptions from a professor in medical school. One day a prisoner was admitted to the ward, and someone made an offhand, derogatory comment. The attending told the team about a time he was at a conference with colleagues. In the evening they were walking on the boardwalk, casually dressed, when a police officer approached him, the only African-American in sight. The police were looking for a suspect in a crime, and he “matched” the description. He described his complete humiliation as he was handcuffed, frisked, and released only when his white colleagues vouched for him. “You don’t know this patient, and don’t assume you do.”
As the old joke goes, when you assume you make an ass of u and me. Or perhaps far worse. So if you have to assume, assume the best.
[…] In large part, it is a manifestation of the way our minds process information. I have written previously about heuristics – mental shortcuts our brains use to reach conclusions more efficiently. These heuristics are […]