A neurosurgeon, a minister, and a nurse walk into a bar… (There’s no punch line, though I invite suggestions for a good joke.) I’m willing to bet that in picturing the scenario the vast majority of you imagined two men and a woman. I’d even go so far as to say that the woman was the nurse. It’s an example of how our thinking is influenced by our most recent experiences. If you work at Children’s Hospital of Wisconsin, the last nurse you met was very likely female, and the last neurosurgeon was almost certainly male. This is what the psychology of decision making calls the “availability heuristic”: when we make judgments without complete information, we tend to refer to our most recent experiences, relying on the information most easily recalled to fill in for the information that is missing. (A heuristic is a mental shortcut; there are many types, this being just one.) Not knowing the sex of the characters, I draw on the most recent prior information I have about the sex of a neurosurgeon or a nurse.
Shortcuts like this evolved as a way for our minds to function more efficiently. When asked to think of a common man’s name that starts with P, it is far easier for me to conjure up the last man by that name I interacted with (Peter) than to call up the complete list of men’s names beginning with P (Paul, Philip, Patrick, Pedro, Pradeep, etc.) and think about how many people have each of them. In many circumstances, the availability heuristic works well and allows us to act on incomplete information.
You could argue that it’s simply a matter of playing the odds. In the US, the majority of neurosurgeons and ministers are men, and the majority of nurses are women. But research shows that we actually are not all that good at thinking statistically, and that playing the odds is often trumped by recent experience. When recent experience is not representative of reality, this mental shortcut leads to bias. For example, we recently had a patient in the ED who had just arrived from Liberia with high fever and upper respiratory symptoms. Which is the most likely diagnosis: a) malaria, b) a cold, c) Ebola? If Ebola even crossed your mind then you are displaying the availability bias; a cold is several orders of magnitude more likely based on actual prevalence.
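The “playing the odds” reasoning above is really just weighing base rates against how well the symptoms fit. A minimal sketch of that arithmetic, with entirely made-up prevalence and likelihood numbers chosen only to illustrate the orders-of-magnitude gap (they are not real epidemiological data):

```python
# Illustrative base-rate comparison for the febrile-traveler example.
# All numbers below are hypothetical, chosen only to show how a huge
# difference in prior probability swamps a modest difference in symptom fit.

def posterior_weight(prior, likelihood):
    """Unnormalized posterior: P(symptoms | disease) * P(disease)."""
    return prior * likelihood

# Hypothetical priors: probability a random febrile traveler has each condition.
priors = {"cold": 0.5, "malaria": 0.05, "ebola": 0.000005}

# Hypothetical likelihoods: how well this symptom picture fits each condition.
likelihoods = {"cold": 0.7, "malaria": 0.8, "ebola": 0.9}

weights = {d: posterior_weight(priors[d], likelihoods[d]) for d in priors}
total = sum(weights.values())

# Rank diagnoses by posterior probability, most likely first.
for disease, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{disease}: {w / total:.6f}")
```

Even granting the rarer diagnosis a better symptom fit, the common condition dominates by several orders of magnitude once the prior is factored in, which is exactly the calculation the availability heuristic lets us skip.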
Non-representative recent experience can steer us wrong in many ways. It’s a common problem in medical diagnostic decision making, especially among non-experts. I remember as a fellow seeing a teen with severe abdominal pain, to the point that he was irrational. I had recently read about acute intermittent porphyria, which can cause abdominal pain and altered mental status, and promptly ordered a urine porphobilinogen level to test for it. Never mind that it has an incidence of around 1 in 50,000. Not only was I wrong; the detour delayed me from treating his pain and making the actual diagnosis (kidney stone, incidence about 1 in 10, though less common in teens). I suspect the availability bias explains a good deal of the higher cost of care provided by medical trainees: the first time a resident sees someone with a rare illness, they start to evaluate more patients for that problem. It’s also a culprit in driving some utilization by patients. When the media run sensational reports about uncommon conditions, people overestimate their risk and often seek unnecessary medical care.
The availability heuristic also leads to broader bias in society. For instance, young blacks are arrested for marijuana possession at much higher rates than young whites, despite having a similar frequency of drug use. Blacks thus have higher rates of incarceration, and news stories about drug arrests are much more likely to feature African-Americans. As a result, people (both blacks and whites) overestimate the proportion of criminals that are black. In one study, 60% of viewers of a crime story without a picture of the suspect falsely recalled seeing one, and 70% believed that the suspect was African-American. After all, the last news story they saw about crime was likely to have featured a black suspect: availability bias. Similarly, low income individuals are more likely to be prosecuted for child abuse, leading us to believe – incorrectly – that those who are more well off are unlikely to maltreat their children, and potentially missing an opportunity to intervene when necessary.
There are many examples of how our use of this mental shortcut can lead us not only to misrepresent how common or uncommon something is across a group, but also to misapply the most readily recalled information about groups to individuals. Even when the most recent image is truly representative (e.g., most nurses at Children’s are female), it may not apply to a given individual. (Just ask any of the three male nurses I worked with in the ED yesterday!)
The availability heuristic is just one of the filters we all see the world through. Like other filters, it’s not necessarily either good or bad, but it is something to be aware of. When we make a snap judgment without having all the information, we need to be aware that we are overly influenced by our most recent experience and by the way things are portrayed – correctly or not – in society at large, and be willing to reshape our initial image as we get more information. And while some people cry “political correctness” when we use gender-neutral language or multiracial images, a non-biased environment is an important way to make our mental images more accurate. I know more than a few women neurosurgeons, female ministers, and male nurses who would appreciate it.