Inquiries into wrongness
July 5, 2010
Here are three interesting articles about being wrong.
Wrong Commencement Speakers! by Timothy Noah of Slate (May 18, 2009). Want a memorable send-off? Invite a speaker who has, at some point in his/her life, undergone spectacular failure!
Success is a wonderful thing, but it tends not to be the sort of experience that we learn from. We enjoy it; perhaps we even deserve it. But we don’t acquire wisdom from it. Failure, on the other hand, is Harvard, Yale, and the University of Heidelberg rolled into one. We may not all possess the self-knowledge to absorb failure’s lessons, just as we may not all graduate with an education to go along with our diplomas. But people typically have a much easier time recounting, in often vivid detail, where they screwed up in life than they do explaining what they did right. Indeed, memoirs of spectacular failure have become a cottage industry—so much so that authors are sometimes tempted to embellish their narratives to make their stories even grimmer than they really are. Stories about life’s wrong turns are sometimes exaggerated but seldom dull.
Risky Business: James Bagian — NASA astronaut turned patient safety expert — on Being Wrong by Kathryn Schulz of Slate (June 28, 2010). Bagian has some excellent ideas on how medical mistakes can be reduced.
…Some things really are medical errors, right? Bad outcomes don’t only happen because a certain piece of information was unknowable or a certain event was unforeseeable. Sometimes doctors just write the wrong prescriptions or operate on the wrong body parts.
That’s true, but if at the end of the day all you can say is, “So-and-so made a mistake,” you haven’t solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? “The nurse made a mistake”? That’s true, but then what’s the solution? “Nurse, please be more careful”? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody’s perfect. You need a solution that’s not about making people perfect.
So we ask, “Why did the nurse make this mistake?” Maybe there were two drugs that looked almost the same. That’s a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That’s a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That’s huge.
…What kinds of tools have you introduced that do work?
One thing we do that’s unusual is we look at close calls. In the beginning, nobody did that in healthcare. Even today probably less than 10 percent of hospital facilities require that close calls be reported, and an even smaller percentage do root cause analyses on them. At the VA, 50 percent of all the root cause analyses we do are on close calls. We think that’s hugely important. So does aviation. So does engineering. So does nuclear power. But you talk to most people in healthcare, they’ll say, “Why bother? Nothing really happened. What’s the big deal?”
How do you get people to tell you about their close calls, or for that matter about their actual errors? Getting people to report problems has always been tricky in medicine.
Yeah, reporting is a huge issue, because obviously you can’t fix a problem if you don’t know about it. Back in 1998, we conducted a huge cultural survey on patient safety, and one of the questions we asked was, “Why don’t you report?” And the major reason — most people think it’s going to be fear of malpractice or punishment, but it wasn’t those. It was embarrassment, humiliation. So the question became, How do you get people to not be afraid of that? We talked about it a lot, and we devised what we called a blameworthy act, which we defined as possessing one of the following three characteristics: it involves assault, rape, or larceny; the caregiver was drunk or on illicit drugs; or he or she did something that was purposefully unsafe. If you commit a blameworthy act, that’s not a safety issue, although it might manifest as one. That’s going to get handled administratively, and probably you should be embarrassed. But we made it clear that blameworthiness was a very narrow case.
At the time that we conducted this survey, we were already considered to be a good reporting healthcare organization; our people reported more than in most places. But in the ten months after we implemented this definition, our reporting went up 30-fold. That’s 3,000 percent. And it has continued to go up ever since—not as dramatically, but a couple of percentage points every year.
You Have Superpowers: How to tap the strange power of being wrong by Adam Waytz of Scientific American (June 29, 2010). I’m not sure that self-delusion should be considered a superpower, but its benefits seem clear in certain circumstances.
In an issue of Behavioral and Brain Sciences, psychologist Ryan McKay and philosopher Daniel Dennett posed the question of whether it can ever be right to be wrong. They argued that false beliefs can be good for you, in at least one particular way. McKay and Dennett systematically assessed a variety of different types of false beliefs — from pathological delusions to biased approximations of facts — for evolutionary value….
McKay and Dennett concluded that only one mistaken belief passes muster, something that they called “positive illusions.” Positive illusions refer to unrealistically positive views of oneself, unrealistically positive optimism toward the future, and unrealistic views of personal control. McKay and Dennett argue that positive illusions are adaptive not only because they enhance psychological well-being — who doesn’t enjoy thinking of themselves as better than others? — but also because they enhance physical health.