Earlier this year I wrote an article for healthcare professionals on safety leadership. The idea is that scientific methods, standardization and reliability alone are not enough to reduce error and failure. Sometimes human performance in the actual environment of care is what makes the difference - adapting care to meet the exact needs of the patient or the conditions in which care is provided.
This makes me think of my favorite boxer and his need to "float like a butterfly and sting like a bee" in order to succeed at his sport. In many ways healthcare professionals also need to float (situational awareness) like butterflies and sting (direct care activities) like bees. But clinicians can't float like butterflies or sting like bees with bias. This is where human performance in the actual environment comes back around to safety leadership.
Last week I released a diagnostic guide for physicians and attempted to inject the patient experience for a broader view of situational awareness. One of the physicians who read my guide sent a link to a paper on cognitive bias in medicine. As I read the list of biases described by the authors, I could see where they fit with my experience. Below I describe the potential biases involved in my own misdiagnosis and delay in (accurate) diagnosis and treatment, and offer other examples.
- Base rate neglect - ignoring the underlying incidence rates of conditions (rare diseases) or population-based knowledge (breast cancer in a 25-year-old) as if they could not apply to the patient.
- Confirmation bias - interpreting information gained from the patient to fit a preconceived diagnosis, rather than the converse (e.g., I wondered what might have caused the patient's peripheral neuropathy and what other clinical findings might exist).
- Overconfidence - placing too much trust in one's own diagnostic abilities, leading to a failure to diagnose the patient accurately.
- Search satisficing - ceasing to look for further information or alternative answers once the first plausible solution is found.
The authors also offer solutions for avoiding bias. First, training in recognizing bias is important. Other tactics include slowing down, questioning a diagnosis and considering alternatives, and using checklists.
Bias isn't limited to physicians. In my case, I believe there was also bias demonstrated by my physical therapist and family physician: diagnostic momentum, or continuing a clinical course of action instigated by previous clinicians without reconsidering the available information and changing the plan. This is particularly likely when the diagnosis and treatment plan were initiated by a more senior or specialized clinician.
In summary, "floating like a butterfly" must mean that we recognize our potential biases and adopt strategies to overcome and avoid them. After reading the available literature on diagnosing a set of rare conditions, I've come to believe that patients with the actual conditions should be involved in developing diagnostic guidelines. And it is the plural "patients", because some will have an atypical presentation, like me. Guidelines focused on the most severe or common presentations of a condition increase the risk of diagnostic error for a potentially significant proportion of patients.
The information in this post is important for clinicians to consider, but also for patients and their family caregivers, because understanding bias can make for better self-advocacy or advocacy for others. If you have ever experienced bias in clinical decisions, feel free to share your example. For a full list of biases in clinical medicine, please see Table 1 in the paper.