We all know at least one person – who may or may not be considered a hypochondriac – who self-diagnoses his or her own health maladies. We sometimes know them well enough that they will admit that, when they go to the doctor and the doctor runs her own tests, the diagnosis is quite different. I’ll be the first to admit I do this. I recently visited an orthopedist – a hand specialist – because I was sure I had begun to develop arthritis in one of my fingers. But after actual tests (an X-ray) and expert analysis from the doctor, I realized that wasn’t the diagnosis at all. The problem was attributed to catching myself when I slipped on ice a few months ago, nothing more alarming than that. Whew.
Self-diagnosis is really just a form of self-attestation, and self-attestation can be terribly inaccurate. In my opinion, the root of that inaccuracy lies in human nature. We want to take information we think we know, assume it’s more complete and correct than it often is, apply past lessons learned, and arrive at a conclusion that tells us there is no need for deeper investigation – we’ve got the answer! Digging deeper takes more time, maybe more money – resources that always seem to be in short supply. But self-attestation can lead to some very big problems.
Two recent events cast a particularly dark shadow on self-attestation. In Brazil, a mining company used a certifying body to which it was closely tied – not an independent one – to essentially self-certify the safety of a dam constructed in the course of its mining operations. When the dam burst in January 2019, killing some 300 people and destroying many homes and businesses, investigators found it had not, in fact, been built in accordance with standards. But no one had checked independently.
And of course, we have the recent crashes of two Boeing 737 MAX jets. It has been determined that Boeing decided on its own which technical complexities of the flight control systems to explain to the FAA and to pilots, and which ones would constitute “too much information,” so to speak. That decision turned disastrous for the people on those planes, their loved ones, Boeing, and its shareholders.
Unfortunately, despite high-profile failures of the doctrine of self-attestation, it remains an acceptable practice in too many important domains. One domain that bothers me a lot, and the reason I’m writing this blog entry, is healthcare data sharing. The U.S. Department of Health and Human Services (HHS) Office of the National Coordinator for Health IT (ONC) remains committed to self-attestation as the acceptable way for electronic health record (EHR) system vendors to prove their compliance with and conformance to health information exchange (i.e., data sharing) standards. Is software quality for health IT systems as important as mining dam construction and pilot education about flight control systems? Why would it not be? We’re talking about systems that share information clinicians use to make diagnoses and develop care plans. These are – or can be – literally life-and-death decisions.
AEGIS’ Founder, Mario Hyland (@InteropGuy), recently observed to me:
At least with paper records, doctors knew what records they had received and which ones they hadn’t. With EHRs if interoperability isn’t working, the doctor may not even realize that important records were never exchanged. In other words, the doctor doesn’t know what he doesn’t know.
I want to ask, for the love of mankind, that we stop considering self-attestation of software quality good enough, especially in domains as important as healthcare IT. Demand that more sets of eyes and more tools – more than just those of the good folks who developed the software – verify that quality. Don’t let human nature remain the weak link in better healthcare delivery.
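To make the “more tools” point concrete, here is a minimal sketch of what an independent conformance check can look like – in this case, asking a FHIR R4 server to validate a resource with the standard $validate operation instead of taking a vendor’s word for it. The Python code, the toy Patient resource, and the public HAPI test server URL are illustrative assumptions on my part; a real conformance program would test against the specific profiles an implementation guide requires, using a dedicated test harness.

```python
# Illustrative sketch only: independently validating a FHIR R4 resource
# using the standard $validate operation, rather than relying on a
# vendor's self-attestation. The server URL and the sample Patient are
# assumptions for demonstration, not a real conformance program.
import json

import requests

FHIR_BASE = "http://hapi.fhir.org/baseR4"  # public HAPI test server, used as a stand-in

patient = {
    "resourceType": "Patient",
    "name": [{"family": "Example", "given": ["Test"]}],
    "birthDate": "1970-01-01",
}

response = requests.post(
    f"{FHIR_BASE}/Patient/$validate",
    headers={"Content-Type": "application/fhir+json"},
    data=json.dumps(patient),
    timeout=30,
)
response.raise_for_status()

# $validate returns an OperationOutcome; any "error" or "fatal" issue means
# the resource does not conform, regardless of what anyone attests.
outcome = response.json()
errors = [
    issue for issue in outcome.get("issue", [])
    if issue.get("severity") in ("error", "fatal")
]
print("conformant" if not errors else f"{len(errors)} conformance error(s) found")
```

The point isn’t this particular snippet – it’s that conformance can be checked mechanically, repeatedly, and by someone other than the party making the claim.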
Now, I feel the urge to sneeze. I’m sure I must have the flu…