I’m a Doctor, Not a Data Technician!

Guest Editorial

I’d like to spend a moment here to take back the word “provider” in the healthcare context.

It seems I can’t go a day without hearing at least one member of a care team (and often more) rebel against being lumped together not by specialty, credential, or title, but by the vague term providers, as in, those providing healthcare services and expertise. Beyond any argument about whether the speaker’s intent matters when someone experiences a label as derogatory, I want to reexamine why, in the modern context, “provider” is an important, and accurate, distinction.

People are worried about being replaced, and rendered irrelevant.

From Stephen Hawking to Bill Gates, worry about an Artificial Intelligence (AI) takeover–or robot apocalypse, depending on where you spend your time online–is everywhere.

The Silicon Valley crowd is accosting economists with ideas about Universal Basic Income as an antidote to mass unemployment in the near future, and the rapid advancement of driverless cars has futurists and investors alike chattering about the tradeoffs involved in new technologies that may increase efficiency or safety, but fundamentally disrupt society and the economy it depends on.

The pervasive message is that no industry, no matter how specialized, nor how fundamentally human on its face, is immune to automation’s march toward ubiquity. And somewhere in all this rough and tangle of man vs machine is healthcare, embroiled in its own fight against regulation, price inflation, digitization, and the (supposed) devaluation of medical professionals.

The reality, of course, is that medicine is no more immune to automation than any other industry, even if it has managed to stave off its conversion to digital longer than most. But this is where the term “provider” becomes important: at no point in the near future will doctors, nurses, and other hands-on professions be entirely replaced by any combination of EHRs, wearables, or diagnostic AI. To be a provider of care in the modern, highly automated future is still to be not merely human, but highly trained, experienced, invaluable, and irreplaceable.

Dammit, Jim!

Though still planted firmly here on Earth, many of today’s physicians are feeling more empathetic than ever with Star Trek’s esteemed Doctor Leonard “Bones” McCoy, constantly reminding his peers of what a doctor is, and isn’t. Today, of course, physicians (and their patients) feel that they are being shunted out of “real” medicine to become glorified data entry clerks, a reaction to the time demands (and clunky user interfaces) of EHR systems.

But Bones also exemplifies how automation can mean augmenting, rather than replacing, physicians. If you’ve ever seen Star Trek, you’ll know that the sick bay of the future is rife with computer integration, hand-held diagnostic devices, robotic operating tables, and other instruments all seemingly aimed at rendering human error impossible. But no one questions the value or necessity of having a bona fide doctor close at hand at all times.

For those not interested in science fiction, the same principle applies: digital and automated encroachment is a matter of a shift in the scope of practice, rather than a wholesale rejection of medicine as a discipline.

Enhancement, Not Replacement

Consider the now archaic example of introducing calculators to the classroom. Despite fears that student abilities would be ruined and learning impeded, it is now normal–mandatory, depending on the subject level–for calculators to feature in the teaching of math. Having a powerful tool doesn’t eliminate the need for a high-level understanding of the underlying theory, or the need for correct application of the tools.

“The basic materials that have been taught in the laboratory for years are still relevant and important, but there is a need to incorporate critical new skills,” write professors Joel Mortensen and Beth Warning. “These new skills may displace traditional didactic and laboratory materials that have been utilized over the past decades.”

As instructors for the University of Cincinnati’s Medical Laboratory Science program, their observations are telling: the field is not going away–not by a long shot, if the Precision Medicine Initiative and the associated rise of genomic testing as a primary care component are any indication–but the nature of the work is changing.

Mortensen and Warning go on to describe a new technologist encountering an unfamiliar culture in her work:

“What was her next action? A Google search, of course!”

Doctors may wince at the prospect of internet searches entering the clinical space, but that is the reality for patients and physicians alike: our knowledge needs technological augmentation to keep up with changing facts. Striking a balance between reliance on new tech and the classical art of medicine is a matter of applied critical thinking, not a compromise of integrity or capitulation to overzealous interlopers.

Whether the tool is a calculator or a voice-activated diagnostic AI virtual assistant, critical thinking is not a nice-to-have, it is the skill that makes the whole new system work. And the sort of applied critical thinking that distinguishes medicine is still cultivated through devoted study, practice, and hard-earned expertise.

It may not look like it now, but the future of information systems design is all about user experience. Meaningful Use and its tendrils have obscured that mission by focusing on adoption at the expense of everything else. That brings us to a crossroads: adjust course to make users (clinicians) the ultimate measure of system quality, or watch the entire system stagnate for another few decades. There is no shaking up the system without the providers; technology, policy, and practice all rely on provider buy-in in the long run.

The Trade-Off

If digitization and the increasing automation of healthcare functions threaten anyone, it is the administrators. When examining the correlation between healthcare expenses and outcomes, administration is already an area flagged for reduction and role alignment. But part of the payoff for caregivers learning to cope with greater data management demands is that they are slowly eroding the need for administrators to facilitate the smooth operation of healthcare, even in an environment of burdensome regulation and scrutiny.

Robots cannot immediately replace a doctor or bedside nurse, but they can certainly automate away costly micromanagement and bureaucratic decision-making networks.

Yes: if all caregiving roles are going to be increasingly augmented by technology and rely heavily on data management as an element of modern medicine, then the education we provide, the functions caregivers perform, and the distinctions we draw between different caregiver roles–PAs, physicians, and all the various nurses–will necessarily change. Evolution has never been a question of values, tradition, or relative status. Medicine and society are transforming, and that creates great opportunity without eliminating the need for talented practitioners.

When we talk about providers, then, let there be no mistake: we are not diminishing the role of caregiving professionals; we are highlighting the individuals who make the care team, with all its high-tech amplification, function.

Edgar Wilson is an Oregon-based independent consultant who writes on trends in education, healthcare, and public policy.
