When Henry Ford arranged production of his vehicles on an assembly line, he was laying the foundation for industrial robots to take over the bulk of construction and assembly from humans. It doesn’t matter that such advanced robots didn’t exist when the assembly line revolutionized manufacturing; the important thing is that it standardized the functions of the laborers.
It is worth remembering that the word “robot” derives from the Czech robota, meaning “forced labor” or “drudgery.” Robots aren’t meant to take our jobs, per se–they are designed to do our work. It just happens that our jobs consist mostly of work.
The more standardized the work of a given job, the more readily automation can take over for the human laborers. So while America bemoaned the export of its manufacturing sector to China, the Chinese labor market is now experiencing sympathy pains as robots eliminate the need to outsource that work to humans at all.
Only Human No More
Thanks to more than a century of obsession with standardizing just about everything, we now face the reality of automation spilling over from the places where it is expected, or at least tolerated, into areas where we are challenged to defend why certain functions must be performed by humans at all.
The parallel between manufacturing and medicine, with respect to automation, is sound. Beyond standardization, the places where robots look poised to take over soonest are also the places where people have long been expected to work seamlessly with machines. People are generally comfortable with this in manufacturing, because manufacturing has always obliged workers to use tools.
But healthcare specialists of various stripes are also expected, as a matter of routine, to use tools and increasingly smart machines. Most obviously, the implementation of Electronic Health Records (EHRs) showed how an entire industry can be shifted toward greater standardization and automation. Every point of care is now driven by documentation and EHR workflows, and by pressure from administrators, payers, and regulators, all counting on EHR data to describe outcomes, quality of care, and more.
People complained–and continue to complain–that in healthcare, they are not robots. Industry and technology reply with a collective shrug: not yet.
It isn’t all for the worse. Indeed, EHRs were rolled out on the premise that they, and the behaviors they require, would collectively improve patient care and clinical performance, all while saving money (at least, in the long run).
But the automation revolution in healthcare goes beyond record-keeping and data entry. Standardization in the diagnostic lab has already shown how adding machines to take over tasks formerly performed by technicians turns standard procedures into automated ones. Even a procedure as simple as staining a slide for analysis, once machine-assisted, replaces the variability of manual labor with robotic precision.
Siemens has a virtual library of case studies on laboratory automation, and all of them point toward better outcomes, greater efficiency, better use of staff time, and cost savings. Each example turns one or two standardized procedures into the busywork of robots, converting previously tolerated or ignored variables into hyper-precise diagnostic (and, by extension, clinical) outcomes. It also means less need for lab technicians on site, as previously time-consuming tasks can be offloaded to tireless machines.
Manual Actions, Cognitive Tasks
Remember that “robots” are servants, laborers. That doesn’t limit their utility to manual functions. Inside the radiology lab, we see another example of automation, casting further doubt on which functions humans are better equipped than robots to perform.
Which role can be more easily automated: the radiologic technician, or the radiologist? On the one hand, you have professionals working with patients–human interaction–as well as handling equipment. Our lab example has already shown that the latter function can benefit from the robot’s touch. On the other hand, you have professionals interpreting imaging results and making diagnoses. Critical thinking, judgment, experience–this is the stuff that humans and experts are made of.
It turns out that supercomputers–thinking robots that interact with data rather than with objects–can be trained to become fairly astute diagnosticians. These e-consultants are programmed to analyze the output of each of the most common radiologic imaging technologies and produce an assessment consistent with the best, most current medical knowledge.
Industry experts are already asking not just if, but when it will become mandatory for all such diagnostic judgments to come from robots rather than physicians. Meanwhile, no one is arguing too hard for the technicians guiding patients through the imaging process to be replaced by some combination of moving sidewalks and robot aides. But in a future where working and living alongside robots is normal (preferable, even), will patients really need a human hand to hold as they sit down for an X-ray?
Robots for the Masses
The full potential of automation isn’t just in precision or task-specific performance, but in its ability to scale. We can see what that looks like through the lens of one final area of diagnosis: epidemics. Consider: when does something become an epidemic, who makes that determination, and how does that influence treatment?
Unlike other diagnostic roles, this one falls under the public health umbrella, often an assortment of government agencies (like the Centers for Disease Control and Prevention or a Department of Health), non-profits (like the Red Cross), and multinational organizations (such as the World Health Organization). These organizations comprise clinical experts as well as strategists, analysts, and humanitarian professionals, all working with the collective mission of anticipating, identifying, and intervening where disease threatens populations.
Technology is changing the human element of these roles as well, especially where detection and diagnosis are concerned. As typists and taxi drivers can attest, it isn’t just robots that can displace human professionals, but systems and programs like Microsoft Word or Uber. Rather than reading slides or X-rays, automated programs and algorithms track social media, news, and other public resources for patterns suggestive of spreading disease. This automated disease-mapping allows agencies to visualize patterns that would otherwise be impossible to detect, and considerably speeds up response time and planning support for any real or potential outbreak.
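As a highly simplified sketch of that pattern-detection idea, the snippet below counts symptom-keyword mentions per day in a stream of posts and flags days with unusually high counts. The keywords, sample posts, and spike threshold are all illustrative assumptions; real surveillance systems rely on far richer data sources and statistical models.

```python
from collections import Counter

# Illustrative symptom keywords; a real system would draw on curated
# medical vocabularies in many languages.
KEYWORDS = {"fever", "cough", "rash"}

def daily_mentions(posts):
    """Count, per day, the posts mentioning at least one watched keyword.

    `posts` is an iterable of (day, text) pairs.
    """
    counts = Counter()
    for day, text in posts:
        if set(text.lower().split()) & KEYWORDS:
            counts[day] += 1
    return counts

def flag_spikes(counts, threshold=1.2):
    """Flag days whose mention count exceeds `threshold` times the
    average daily count -- a crude stand-in for real anomaly detection."""
    if not counts:
        return []
    avg = sum(counts.values()) / len(counts)
    return sorted(day for day, n in counts.items() if n > threshold * avg)

# Hypothetical posts, as (day, text) pairs
posts = [
    (1, "nice weather today"),
    (1, "slight cough this morning"),
    (2, "high fever and a bad cough"),
    (2, "everyone at work has a fever"),
    (2, "fever still will not break"),
    (3, "back to normal"),
]
counts = daily_mentions(posts)
print(flag_spikes(counts))  # → [2]
```

Day 2 is flagged because three of its posts mention a watched symptom, well above the average across observed days; the point is only that such flagging can be fully automated end to end.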
It isn’t a stretch to imagine these electronic agents being given the autonomy to, say, publish a travel warning, alert area hospitals to be on the lookout for certain patients or symptoms, or even begin redirecting transit traffic to mitigate and control the spread of disease.
Once the robots master a single function, their utility tends to spread. Anything that can be standardized–from emergency protocol to preparing equipment–is fodder for automation. In the end, EHRs may be the worst example of technology taking over the healthcare sector; thus far, EHRs still don’t follow a single standard. For now, at least, humans are still essential to making the data move.