The real problem is not whether machines think, but whether men do.
– B. F. Skinner
There has been a great deal of wringing of hands and gnashing of teeth about the lack of generalized improvements in health care delivery as we approach the beginning of the end of initial electronic health record (EHR) adoption. For some, self-flagellation might appropriately be added upon realizing that the purchase and implementation of such platforms can cost up to $70,000 per care provider. By the end of this year, it is anticipated that the overall EHR market may reach the astounding figure of more than $6 billion annually.
Unfortunately, despite the cost, there is a “dirty little not-so-secret” among the majority of those recently implementing EHRs – productivity (moving patients through a health care experience and documenting the encounter) has worsened rather than improved. Most would identify with a recent health system study demonstrating that while EHR implementation may have a positive financial impact, work productivity can suffer. Another national survey of more than 200 ambulatory and hospital-based physicians across the U.S. recently reported that when it comes to working efficiency, the overwhelming majority are underwhelmed: two-thirds (66%) said they were seeing fewer patients than before, and the vast majority (85%) reported spending more time documenting visits.
A small number of organizations, namely those that have been in the EHR game for much longer periods of time, have realized improvements in clinical work flow and productivity using these tools. But I will go out on a relatively strong limb and say that, at the moment, these organizations are the exception rather than the rule.
Is this a cause for alarm? Were the billions of dollars transferred from the many (health systems everywhere) to the few (the handful of EHR vendors located in Madison, Wisconsin and a couple of other locations) ill-advised? As is often the case, there are some interesting lessons from history that can be brought to bear.
It is useful to begin with the concept of the “general purpose technology” (GPT). A GPT is a technology that can affect an entire economy and alter economic and social structures. These technologies also usually have three more specific characteristics – pervasiveness, the ability to improve over time and lower costs for its users, and the capacity to spur the invention and production of new products and processes. As a GPT can be a product, process or system, notable examples would include the domestication of plants and animals, the wheel, mass production, electricity, nuclear energy and the computer.
One could imagine a long list of GPTs specific to medicine, including such things as general anesthesia and the X-ray. What I would suggest is that the EHR is a prototypical GPT for health care – one that meets all of the foregoing criteria. It has already had an impact on the economics and structure of care, and it is pervasive. It has proven its ability to improve over time, and it is directly responsible for the digital technology-enabled revolution in health care, thereby dramatically fulfilling the requirement of enabling new products and processes.
However, the most fascinating thing about our current EHR experience – and the important historical lesson I allude to – has nothing to do with those important characteristics, but rather with the answer to this question: how could the adoption of a health care GPT with so much promise lead to a decrease in industry productivity?
In their recent book, The Second Machine Age, MIT’s Erik Brynjolfsson and Andrew McAfee discuss the concept of the “productivity paradox,” referencing some of Brynjolfsson’s earlier work on the topic as well as work by Chad Syverson at the University of Chicago and Paul David at Stanford. David sums up the overarching concept nicely by saying that if we consider “thinking about transitions from established technological regimes to their respective successor regimes, many features of the so called productivity paradox will be found to be neither so unprecedented nor so puzzling as they might otherwise appear.”
What they tell us is that there were two previous, unanticipated slowdowns in productivity that caused great consternation – one accompanying the adoption of electricity, the other the adoption of the computer. The issues in both cases were the protracted time it took to diffuse the new technology, the difficulty of using it alongside the pre-existing technology, and the misconception that the new technology should be used in the same context as the older one.
Although the technology needed to electrify manufacturing was available in the early 1890s, it was not fully adopted for about thirty years. Many tried to use the technology alongside or in conjunction with steam-driven engines – creating all manner of work-flow challenges – and it took some time to understand that it was more efficient to run electrical wires to small, peripheral electric motors at each workstation than to connect a single large, centrally located motor to the drive shafts and pulleys originally built to disperse steam-generated power. The sum of these activities resulted in a significant, and unanticipated, lag in industrial productivity between 1890 and 1920.
It does not take a great deal of inductive logic to see the similarities between this period and the initial use of computing in business and manufacturing – slow diffusion, parallel technology use, and attempts to use the new in the old context. So, not surprisingly, a strikingly similar slowdown in productivity was noted between the late 1970s and the mid-1990s – prompting the Nobel (Riksbank) Prize-winning MIT economist Robert Solow to remark that “we see the computer age everywhere, except in the productivity statistics.”
I hope that those reading this have already connected the dots to our current EHR “productivity paradox” in health care. Although most have adopted, or are adopting, these systems, it has been a relatively long diffusion period, and similar to the previous two examples, we are living in a period of parallel technology use (digital and paper) in most institutions, even those that are “paperless.” Finally, we are so embedded in the use of this medical GPT that we cannot stop to “see” that we are using it in old contexts – but we are.
The good news, however, is substantial. In the two decades following the adoption of both electricity and the computer, a significant acceleration of productivity was enjoyed. The secret lay in the ability to change the context (taking the pulleys down, for example, in the case of electrification) – completely overhauling the business process and environment, and spawning the new processes, tools and adjuncts that capitalized on the GPT.
We do not fully grasp what the new context in health care will be, or what exploring this particular frontier will reveal. But we know that it will involve the use of GPT-additive digital technologies in entirely new ways, within new models of both the work flow and delivery of health care, and that productivity will likely improve dramatically as a result. We may also begin to think of the concept of productivity itself much differently (i.e., what “patient throughput” actually means).
Regardless of the future onslaught of “artificial intelligence,” human creativity will be among the last things to fall to the inanimate. We have, as Skinner reminds us, the unique ability to leverage technology – and not vice versa – to make our brief lives better.