By Trevor Jamieson
Near my home are two businesses that have recently implemented mobile-pay technology. The response to very similar technology in these two locations has been drastically different.
At a local coffee chain, there are now effectively two tiers of customer: early adopters like myself, who walk past a line of 5-10 people, grab their coffee, and walk out, and everyone else, who wait in a line roughly as long as before. At a local fast food chain, where the patrons are largely high school or university students, many are early adopters, and, without question, the line (not the line to order, but the line to get your food) has grown by a good 25%. Whether I order at the counter or not, I wait longer than I used to.
Why do customers at the former wait in line? Perhaps they are less comfortable with new technology, perhaps their orders are less predictable (mobile ordering is much simpler if you consistently order from a short list of items), or perhaps they just like talking at the counter. Why aren’t customers at the latter demanding more staff in the kitchen? Observationally, waiting in line is not a painful experience: many of those waiting are actively socializing, in person or via mobile devices. The more ‘get in, get out’ customers like myself are a grumbling minority.
So, what does this all have to do with healthcare? Humans bring uncertainty and complexity.
While we often design systems and tools assuming people are deterministic automatons, they are not: they respond variably to external stimuli and are not entirely predictable. This is what makes many systems in healthcare “complex”, as opposed to “complicated”. Complicated systems can be reduced into component parts that interact in predictable ways, like building a bridge, whereas complex systems are made of highly entangled components that interact perpetually and are very challenging to predict accurately, like stock markets, the weather, or consumer behavior.
In health, systems where free-willed people are integral to the workflows will generally be complex. Closed systems of computers alone or those where the people are highly constrained or stereotypical in their behavior will generally be complicated.
At eHealth this year, there are many sessions about novel technologies and tools: apps, wearables, communication tools, and even a plenary session about going to space. On May 30, a session is dedicated to the next decade’s technological disruptions. When a new tool interacts with free-willed people, those people may react in a multitude of ways: they may use it and respond to it as intended, they may reject it, they may use it productively in unexpected ways, or they may develop occult workarounds. We see all of these reactions in health.
A fatal error in a complex system is to assume that you know exactly what will happen in advance, to then design a tool where input A always implies action B, and to then “go-live” with little to no ability to reshape the product afterwards. These systems often won’t work the way we think – forcing expensive “do-overs” or tinkering that often fails to correct the fundamental flaws.
In complex systems, we must instead deploy tools more iteratively. Rather than having “go-live” be the end, we need to move “go-live” closer to the beginning, and then build the system through incremental changes and assessments of response. This is not hacking: the choice of the next iteration and the assessment of response must be purposeful, not unlike in quality improvement. Agile models like these will be discussed in both session RF07 and a Gevity sponsor symposium on May 29.
Key to this process is data – we must know what is happening to the outcomes of interest if we are to rationally iterate. Many sessions at eHealth address the importance of data (OS01, OS05, OS10, RF03, PS03, OS22, PS06), and I’ll be checking them out.
This is not to say that every problem in healthcare is complex; those that are truly complicated can, and often should, be handled the “old way”. For the complex, however, we should never underestimate people or the challenges of accurately modeling their behavior.
At the end of the day, people are going to do what people are going to do: some may wait in line just to chat with the person at the counter, some may take whatever actions are needed to bypass the line, and others may simply not care much about the line at all. Incentives will vary across situations, demographics, individuals, and seemingly unrelated externalities. Sometimes you will only know the ultimate behavior by observing it post go-live; the key is to be agile enough to course-correct as required.
We must never forget that healthcare is a social enterprise and that, therefore, people (clinicians, patients, and everyone else who keeps the system running) are our business. They are what makes this business very hard, but they are also what often makes it extremely rewarding; deterministic automatons aren’t nearly as appreciative.