Because information technology (IT) has so quickly transformed people’s daily lives, we tend to forget how much things have changed from the not-so-distant past. Today, millions of people around the world regularly shop online; download entire movies, books and other media onto wireless devices; bank at ATMs wherever they choose; and self-book entire trips and check themselves in at airports electronically.
However, there is one sector of our lives where adoption of information technology has lagged conspicuously — healthcare.
Some parts of the world are doing better than others in this respect. Researchers from the Commonwealth Fund recently reported that some high-income countries, including the UK, Australia and New Zealand, have made great strides in encouraging the use of electronic medical records (EMR) among primary-care physicians. Indeed, in those countries, the practice is now nearly universal.
Yet some other high-income countries, such as the US and Canada, are not keeping up. EMR usage in the US, the home of Apple and Google, stands at only 69 percent.
The situation in the US is particularly glaring, given that healthcare accounts for a bigger share of GDP than manufacturing, retail, finance or insurance. Moreover, most health IT systems in use in the US today are designed primarily to facilitate efficient billing, rather than efficient healthcare, putting the business interests of hospitals and clinics ahead of the needs of doctors and patients.
That is why many Americans can easily go online and check the health of their bank account, but cannot check the results of their most recent laboratory work.
A second way that health IT in the US differs from IT in other industries is its lack of interoperability: a hospital’s IT system often cannot “talk” to others. Even hospitals that belong to the same system sometimes struggle to share patient information.
As a result, today’s health IT systems act more like a frequent-flyer card designed to enforce customer loyalty to a particular hospital than like an ATM card that enables you and your doctor to access your health information whenever and wherever it is needed. Ordinarily, this lack of interoperability is an irritating inconvenience. In a medical emergency, it can impose life-threatening delays in care.
A third way that health IT in the US differs from consumer IT is usability. The design of most consumer websites is so intuitive that no instructions are needed to use them. Within minutes, a seven-year-old girl can teach herself to play a complex game on an iPad.
However, a newly hired neurosurgeon with 27 years of education may have to read a thick user manual, attend tedious classes and accept periodic tutoring from a “change champion” to master the various steps required to use his hospital’s IT system. Not surprisingly, despite its theoretical benefits, health IT has few fans among healthcare providers. In fact, many complain that it slows them down.
Does this mean that healthcare IT is a waste of time and money?
In 2005, colleagues of ours at the RAND Corporation projected that the US could save more than US$80 billion a year if healthcare could replicate the IT-driven productivity gains observed in other industries. The fact that the US has not yet realized those gains reflects not a failure of vision, but less-than-ideal implementation.