Your smartphone allows you to get almost instantaneous answers to the most obscure questions. It also allows you to waste hours scrolling through Facebook or looking for the latest deals on Amazon.
More powerful computing systems can predict the weather better than any meteorologist or beat human champions in complex board games such as chess.
However, for several years, economists have asked why all that technical wizardry seems to be having so little effect on the economy. The issue surfaced again recently, when the government reported disappointingly slow growth and continuing stagnation in productivity. The rate of productivity growth from 2011 to last year was the slowest since the five-year period ending in 1982.
One place to look at this disconnect is in the doctor’s office. Peter Sutherland, a family physician in Tennessee, shifted from paper to computerized patient records in the past few years. There are benefits to using electronic health records, but grappling with the software and new reporting requirements has slowed him down, he said.
He sees fewer patients and his income has slipped.
“I’m working harder and getting a little less,” he said.
The productivity puzzle has given rise to a number of explanations in recent years — and divided economists into technology pessimists and optimists.
The most prominent pessimist is Robert Gordon, an economist at Northwestern University. His latest entry in the debate is his new book, The Rise and Fall of American Growth. Gordon contends that the current crop of digital innovations does not yield the big economic gains of breakthrough inventions of the past, like electricity, cars, planes and antibiotics.
The optimists are led by Erik Brynjolfsson and Andrew McAfee, codirectors of the MIT Initiative on the Digital Economy. They argue that there have always been lags between when technology arrives and when people and institutions learn to use it effectively. That has been true for a range of technologies, including the electric motor and the Internet, which contributed to the last stretch of healthy productivity growth in the late 1990s and early 2000s.
The gains from current tech trends like big-data analysis, artificial intelligence and robotics will come, they say. Just wait.
Some economists insist the problem is largely a measurement gap, because many digital goods and services are not accurately captured in official statistics. However, a recent study by two economists from the US Federal Reserve and one from the IMF casts doubt on that theory.
Technology spending has been robust, rising 54 percent over a decade to US$727 billion last year, according to the research firm International Data Corp. Despite all the smartphone sales to consumers, most of the spending is by companies investing in technology to increase growth and productivity.
However, an industry-by-industry analysis, published by the McKinsey Global Institute, the research arm of the consulting firm McKinsey & Co, found that the march of digital technology across the economy has a long way to go.
The McKinsey researchers examined 22 industries, measuring not only investment, but also the use of technology to change how work is done. Some industries, like technology, media and financial services, were well along, while others, like healthcare and hospitality, trailed.
Only 18 percent of the US economy is living up to its “digital potential,” the report concluded.
Moreover, if lagging industries do not catch up, we will not see much of a change in national economic statistics, McKinsey Global Institute director James Manyika said.
Since the financial crisis, the administration of US President Barack Obama has moved aggressively to push medicine into the digital age. As part of the economic recovery package, the US Congress enacted the Health Information Technology for Economic and Clinical Health Act of 2009. The legislation provided for federal incentive payments of US$44,000 a physician to shift to electronic health records.
The billions of US dollars in subsidies were intended to accelerate adoption. And from 2008 to 2014, the share of hospitals with electronic health records rose to 75 percent from 9 percent, while the adoption rate in doctors’ offices rose to 51 percent from 17 percent, according to the most recent surveys by the American Hospital Association and the government.
“The government funding has made a huge difference,” said Ashish Jha, a professor at the Harvard School of Public Health. “But we’re seeing little evidence so far that all this technology has had much effect on quality and costs.”
The electronic records represent only a first step toward curbing costs and improving care, health experts say.
“People confuse information automation with creating the kind of work environment where productivity and creativity can flourish,” said David Brailer, who was the national health technology coordinator in former US president George W. Bush’s administration. “And so little has gone into changing work so far.”
The government incentives came with timetables for adopting different levels of use and new reporting requirements, with the prospect of financial penalties for doctors and hospitals that fell behind. The early goals for adopting electronic records were reasonable, but the later stages were too aggressive, health experts say.
Overwhelmed doctors protested and the administration recently shelved the previous timetable, stretching out schedules and modifying some reporting rules.
Healthstar Physicians, the 50-doctor group in Morristown, Tennessee, where Sutherland practices, was spurred to go electronic by those federal incentive payments, which now total US$32 billion. However, the cultural adjustment to digital technology has been challenging.
Sutherland and his colleagues evaluated several technology providers and chose Athena Health, which does not sell software, but is paid a percentage of its customers’ revenue. Healthstar started using Athena’s cloud software in 2012, first for billing and then for electronic health records. Athena’s share is less than 5 percent of the group’s revenue.
Today, Sutherland’s personal income and the medical group’s revenue are about 8 percent below where they were four years ago. However, last year, both his earnings and the revenue of Healthstar, which employs 350 people in 10 clinics, increased slightly, by nearly 3 percent from 2014.
Sutherland decided he did not want a computer screen separating him from his patients. So he opted for a tablet computer, making it easier to keep eye contact.
Not a fast typist, Sutherland decided to use voice recognition software. For six months, he stayed up until midnight most nights, training the software until its speech recognition engine could transcribe his comments into text with few mistakes.
Sutherland bemoans the countless data fields he must fill in to comply with government-mandated reporting rules, and he concedes that some of his colleagues hate using digital records.
Yet Sutherland is no hater. Despite the extra work the new technology has created and even though it has not yet had the expected financial payoff, he thinks it has helped him provide better information to patients.
He values being able to tap the screen to look up potentially harmful drug interactions and to teach patients during visits. He can, for example, quickly create charts to show diabetes patients how they are progressing with treatment plans, managing blood glucose levels and weight loss.
He is working harder, Sutherland said, but he believes he is a better doctor.
Blunt measures of productivity are not everything, he added.
“My patients are better served,” he said. “And I’m happier.”