Welcome to the US future: burgeoning favelas leavened only by free Wi-Fi. Much of this has a dystopian, Blade Runner feel, and it is striking how much of this economic futurology comes from the US. The more sober UK debate is concerned with deciphering the empirics of the recent past rather than conjecturing about the future.
At the London School of Economics, the respected economist Alan Manning, who has led work on the polarizing impact of technology on the UK job market, laments only half-jokingly that he would like to have the time to develop a new subdiscipline on “science-fiction economics.” It would bring rigor to our understanding of possible societies in which machines do radically more and humans less. For now, Britain looks overseas for visions of where the robots may lead the country.
As with all prophecies of doom, or indeed those of an impending economic boom, we should treat such visions with caution. Predictions about the uniquely transformational yet job-killing impact of technological change are as old as capitalism itself. There has never been an era without plausible experts warning the population that they are on the cusp of a new — usually scary — world resulting from technological breakthrough. Occasionally they are not wrong; mostly they are. Which is not to downplay technology as the motor of economic change. Time and again — from spinning wheel to steam engine — it has had disruptive implications for the workforce. However, labor displaced from field or factory eventually found new, more productive roles, demand expanded, and living standards rose.
However, the lag can be a long one. Not long before his death in 1873, John Stuart Mill remarked that the industrial revolution had not yet had much impact. This seemed an extraordinary observation, but it captured at least a partial truth. As the economic historian Brad DeLong has shown, from 1800 to 1870 real working-class wages grew at just 0.4 percent a year, before tripling to 1.2 percent from 1870 to 1950 (reaching almost 2 percent in the golden postwar decades). Similarly, we are yet to experience the true gain, whatever it turns out to be, as well as the pain, of the robot era.
To get a better sense of the impact of technology on the labor market, we do not need to rely entirely on frothy speculation about the future. There is a decade or more of research to draw on. The rise of information and communications technology (ICT) is hardly new. The dominant view is that it has already been eroding a swath of jobs that involve repetitive tasks capable of being automated and digitized. This has disproportionately affected roles in the middle of the income distribution, such as manufacturing, warehousing and administration.
This does not result in lower overall employment — for most economists the main change is to job quality, not quantity. There has been rapid growth in demand for high-skill roles involving regular interaction with ICT, as well as a rise in lower-paid work that is very hard to automate — from caring to hospitality. Consequently the balance of employment has shifted upwards and downwards with less in between; as Manning puts it, the labor market has been polarizing into “lovely and lousy jobs.” The impact of technology has been gradual, but inexorable — “it only goes one way,” he tells me. In some sectors the decline in employment and relative pay has been dramatic: The typical heavy goods driver receives less in real terms today than a generation ago.