
Were the Luddites Right?

In 1949 Norbert Wiener, the father of cybernetics, wrote an urgent letter to Walter Reuther, the president of the United Automobile Workers. Wiener warned Reuther that the combination of production machinery with computing machines would soon yield an "apparatus [that] is extremely flexible, and susceptible to mass production, and will undoubtedly lead to the factory without employees; as for example, the automatic automobile assembly line." Wiener ominously concluded, "In the hands of the present industrial set-up, the unemployment produced by such plants can only be disastrous."

The mass unemployment that Wiener predicted did not occur. As technology advanced, the number of employed workers in the United States increased from 59 million in 1950 to a peak of 146 million in 2007, and GDP grew from $2 trillion in 1950 to $13.6 trillion in 2012 (in 2005 dollars).

Now, two centuries after the original Luddites smashed then-newfangled weaving frames in northern England, predictions of permanent technological unemployment have been revived. In a December working paper for the National Bureau of Economic Research titled "Smart Machines and Long-Term Misery," Columbia University economist Jeffrey Sachs and Boston University economist Laurence Kotlikoff ask, "What if machines are getting so smart, thanks to their microprocessor brains, that they no longer need unskilled labor to operate?"

Sachs and Kotlikoff are not alone in worrying how technological progress will affect employment. Progress "has sped up so much that it's left a lot of people behind," write Erik Brynjolfsson and Andrew McAfee of MIT's Center for Digital Business in their 2011 book Race Against the Machine. "Many workers, in short, are losing the race against the machine." In a 2011 McKinsey Quarterly article, Santa Fe Institute economist Brian Arthur describes automation as "a second economy that's vast, automatic, and invisible." In Arthur's view, "The primary cause of all of the downsizing we've had since the mid-1990s is that a lot of human jobs are disappearing into the second economy. Not to reappear."

As evidence that American workers are losing to the machines, Brynjolfsson and McAfee point to falling real wages for unskilled workers in the United States. The Economic Policy Institute's 12th report on "The State of Working America" reveals that between 1973 and 2011 inflation-adjusted hourly wages fell nearly 30 percent for men without high school diplomas, 15 percent for high school graduates with no college, and 10 percent for men who started college but did not earn a degree. College graduates, by contrast, were earning 12 percent more, and wages for men with graduate degrees were up 30 percent.

According to a recent study by the Cleveland Federal Reserve Bank, labor's share of our gross national income has fallen from 65 percent in the 1980s to 58 percent today, meaning that a larger percentage of national income is going to the owners of capital, i.e., the owners of machines. Sachs and Kotlikoff worry that "machines, after all, are a form of capital, and the higher income they earn based on better machine brains may show up as a return to capital, not labor income."

The two economists devise an admittedly simple economic model in which skilled owners of smart machines reap most of the benefits of increased productivity and economic growth. This competition with the smart machines depresses the wages of young unskilled workers, limiting their ability to save and invest in skill acquisition and smart machines of their own. Thus each subsequent generation of young unskilled workers faces an economy in which ever less human and physical capital is available to them, further depressing their wages.
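The dynamic the paper describes is a feedback loop: depressed wages mean less saving, less saving means less capital for the next generation, and less capital depresses wages further. A minimal Python sketch can make that loop concrete. To be clear, this is not Sachs and Kotlikoff's actual overlapping-generations model; the function name, parameters, and functional form below are illustrative assumptions chosen only to show the spiral.

```python
# Toy sketch of the wage/saving feedback loop (NOT the Sachs-Kotlikoff
# model): assume each generation's unskilled wage is a mix of what the
# machine owners leave to labor and the capital the young workers
# themselves managed to accumulate.
def simulate(generations=5, wage=100.0, savings_rate=0.1,
             machine_share=0.6):
    """Return the unskilled wage of each successive generation when
    machine owners capture `machine_share` of output and workers can
    save only `savings_rate` of their wage."""
    wages = []
    capital = wage * savings_rate          # what the first young cohort saves
    for _ in range(generations):
        wages.append(wage)
        # Next generation's wage: the share left to labor, plus the
        # return on the small capital stock workers could build.
        wage = wage * (1 - machine_share) + capital
        capital = wage * savings_rate      # smaller wage -> smaller savings
    return wages
```

Under these (invented) parameters each generation earns strictly less than the last, which is the "immiserizing" trajectory the authors worry about; with a higher savings rate or a smaller machine share, the same loop can instead stabilize.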

If the effect of technology on jobs really is different this time, what should be done? To prevent the "immiserizing" of young unskilled workers, Sachs and Kotlikoff argue, the government should tax away some of the "windfall" enjoyed by the owners of capital.

Brynjolfsson and McAfee prefer more spending on education. This is a puzzling recommendation, since they earlier note that the education sector "lags as an adopter of information technologies." Even more oddly, they do not wonder why that might be. (Two words: government monopoly.) They do, however, recognize that the rise of online schooling could have a big beneficial impact on labor-force skills. They also advocate policies such as aggressively lowering the barriers to business creation, resisting efforts to regulate hiring and firing, reducing payroll taxes, decoupling benefits from jobs, not rushing to regulate new network businesses, streamlining the patent system, and shortening copyright terms. Such sensible reforms should be adopted whether or not technological unemployment is a problem.

The Santa Fe Institute's Brian Arthur looks at the longer-term implications of the second economy. Smart machines will boost economic growth and prosperity indefinitely, but such an economy may not provide jobs. Until now, compensation for labor has been how people gained access to the growing prosperity that increasing productivity made possible. "The second economy will produce wealth no matter what we do," Arthur argues. "Distributing that wealth has become the main problem." Perhaps we will work less. After all, in 1900 Americans worked an average of 2,300 hours per year; now it's 1,800.

Or, as has happened so far, entirely new economic sectors could come into existence, providing work for future generations. In 1985 there were just 340,000 mobile phone subscribers in the United States; today there are more than 321 million. More broadly, increasing productivity lowers the prices of goods and services, leaving consumers more disposable income to spend on other goods and services. In 1950 American families spent 18 percent of their food budgets on dining out. Today they spend 40 percent. In 1972 the U.S. had one restaurant for every 430 Americans. Today it's one for every 320.

Instead of racing against the machines, Brynjolfsson and McAfee argue, we should race with them. Of course, as rising productivity shows, that's exactly what we have been doing. In the industrial era, machines largely complemented and substituted for human brawn; now they are complementing and substituting for human brains. To better race with increasingly smart machines, we will become more intimate with them by incorporating them into our bodies and brains.

Already polls reveal that most of us suffer from nomophobia, the fear of being without our mobile phones. Babak Parviz, head of the Google Glass project, suggested in the February 2013 issue of Wired that digital displays will someday be incorporated into contact lenses. Such lenses would eliminate the need for displays on phones, computers, and TV sets while providing the wearer access to an augmented reality in which information is overlaid on whatever she is looking at. The stuttering first steps toward brain/computer interfaces are being taken today.

If governments take the good advice offered by Brynjolfsson and McAfee, we can have a future in which tens of millions of micro-entrepreneurs provide ever more specialized goods and services to ever more discerning consumers. The resulting prosperity could well free people to employ themselves in tackling scarcities in other facets of life, such as liberty, love, and time. I wouldn't want to break the machines that will make that possible.
