27 March 2019 – A reader of last week’s column, in which I reported recent opinions voiced by a few automation experts at February’s Conference on the Future of Work held at Stanford University, informed me of a chapter from Henry Hazlitt’s 1988 book Economics in One Lesson that Australian computer scientist Steven Shaw uploaded to his blog.
I’m not going to get into the tangled web of potential copyright infringement that Shaw’s posting of Hazlitt’s entire text opens up; I’ve simply linked to the most convenient-to-read posting of that particular chapter. If you follow the link and want to buy the book, I’ve given you the appropriate link as well.
The chapter is of immense value apropos the question of whether automation generally reduces the need for human labor, or creates more opportunities for humans to gain useful employment. Specifically, it looks at the results of a number of historic events where Luddites excoriated technology developers for taking away jobs from humans only to have subsequent developments prove them spectacularly wrong.
Hazlitt’s classic book is, not surprisingly for a classic, well documented, authoritative, and extremely readable. I’m not going to pretend to provide an alternative here, but rather to summarize some of the chapter’s examples in the hope that you’ll be intrigued enough to seek out the original.
Luddism
Before getting on to the examples, let’s start by looking at the history of Luddism. It’s not a new story, really. It probably dates back to just after cave guys first thought of specialization of labor.
That is, sometime in the prehistoric past, some blokes were found to be especially good at doing some things, and the rest of the tribe came up with the idea of letting, say, the best potters make pots for the whole tribe, and everyone else rewarding them for a job well done by, say, giving them choice caribou parts for dinner.
Eventually, they had the best flint knappers make the arrowheads, the best fletchers put the arrowheads on the arrows, the best bowmakers make the bows, and so on. Division of labor into different jobs turned out to be so spectacularly successful that the rugged individualists who pretend to do everything for themselves are now few and far between (and are largely kidding themselves, anyway).
Since then, anyone who comes up with a great way to do anything more efficiently runs the risk of having the folks who spent years learning to do it the old way land on him (or her) like a ton of bricks.
It’s generally a lot easier to throw rocks to drive the innovator away than to adapt to the innovation.
The Luddites of the early nineteenth century were organized bands of workers who violently resisted the mechanization of factories during the Industrial Revolution. They took their name from an imaginary character, Ned Ludd, supposedly an apprentice who smashed two stocking frames in 1779 and whose name became emblematic of machine breakers. The term “Luddite” has since come to mean anyone fanatically opposed to deploying advanced technology.
Of course, like religious fundamentalists, they have to pick a point in time to separate “good” technology from the “bad.” Unlike religious fanatics, who generally pick the publication of a certain text as the dividing line, Luddites draw their line between the technology of their immediate past (with which they are familiar) and anything new or unfamiliar. Thus, it’s a continually moving target.
In either case, the dividing line is fundamentally arbitrary, so the emotional response it provokes is irrational. And irrational positions come with a near-guarantee of being contrary to the facts.
What Happens Next
Hazlitt points out, “The belief that machines cause unemployment, when held with any logical consistency, leads to preposterous conclusions.” He notes that on the second page of the first chapter of Adam Smith’s seminal book Wealth of Nations, Smith tells us that a workman unacquainted with the use of machinery employed in sewing-pin-making “could scarce make one pin a day, and certainly could not make twenty,” but with the use of the machinery he can make 4,800 pins a day. So, zero-sum game theory would indicate an immediate 99.98 percent unemployment rate in the pin-making industry of 1776.
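For the spreadsheet-inclined, here’s a minimal sketch of that zero-sum arithmetic, using only the figures Hazlitt quotes from Smith. The “demand stays fixed” assumption is precisely the zero-sum premise under test.

```python
# Zero-sum arithmetic behind the 99.98 percent figure.
# Figures are from Smith's pin-making example as quoted by Hazlitt; the
# fixed-demand assumption is the zero-sum premise, not a claim about reality.

pins_per_day_by_hand = 1          # "could scarce make one pin a day"
pins_per_day_with_machine = 4_800

# If total pin demand stayed fixed, this fraction of hand workers is still needed:
workers_still_needed = pins_per_day_by_hand / pins_per_day_with_machine

implied_unemployment = 1 - workers_still_needed
print(f"Implied unemployment under fixed demand: {implied_unemployment:.2%}")
# -> 99.98% -- which is not what happened, because demand did not stay fixed.
```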
Did that happen? No, because economics is not a zero-sum game. Sewing pins went from dear to cheap. Since they were now cheap, folks prized them less and discarded them more (when was the last time you bothered to straighten a bent pin?), and more folks could afford to buy them in the first place. That led to an increase in sewing-pin sales as well as sales of things like sewing patterns and bulk fine fabric sold to home sewers, and more employment, not less.
Similar results obtained in the stocking industry when new stocking frames (the original having been invented by William Lee in 1589, but denied a patent by Elizabeth I, who feared its effects on employment in the hand-knitting industries) were protested by Luddites as fast as they could be introduced. Before the end of the nineteenth century the stocking industry was employing at least a hundred men for every man it employed at the beginning of the century.
Another example Hazlitt presents from the Industrial Revolution happened in the cotton-spinning industry. He says: “Arkwright invented his cotton-spinning machinery in 1760. At that time it was estimated that there were in England 5,200 spinners using spinning wheels, and 2,700 weavers—in all, 7,900 persons engaged in the production of cotton textiles. The introduction of Arkwright’s invention was opposed on the ground that it threatened the livelihood of the workers, and the opposition had to be put down by force. Yet in 1787—twenty-seven years after the invention appeared—a parliamentary inquiry showed that the number of persons actually engaged in the spinning and weaving of cotton had risen from 7,900 to 320,000, an increase of 4,400 percent.”
As these examples indicate, improvements in manufacturing efficiency generally lead to reductions in manufacturing cost, which, when passed along to customers, show up as lower prices with concomitant increases in unit sales. This is price elasticity of demand, straight out of Microeconomics 101, and it is the reason economics is decidedly not a zero-sum game.
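To make the elasticity point concrete, here’s a toy calculation with invented numbers: a hypothetical 20 percent price cut and an assumed elasticity of −2. Neither figure comes from Hazlitt; they’re just illustrative of what “elastic demand” means.

```python
# Toy illustration of elastic demand (invented numbers, not Hazlitt's).
# If a cost reduction is passed along as a lower price and demand is elastic
# (|elasticity| > 1), unit sales rise faster than the price falls.

old_price, new_price = 10.0, 8.0     # hypothetical 20 percent price cut
elasticity = -2.0                    # hypothetical: demand is elastic

pct_price_change = (new_price - old_price) / old_price   # -0.20
pct_quantity_change = elasticity * pct_price_change      # +0.40

old_quantity = 1_000
new_quantity = old_quantity * (1 + pct_quantity_change)  # 1,400 units

print(f"Revenue before: {old_price * old_quantity:,.0f}")   # 10,000
print(f"Revenue after:  {new_price * new_quantity:,.0f}")   # 11,200
# More units sold and more revenue -- room for more employment, not less,
# provided the extra output per worker doesn't outrun the extra demand.
```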
If we accept that economics is not a zero-sum game, predicting what happens when automation makes it possible to produce more stuff with fewer workers becomes a chancy proposition. For example, many economists today blame flat productivity (the amount of stuff produced divided by the number of workers needed to produce it) for lack of wage gains in the face of low unemployment. If that is true, then anything that would help raise productivity (such as automation) should be welcome.
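For the record, here’s the back-of-the-envelope version of that parenthetical definition of productivity, with invented numbers:

```python
# Back-of-the-envelope sketch of the productivity definition used above
# (output divided by the workers needed to produce it); numbers are invented.

def labor_productivity(output_units: float, workers: float) -> float:
    """Output per worker -- the ratio the column calls 'productivity'."""
    return output_units / workers

before = labor_productivity(output_units=100_000, workers=100)  # 1,000 units/worker
after = labor_productivity(output_units=130_000, workers=100)   # automation raises output

print(f"Productivity gain: {(after / before - 1):.0%}")          # 30%
# The column's point: if flat productivity is what is holding wages down,
# then a gain like this is a precondition for wage growth, not a threat to it.
```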
Long experience has taught us that economics is a positive-sum game. In the face of technological advancement, it behooves us to expect positive outcomes while taking measures to ensure that the concomitant economic gains get distributed fairly (whatever that means) throughout society. That is the take-home lesson from the social dislocations that accompanied the technological advancements of the early Industrial Revolution.