Technology Feels Like It’s Accelerating—Because You’ve Been Watching Too Many TED Talks

Robert D. Atkinson November 1, 2019

(Ed. Note: The “Innovation Fact of the Week” appears as a regular feature in each edition of ITIF’s weekly email newsletter. Sign up today.)

It’s increasingly common these days for culture watchers, business soothsayers, and academics to premise commentary pieces and TED Talks on the widely accepted idea that “the pace of change is accelerating.” To emphasize the point, some tack on the claim that it is doing so “exponentially.”

This consensus has been building for several years, which may stand to reason, since readers and viewers are as likely as not to consume the commentary on smartphones more powerful than the personal computers of the past. As one headline breathlessly declared in 2016: “Technology Feels Like It’s Accelerating—Because It Actually Is.”

Except that, no, actually it is not. In fact, compared to some prior periods, the pace of change today is likely slower. Why does this matter? Because beliefs about an ever-accelerating pace of technological change provide fuel for Neo-Luddite, anti-technology fires. If the pace of change really were unprecedented, then conventional wisdom holds we’d better darn well slow it down, so no one gets hurt.

Either way, the commentators warn, “buckle up.” Here is one:

It has become a cliché to say that what we are now living through is a “second industrial revolution.” This phrase is supposed to impress us with the speed and profundity of the change around us. But in addition to being platitudinous, it is misleading. For what is occurring now is, in all likelihood, bigger, deeper, and more important than the industrial revolution. Indeed, a growing body of reputable opinion asserts that the present movement represents nothing less than the second great divide in human history, comparable in magnitude only with that first great break in historic continuity, the shift from barbarism to civilization.

And another:

One of the things that sets the second machine age apart is how quickly that second half of the chessboard can arrive. We’re not claiming that no other technology has ever improved exponentially… But the exponents were relatively small, so it only went through about three or four doublings in efficiency during that period. It would take a millennium to reach the second half of the chessboard at that rate. In the second machine age, the doublings happen much faster and exponential growth is much more salient.

For some perspective on today’s commentary, consider that the first of those two passages was written by futurist Alvin Toffler in 1970. The second was penned by MIT professors Erik Brynjolfsson and Andrew McAfee in 2014. Yet they sound similar, don’t they?

People have long believed that their eras were the ones where the pace of change suddenly became unprecedented. When Henry Adams viewed the huge dynamo for producing electricity on display at the 1900 Great Exposition in Paris, he was so awestruck that he described the sensation of having his “historical neck broken by the sudden irruption of forces totally new.” This was all part and parcel of the “New Century Fever” brought on by the turn-of-the-century technology revolution.

In our time, there has been a new surge in claims that change is accelerating so fast that we’re surely about to go off the rails. In 2001, futurist Ray Kurzweil wrote that every decade our overall rate of progress doubles. “We won’t experience 100 years of progress in the 21st century,” he posited—at today’s rate, “it will be more like 20,000 years of progress.” So, according to this framing, the pace of change in the coming decade will be four times faster than in the first decade of the 2000s, the pace in the 2030s will be eight times faster, and so on. World Economic Forum founder Klaus Schwab coined the catchy phrase “Fourth Industrial Revolution,” claiming it will affect “the very essence of our human experience.” The McKinsey Global Institute estimates that, compared to the Industrial Revolution of the late 18th and early 19th centuries, “change is happening ten times faster and at 300 times the scale, or roughly 3,000 times the impact.”
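Kurzweil’s arithmetic is easy to reproduce. Here is a minimal sketch of the doubling-every-decade claim (the baseline decade and the decade-step compounding are simplifying assumptions for illustration, not Kurzweil’s own calculation):

```python
# Kurzweil's claim: the overall rate of progress doubles every decade.
# Taking the 2000s as a baseline rate of 1x, each later decade's rate
# is 2 raised to the number of decades elapsed.

def rate_multiplier(decade_start: int, baseline: int = 2000) -> int:
    """Rate of progress relative to the baseline decade, under the
    doubling-every-decade assumption."""
    decades_elapsed = (decade_start - baseline) // 10
    return 2 ** decades_elapsed

for decade in (2000, 2010, 2020, 2030):
    print(decade, rate_multiplier(decade))

# Summing (10 years x that decade's rate) over the ten decades of the
# 21st century gives total "year-2000-equivalent" years of progress:
total = sum(10 * rate_multiplier(2000 + 10 * k) for k in range(10))
print(total)
```

This coarse decade-step version lands at 10,230 equivalent years—the same order of magnitude as Kurzweil’s 20,000; the exact figure depends on how finely the doubling is compounded.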

Why have people long believed that their eras were unprecedented when it came to the rate of change? There are two reasons. First, at least today, it is hard to get attention if you say that “there’s nothing new here, at least in terms of the pace of change.” But if you throw out terms like “second machine age,” “exponential change,” and “Fourth Industrial Revolution,” then you are sure to get attention. (Google turns up 2.7 million hits for “fourth industrial revolution.”)

Second, it’s simply human nature. Most of us overestimate change in a few salient parts of our lives and ignore the rest, which changes very slowly, if at all.

The aforementioned smartphone is responsible for much of this dynamic today. Indeed, when pundits tout “exponential change,” they often hold up their phones as evidence. To be sure, the smartphone was a big, transformative innovation. But at one level, it was simply an advance in computing platforms that have been evolving steadily over the last 40 or 50 years. And while they are wonderful devices, they can’t drive my car, cook my meals, or take care of an elderly parent. Moreover, the rate of change in smartphones is slowing, with new versions now only incrementally better than prior ones.

Another factor is Moore’s Law (which says the power of computing will double every 18 to 24 months) and its related cousins, Koomey’s Law (the amount of energy needed for a fixed computing load will fall by a factor of two every 18 months) and Kryder’s Law (the amount of data storable in a given space will double every two years). There is no doubt that these kinds of exponential improvements are remarkable (and very hard to achieve), but a doubling in hardware capabilities doesn’t mean a doubling of innovation. Case in point: A computer that is twice as fast is certainly better than the one it replaced, but it is not twice as valuable; it is only incrementally more valuable, because it is still a computer that does what it did before, only a bit faster for some applications.
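To see just how strong these compounding claims are, the three laws can be sketched as exponentials with different doubling periods (picking the slower, 24-month end of Moore’s 18-to-24-month range is my own simplifying assumption):

```python
# Each "law" is an exponential with its own doubling period in months:
# improvement over N years = 2 ** (12 * N / doubling_months)

DOUBLING_MONTHS = {
    "Moore (computing power)": 24,    # quoted as 18-24 months; 24 used here
    "Koomey (energy per computation)": 18,
    "Kryder (storage density)": 24,
}

def improvement(doubling_months: float, years: float) -> float:
    """Multiplicative improvement after `years` of steady doubling."""
    return 2 ** (12 * years / doubling_months)

for name, months in DOUBLING_MONTHS.items():
    print(f"{name}: {improvement(months, 10):.0f}x over 10 years")
```

A 24-month doubling period compounds to a 32x gain over a decade, and Koomey’s 18-month period to roughly 100x—which is exactly the article’s point: the hardware curves really are exponential, yet the value delivered to users grows far more slowly.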

Moreover, there are reasons to believe that the pace of these exponential improvements is slowing down. Jensen Huang, the CEO of Nvidia, a leading graphics-chip maker, recently stated that “Moore’s Law isn’t possible anymore.” It’s getting much harder, at the very least. As researchers Bloom, Jones, Van Reenen and Webb found, “the number of researchers required to double chip density today is more than 18 times larger than the number required in the early 1970s.”

If technological change actually were speeding up, one would expect to see an increase in patents and productivity, but we don’t. From 2006 to 2015 (the most recent data available), U.S. utility patents increased just 3.8 percent per year. Likewise, U.S. and EU productivity growth rates are extremely low (see figure 1, per Bloom, Jones, Van Reenen and Webb).

Figure 1: EU-15 and U.S. average annual labor productivity growth, 1980–2017

One would also see much faster progress in a host of technology areas. To be sure, some recent technologies like the Internet, social media, and cell phones have shown rapid adoption rates, but so did some past technologies. The radio went from being adopted by 10 percent of U.S. households in 1925 to 68 percent a decade later. Televisions were in half of American homes eight years after they were commercialized, according to historical statistics from the U.S. Department of Commerce.

Many technologies today are nowhere near penetrating half of American homes a decade after they were first commercialized. As David Moschella writes in Seeing Digital, home robots were introduced in 2002, wearable Fitbits in 2007, consumer 3D printers in 2010, and VR-3D goggles in 2015, and none are anywhere near 50 percent penetration.

A cursory look at the technologies on past Gartner “hype cycles” reinforces this point. In 2009, Gartner predicted that mesh networks, home health monitoring, RFID technology products, 3D printing, 3D flat-panel displays, and mobile robots all were poised for takeoff. A decade later, none had done so. In 2000, Gartner’s list included biometrics, quantum computing, 3D Web, micropayments, and grid computing. Two decades later, none of those technologies are in widespread use, either.

Enthusiasts today nonetheless point to technologies like the telephone, which took longer to adopt than, say, the Internet or the cell phone. But one big problem with historical comparisons of technology adoption rates is that when most of the population is low-income, as U.S. residents were when the telephone was commercialized, it doesn’t matter how cool a technology is; it will take a long time to get adopted. In contrast, most people in today’s economy, even poor households, have some discretionary income that lets them buy many of the latest innovations, perhaps not as first adopters when prices are high, but relatively soon after.

As such, rather than look at adoption rates, it is better to look at development rates. And here today’s era certainly does not stand out as any more remarkable than past eras. Consider that between the late 1890s and the early 1920s, America was transformed with subways, electric lighting, skyscrapers and elevators, airplanes, the assembly line, automobiles, the sewing machine, and countless other inventions. Just look at the difference between a typical urban street in 1900 and in 1920 (figure 2). Motor vehicle production went from 4,192 units in 1900 to 3.6 million in 1923. There were just 8,000 motor vehicles registered in the United States in 1900, according to the Census, but 22 million in 1926. Electric utilities produced just 2 billion kilowatt hours in 1902 but 69 billion by 1926. There were just 2.3 million electrical household appliances produced in 1900, the Census found, but 84 million in 1919. Talk about a rapid pace of change.

Figure 2: U.S. street scenes in 1900 (left) and 1920 (right)

None of this is to say that technology-driven change isn’t happening. Of course it is—and it’s making our lives much better. But the pace of change appears to be no faster than in prior eras, and just as economies did fine despite Luddite impulses then, ours will do fine now. So, let’s all take a deep breath and say together: “Technological change is not accelerating, but it would sure be nice if it would.”