
The invention of the information revolution

The idea that the US economy runs on information is so self-evident and commonly accepted today that it barely merits comment. There was an information revolution. America “stopped making stuff.” Computers changed everything. Everyone knows these things, because of an incessant stream of reinforcement from liberal intellectuals, corporate advertisers, and policymakers who take for granted that the US economy shifted toward a “knowledge-based” economy in the late twentieth century.

Indeed, the idea of the information revolution has gone beyond cliché to become something like conventional wisdom, or even a natural feature of the universe like the freezing point of water or the incalculable value of π. Yet this notion has a history of its own, rooted in a concerted effort by high-tech companies in the 1960s to inoculate the American public against fears that computing and automation would lead to widespread job loss. The information revolution did not spring, like Athena, fully formed from the head of Zeus, nor did it emerge organically from the mysterious workings of the economy. It was invented.

Let’s go back to the 1950s. A young journalist named Daniel Bell was working for Fortune magazine, at a time when magazines for rich people actually employed thoughtful, perceptive social critics. He was one of several observers who began to notice that a fundamental shift was sweeping the United States: its industrial base, the “arsenal of democracy” that had won World War II against fascism, was shrinking relative to other sectors of the economy, at least as a share of employment. Around the same time, economist Colin Clark hypothesized that the economy comprised distinct sectors: primary (agricultural and extractive industries), secondary (manufacturing), and tertiary (everything else). Other economists came along and refined the concept to distinguish between different types of “services.” But it was clear that greater productivity in industry—gained by automating the process of production and, eventually, by offshoring work to take advantage of cheaper labor in the developing world—meant that a growing proportion of Americans worked in fields such as retail, healthcare, and education rather than in the manufacturing sector that had so defined the US economy in the early to mid-twentieth century.

Image: Student with vintage IBM computer. “Technology” by Brigham Young University – Idaho, David O. McKay Library, Digital Asset Management. Public domain via Digital Public Library of America.

It was not just intellectuals such as Bell or Clark who realized this change was underway. The young radicals of the New Left recognized that the economy was undergoing a fundamental transformation, as the seminal Port Huron Statement of 1962 pointed out. The generation coming up in the universities realized that they would be taking charge of American life in offices and laboratories, since the number of manufacturing workers had held roughly steady since the late 1950s while the rest of the economy grew around them. They worried about a “remote control economy,” in which workers and the unions that represented them would be relatively diminished, and an incipient class of knowledge workers—a term that would not really gain currency until the 1970s—would hold a new sort of hegemonic power. In many ways, the New Left was the political expression of this class in its infancy.

But a new, postindustrial economy could have taken any number of forms. In fact, many commentators in the 1960s assumed that increasing automation would result in greater leisure and shorter working hours, since we could make more stuff with less work. It remained for high-tech firms such as IBM and RCA to set the terms for how we understood the changing economy, and they did so with gusto. With the help of canny admen such as Marion Harper—who intoned in 1961, “to manage the future is to manage information”—they rolled out a public relations campaign that celebrated the “information revolution.” Computers would not kill jobs—they would create jobs. Computers would not result in a cold, impersonal, invasive new world that empowered governments and corporations at the expense of the little guy—they would make life more efficient and convenient.

Fears of technological unemployment were very real during the 1950s and 1960s. Few recall that unemployment rates in parts of the United States reached as high as ten percent in the late 1950s, and Congress held hearings on the displacement of jobs by new technology. At the same time, government—from the Post Office to the Pentagon—was by a wide margin the biggest buyer of computer technology. IBM needed government, and the US Census needed IBM to crunch its numbers. Hence, tech companies yearned to manage public opinion, and they enlisted the help of intellectuals such as media theorist Marshall McLuhan and anthropologist Margaret Mead to assuage anxieties about the implications of new technology. Mead contributed to a huge advertising supplement that the computer industry took out in the New York Times in 1965, titled “The Information Revolution,” joining notables such as Secretary of Labor Willard Wirtz in an effort to explain that automation was not the enemy. IBM’s competitor RCA mounted a major public exhibition on the same theme in Manhattan’s Rockefeller Center in 1967.

In other words, the very concept of an “information revolution” was introduced and explicitly promoted by powerful tech firms at a time when Americans were worried about their jobs and new technology. Like Don Draper’s famous toasted tobacco, it came from Madison Avenue. By dint of shrewd marketing, these firms lent shape and direction to an inchoate and confusing shift in the political economy of the United States. We might have interpreted things differently, however. The future, as seen from the 1960s, might not have depended on the prerogatives of Silicon Valley, Hollywood, and other progenitors of intellectual property, but we have by and large been convinced that the fate of the nation lies in the hands of scientists, screenwriters, engineers, and all the other people who make the coin of the realm: information. It is worth remembering, though, that America once had to be persuaded that information was the inevitable future toward which we are all inexorably heading. As left activists are fond of saying, another world is possible—and once was.

Featured image credit: ‘The Hypertext Editing System (HES) console in use at Brown University, circa October 1969 by Greg Lloyd’ by McZusatz. CC-BY-2.0 via Wikimedia Commons.
