
What Computer Innovation and Fracking Have in Common [Excerpt]

In this excerpt from his new book, The Innovators, Walter Isaacson explores the origin of new technologies and the nature of new ideas

Excerpted with permission from The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution, by Walter Isaacson. Published by Simon & Schuster, Inc. Printed by permission. Copyright © 2014, by Walter Isaacson.

Sometimes innovation is a matter of timing. A big idea comes along at just the moment when the technology exists to implement it. For example, the idea of sending a man to the moon was proposed right when the progress of microchips made it possible to put computer guidance systems into the nose cone of a rocket. There are other cases, however, when the timing is out of kilter. Charles Babbage published his paper about a sophisticated computer in 1837, but it took a hundred years to achieve the scores of technological advances needed to build one.

Some of those advances seem almost trivial, but progress comes not only in great leaps but also from hundreds of small steps. Take, for example, punch cards like those Babbage saw on [weaving] looms and proposed incorporating into his Analytical Engine. Perfecting the use of punch cards for computers came about because Herman Hollerith, an employee of the U.S. Census Bureau, was appalled that it took close to eight years to manually tabulate the 1880 census. He resolved to automate the 1890 count.


Drawing on the way that railway conductors punched holes in various places on a ticket in order to indicate the traits of each passenger (gender, approximate height, age, hair color), Hollerith devised punch cards with twelve rows and twenty-four columns that recorded the salient facts about each person in the census. The cards were then slipped between a grid of mercury cups and a set of spring-loaded pins, which created an electric circuit wherever there was a hole. The machine could tabulate not only the raw totals but also combinations of traits, such as the number of married males or foreign-born females. Using Hollerith’s tabulators, the 1890 census was completed in one year rather than eight. It was the first major use of electrical circuits to process information, and the company that Hollerith founded became in 1924, after a series of mergers and acquisitions, the International Business Machines Corporation, or IBM.

One way to look at innovation is as the accumulation of hundreds of small advances, such as counters and punch-card readers. At places like IBM, which specialize in daily improvements made by teams of engineers, this is the preferred way to understand how innovation really happens. Some of the most important technologies of our era, such as the fracking techniques developed over the past six decades for extracting natural gas, came about because of countless small innovations as well as a few breakthrough leaps.

In the case of computers, there were many such incremental advances made by faceless engineers at places like IBM. But that was not enough. Although the machines that IBM produced in the early twentieth century could compile data, they were not what we would call computers. They weren’t even particularly adroit calculators. They were lame. In addition to those hundreds of minor advances, the birth of the computer age required some larger imaginative leaps from creative visionaries.

The machines devised by Hollerith and Babbage were digital, meaning they calculated using digits: discrete and distinct integers such as 0, 1, 2, 3. In their machines, the integers were added and subtracted using cogs and wheels that clicked one digit at a time, like counters. Another approach to computing was to build devices that could mimic or model a physical phenomenon and then make measurements on the analogous model to calculate the relevant results. These were known as analog computers because they worked by analogy. Analog computers do not rely on discrete integers to make their calculations; instead, they use continuous functions. In analog computers, a variable quantity such as electrical voltage, the position of a rope on a pulley, hydraulic pressure, or a measurement of distance is employed as an analog for the corresponding quantities of the problem to be solved. A slide rule is analog; an abacus is digital. Clocks with sweeping hands are analog, and those with displayed numerals are digital.

Around the time that Hollerith was building his digital tabulator, Lord Kelvin and his brother James Thomson, two of Britain’s most distinguished scientists, were creating an analog machine. It was designed to handle the tedious task of solving differential equations, which would help in the creation of tide charts and of tables showing the firing angles that would generate different trajectories of artillery shells. Beginning in the 1870s, the brothers devised a system that was based on a planimeter, an instrument that can measure the area of a two-dimensional shape, such as the space under a curved line on a piece of paper. The user would trace the outline of the curve with the device, which would calculate the area by using a small sphere that was slowly pushed across the surface of a large rotating disk. By calculating the area under the curve, it could thus solve equations by integration—in other words, it could perform a basic task of calculus. Kelvin and his brother were able to use this method to create a “harmonic synthesizer” that could churn out an annual tide chart in four hours. But they were never able to conquer the mechanical difficulties of linking together many of these devices in order to solve equations with a lot of variables.

Innovation occurs when ripe seeds fall on fertile ground. Instead of having a single cause, the great advances of 1937 came from a combination of capabilities, ideas, and needs that coincided in multiple places. As often happens in the annals of invention, especially information technology invention, the time was right and the atmosphere was charged. The development of vacuum tubes for the radio industry paved the way for the creation of electronic digital circuits. That was accompanied by theoretical advances in logic that made circuits more useful. And the march was quickened by the drums of war. As nations began arming for the looming conflict, it became clear that computational power was as important as firepower. Advances fed on one another, occurring almost simultaneously and spontaneously, at Harvard and MIT and Princeton and Bell Labs and an apartment in Berlin and even, most improbably but interestingly, in a basement in Ames, Iowa.

One of these leaps led to the formal concept of a “universal computer,” a general-purpose machine that could be programmed to perform any logical task and simulate the behavior of any other logical machine. It was conjured up as a thought experiment by a brilliant English mathematician with a life story that was both inspiring and tragic: Alan Turing.

Walter Isaacson is CEO of the Aspen Institute. He was chairman of CNN and managing editor of Time magazine. Isaacson is author of numerous books, including Steve Jobs (Simon & Schuster).
