Turing's Cathedral



  Kathleen (née Britten) Booth (1922–): Computational physicist and member of J. D. Bernal’s Biomolecular Structure Group; visitor at the IAS Electronic Computer Project (ECP) in 1947; author of Programming for an Automatic Digital Calculator (1958).

  Arthur W. Burks (1915–2008): American ENIAC (Electronic Numerical Integrator and Computer) project engineer, philosopher, logician, and “scribe” for the IAS preliminary design team in 1946.

  Vannevar Bush (1890–1974): Analog computer pioneer, director of the U.S. Office of Scientific Research and Development during World War II, and an early organizer of what became the Manhattan Project.

  Jule Gregory Charney (1917–1981): American meteorologist and leader of the IAS Meteorology Project between 1948 and 1956.

  Richard F. Clippinger (1913–1997): American mathematician and computer scientist; supervised the retrofit of the ENIAC to stored-program mode in 1947.

  Hewitt Crane (1927–2008): American electrical engineer and IAS ECP member, 1951–1954; subsequently lead scientist at Stanford Research Institute.

  Freeman J. Dyson (1923–): British American mathematical physicist; arrived at the IAS as a Commonwealth Fellow in September 1948.

  Carl Henry Eckart (1902–1973): American physicist, first director of Scripps Institution of Oceanography, and fourth husband of Klára (Klári) von Neumann.

  John Presper Eckert (1919–1995): American electronic engineer, ENIAC developer, and cofounder, with John Mauchly, of the Electronic Control Company, manufacturers of BINAC and UNIVAC.

  Akrevoe (née Kondopria) Emmanouilides (1929–): Administrative secretary to Herman Goldstine on the ENIAC project at the Moore School and on the IAS ECP during 1946–1949.

  Gerald Estrin (1921–): IAS ECP member 1950–1956, with a leave of absence to direct the construction of the WEIZAC, a first-generation sibling of the MANIAC, at the Weizmann Institute in Rehovot, Israel, 1953–1955.

  Thelma Estrin (1924–): Electronic engineer, member of the IAS ECP, 1950–1956, and wife of Gerald Estrin.

  Foster (1915–1999) and Cerda (1916–1988) Evans: Los Alamos physicists and husband-and-wife thermonuclear programming team; at the IAS in 1953 and 1954.

  Richard P. Feynman (1918–1988): American physicist and member of the wartime Los Alamos computing group.

  Abraham Flexner (1866–1959): American schoolteacher, educational reformer, and founding director of the IAS, 1930–1939.

  Simon Flexner (1863–1946): American pathologist, director of the Rockefeller Institute for Medical Research, and older brother of Abraham Flexner.

  Stanley P. Frankel (1919–1978): American physicist, student of Robert Oppenheimer, and Los Alamos colleague of Richard Feynman; member of the original ENIAC and IAS thermonuclear calculation team; minicomputer design pioneer.

  Kurt Gödel (1906–1978): Moravian-born Austrian logician; arrived at the IAS in 1933.

  Herman Heine Goldstine (1913–2004): American mathematician, U.S. Army officer, ENIAC administrator, and associate director of the IAS ECP during 1946–1956.

  Irving John (Jack) Good (born Isadore Jacob Gudak; 1916–2009): British American Bayesian statistician, artificial intelligence pioneer, cryptologist, and assistant to Alan Turing during the British code-breaking effort in World War II.

  Leslie Richard Groves (1896–1970): U.S. Army general, commander of the Manhattan Project (including Los Alamos) during World War II, and, later, research director at Remington Rand.

  Verena Huber-Dyson (1923–): Swiss American logician and group theorist; arrived at the IAS as a postdoctoral fellow in 1948.

  James Brown Horner (Desmond) Kuper (1909–1992): American physicist and second husband of Mariette (Kovesi) von Neumann.

  Herbert H. Maass (1878–1957): Attorney and founding trustee of the IAS.

  Benoît Mandelbrot (1924–2010): Polish-born French American mathematician; invited by von Neumann to the IAS to study word frequency distributions in 1953.

  John W. Mauchly (1907–1980): American physicist, electrical engineer, and cofounder of the ENIAC project.

  Harris Mayer (1921–): American Manhattan Project physicist and collaborator with Edward Teller and John von Neumann.

  Richard W. Melville (1914–1994): Lead mechanical engineer for the IAS ECP, 1948–1953.

  Nicholas Constantine Metropolis (1915–1999): Greek American mathematician and computer scientist, early proponent of the Monte Carlo method, and leader of the Los Alamos computing group.

  Bernetta Miller (1884–1972): Pioneer aviatrix; administrative assistant at the IAS, 1941–1948.

  Oskar Morgenstern (1902–1977): Austrian American economist, coauthor of Theory of Games and Economic Behavior (1944).

  Harold Calvin (Marston) Morse (1892–1977): American mathematician; sixth professor to be hired at the IAS.

  Maxwell Herman Alexander Newman (1897–1984): British topologist, computer pioneer, and mentor to Alan Turing.

  J. Robert Oppenheimer (1904–1967): Physicist; director of the Los Alamos Laboratory during World War II and director of the IAS, 1947–1966.

  William Penn (1644–1718): Quaker agitator and son of Admiral Sir William Penn (1621–1670); founder of Pennsylvania and early proprietor of the land on which the IAS was later built.

  James Pomerene (1920–2008): American electronic engineer; member of the IAS ECP, 1946–1955; replaced Julian Bigelow in 1951 as chief engineer.

  Irving Nathaniel Rabinowitz (1929–2005): Astrophysicist and computer scientist; member of the IAS ECP, 1954–1957.

  Jan Rajchman (1911–1989): Polish American electronic engineer; inventor of resistor-matrix storage and RCA’s Selectron memory tube.

  Lewis Fry Richardson (1881–1953): British pacifist, mathematician, electrical engineer, and early proponent of numerical weather prediction.

  Robert Richtmyer (1910–2003): American mathematical physicist and nuclear weapons design pioneer.

  Jack Rosenberg (1921–): American electronic engineer and IAS ECP member 1947–1951.

  Morris Rubinoff (1917–2003): Canadian American physicist and electronic engineer; IAS ECP member, 1948–1949.

  Martin Schwarzschild (1912–1997): German American astrophysicist and developer of early stellar evolution codes.

  Atle Selberg (1917–2007): Norwegian American number theorist; arrived at the IAS in 1947.

  Hedvig (Hedi; née Liebermann) Selberg (1919–1995): Transylvanian-born mathematics and physics teacher; wife of Atle Selberg, collaborator with Martin Schwarzschild, and lead coder for the IAS ECP.

  Claude Elwood Shannon (1916–2001): American mathematician, electrical engineer, and pioneering information theorist; visiting member of the IAS (1940–1941).

  Ralph Slutz (1917–2005): American physicist and member of the IAS ECP, 1946–1948; supervised construction of the SEAC (Standards Eastern Automatic Computer), the first of the IAS-class designs to become operationally complete.

  Joseph Smagorinsky (1924–2005): American meteorologist; at the IAS, 1950–1953.

  Lewis L. Strauss (1896–1974): American naval officer, businessman, IAS trustee, and head of the U.S. Atomic Energy Commission.

  Leó Szilárd (1898–1964): Hungarian American physicist, reluctant nuclear weapon pioneer, and author of The Voice of the Dolphins.

  Edward Teller (1908–2003): Hungarian American physicist and leading advocate of the hydrogen (or “super”) bomb.

  Philip Duncan Thompson (1922–1994): U.S. Air Force meteorological liaison officer, assigned to the IAS ECP, 1948–1949.

  Bryant Tuckerman (1915–2002): American topologist and computer scientist; member of the IAS ECP, 1952–1957.

  John W. Tukey (1915–2000): American statistician at Princeton University and Bell Labs; coined the term “bit.”

  Alan Mathison Turing (1912–1954): British logician and cryptologist; author of “On Computable Numbers” (1936).

  Françoise (née Aron) Ulam (1918–2011): French American editor and journalist; wife of Stanislaw Ulam.

  Stanislaw Marcin Ulam (1909–1984): Polish American mathematician and protégé of John von Neumann.


  Oswald Veblen (1880–1960): American mathematician, nephew of Thorstein Veblen, and first professor appointed to the IAS in 1932.

  Theodore von Kármán (1881–1963): Hungarian American aerodynamicist, founder of Jet Propulsion Laboratory (JPL).

  John von Neumann (born Neumann János; 1903–1957): Hungarian American mathematician; fourth professor appointed to the IAS, in 1933; founder of the IAS ECP.

  Klára (née Dán) von Neumann (1911–1963): Second wife of John von Neumann; married in 1938.

  Margit (née Kann) von Neumann (1880–1956): Mother of John von Neumann.

  Mariette (née Kovesi) von Neumann (1909–1992): First wife of John von Neumann; married in 1929.

  Max von Neumann (born Neumann Miksa; 1873–1928): Investment banker, lawyer, and father of John von Neumann.

  Michael von Neumann (born Neumann Mihály; 1907–1989): Physicist, and younger brother of John von Neumann.

  Nicholas Vonneumann (born Neumann Miklos; 1911–2011): Patent attorney, and youngest brother of John von Neumann.

  Willis H. Ware (1920–): American electrical engineer and member of the IAS ECP, 1946–1951, subsequently at RAND.

  Warren Weaver (1894–1978): American mathematician, self-described “Chief Philanthropoid” at the Rockefeller Foundation, and director of the Applied Mathematics Panel of the U.S. Office of Scientific Research and Development during World War II.

  Marina (née von Neumann) Whitman (1935–): Economist, U.S. presidential adviser, and daughter of John von Neumann and Mariette Kovesi von Neumann.

  Norbert Wiener (1894–1964): American mathematician and founder, with Julian Bigelow and John von Neumann, of what would become known as the Cybernetics Group.

  Eugene P. Wigner (born Wigner Jeno; 1902–1995): Hungarian American mathematical physicist.

  Frederic C. Williams (1911–1977): British electronic engineer; World War II radar pioneer and developer, at the University of Manchester, of the “Williams” cathode-ray storage tube, and of the Manchester “Mark 1,” the first operational stored-program computer to utilize it.

  Vladimir Kosma Zworykin (1889–1982): Russian-born American television pioneer and director of the Princeton laboratories of RCA.

  ONE

  1953

  If it’s that easy to create living organisms, why don’t you create a few yourself?

  —Nils Aall Barricelli, 1953

  AT 10:38 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck. “A series of numerical experiments are being made with the aim of verifying the possibility of an evolution similar to that of living organisms taking place in an artificially created universe,” he announced.1

  A digital universe—whether 5 kilobytes or the entire Internet—consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information—structure and sequence—according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.

  The term bit (the contraction, by 40 bits, of “binary digit”) was coined by statistician John W. Tukey shortly after he joined von Neumann’s project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms.2 To a digital computer, the only difference that makes a difference is the difference between a zero and a one.

  That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. “The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects … capable of a twofold difference onely,” he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.3
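
  Bacon’s arithmetic is easy to check in a few lines of modern code. The sketch below is only an illustration of the principle (Bacon’s own table used a 24-letter alphabet, with his two symbols written as a and b), but it confirms that five places, each “capable of a twofold difference onely,” yield 32 distinguishable codes, with room to spare for the letters of the alphabet:

    # A modern check of Bacon's claim: five two-valued places give 2**5 = 32
    # distinct "Differences" -- more than enough for an alphabet. Illustrative
    # only; Bacon's 1623 table paired a 24-letter alphabet with symbols a/b.
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def encode(letter):
        """Return the five-place, two-symbol ('a'/'b') code for one letter."""
        index = ALPHABET.index(letter.upper())
        return format(index, "05b").replace("0", "a").replace("1", "b")

    def decode(code):
        """Recover the letter from its five-place code."""
        return ALPHABET[int(code.replace("a", "0").replace("b", "1"), 2)]

    assert 2 ** 5 == 32                                    # 32 possible codes
    assert all(decode(encode(c)) == c for c in ALPHABET)   # 26 letters fit easily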

  That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. “By Ratiocination, I mean computation,” Hobbes had announced. “Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing … that all Ratiocination is comprehended in these two operations of the minde.”4 The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
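
  Hobbes’s reduction is easy to act out on such an adding machine. The toy sketch below, a modern illustration only, builds multiplication out of repeated addition and division out of repeated subtraction, the two “operations of the minde” he allowed:

    # A toy illustration of Hobbes's claim: multiplication and division can be
    # assembled from repeated addition and subtraction alone.
    def multiply(a, b):
        total = 0
        for _ in range(b):
            total = total + a          # multiplication as repeated addition
        return total

    def divide(a, b):
        quotient = 0
        while a >= b:
            a = a - b                  # division as repeated subtraction
            quotient = quotient + 1
        return quotient, a             # quotient and remainder

    assert multiply(6, 7) == 42
    assert divide(45, 6) == (7, 3)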

  In March of 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth.5 Five kilobytes were at the end of Olden Lane, 32 kilobytes were divided among the eight completed clones of the Institute for Advanced Study’s computer, and 16 kilobytes were unevenly distributed across a half dozen other machines. Data, and the few rudimentary programs that existed, were exchanged at the speed of punched cards and paper tape. Each island in the new archipelago constituted a universe unto itself.

  In 1936, logician Alan Turing had formalized the powers (and limitations) of digital computers by giving a precise description of a class of devices (including an obedient human being) that could read, write, remember, and erase marks on an unbounded supply of tape. These “Turing machines” were able to translate, in both directions, between bits embodied as structure (in space) and bits encoded as sequences (in time). Turing then demonstrated the existence of a Universal Computing Machine that, given sufficient time, sufficient tape, and a precise description, could emulate the behavior of any other computing machine. The results are independent of whether the instructions are executed by tennis balls or electrons, and whether the memory is stored in semiconductors or on paper tape. “Being digital should be of more interest than being electronic,” Turing pointed out.6
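
  The essentials of such a machine fit in a few lines. The sketch below is a loose modern illustration, not Turing’s 1936 formalism: a finite table of rules drives a head that reads, writes, and erases marks while moving along an unbounded tape, and the example table simply flips 0s and 1s until it reaches a blank square and halts:

    # A minimal Turing machine sketch: a read/write head on an unbounded tape,
    # driven by a finite rule table of the form
    #   (state, symbol read) -> (symbol to write, move L/R, next state).
    from collections import defaultdict

    def run(rules, tape, state="start"):
        cells = defaultdict(lambda: " ", enumerate(tape))  # blank-filled, unbounded tape
        head = 0
        while state != "halt":
            write, move, state = rules[(state, cells[head])]
            cells[head] = write                            # write (or erase) a mark
            head += 1 if move == "R" else -1               # move one square
        return "".join(cells[i] for i in sorted(cells)).strip()

    flip = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", " "): (" ", "R", "halt"),
    }

    assert run(flip, "0110") == "1001"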

  Von Neumann set out to build a Universal Turing Machine that would operate at electronic speeds. At its core was a 32-by-32-by-40-bit matrix of high-speed random-access memory—the nucleus of all things digital ever since. “Random access” meant that all individual memory locations—collectively constituting the machine’s internal “state of mind”—were equally accessible at any time. “High speed” meant that the memory was accessible at the speed of light, not the speed of sound. It was the removal of this constraint that unleashed the powers of Turing’s otherwise impractical Universal Machine.
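
  The arithmetic of that core matches the figures already cited: 32 × 32 = 1,024 words of 40 bits each, or 40,960 bits in all, which comes to 5,120 eight-bit bytes: the 5-kilobyte universe at the end of Olden Lane.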

  Electronic components were widely available in 1945, but digital behavior was the exception to the rule. Images were televised by scanning them into lines, not breaking them into bits. Radar delivered an analog display of echoes returned by the continuous sweep of a microwave beam. Hi-fi systems filled postwar living rooms with the warmth of analog recordings pressed into vinyl, free of any losses introduced by digital approximation. Digital technologies—Teletype, Morse code, punched card accounting machines—were perceived as antiquated, low-fidelity, and slow. Analog ruled the world.

  The IAS group achieved a fully electronic random-access memory by adapting analog cathode-ray oscilloscope tubes—evacuated glass envelopes about the size and shape of a champagne bottle, but with walls as thin as a champagne flute’s. The wide end of each tube formed a circular screen with a fluorescent internal coating, and at the narrow end was a high-voltage gun emitting a stream of electrons whose aim could be deflected by a two-axis electromagnetic field. The cathode-ray tube (CRT) was a form of analog computer: varying the voltages to the deflection coils varied the path traced by the electron beam. The CRT, especially in its incarnation as an oscilloscope, could be used to add, subtract, multiply, and divide signals—the results being displayed directly as a function of the amplitude of the deflection and its frequency in time. From these analog beginnings, the digital universe took form.

  Applying what they had learned in the radar, cryptographic, and antiaircraft fire-control business during the war, von Neumann’s engineers took pulse-coded control of the deflection circuits and partitioned the face of the tube into a 32-by-32 array of numerically addressable locations that could be individually targeted by the electron beam. Because the resulting electric charge lingered on the coated glass surface for a fraction of a second and could be periodically refreshed, each 5-inch-diameter tube could store 1,024 bits of information, with the state of any specified location accessible at any time. The transition from analog to digital had begun.

  The IAS computer incorporated forty cathode-ray memory tubes, with memory addresses assigned as if a desk clerk were handing out similar room numbers to forty guests at a time in a forty-floor hotel. Codes proliferated within this universe by taking advantage of the architectural principle that a pair of 5-bit coordinates (2^5 = 32) uniquely identified one of 1,024 memory locations containing a string (or “word”) of 40 bits. In 24 microseconds, any specified 40-bit string of code could be retrieved. These 40 bits could include not only data (numbers that mean things) but also executable instructions (numbers that do things)—including instructions to modify the existing instructions, or transfer control to another location and follow new instructions from there.
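
  In modern terms the arrangement can be sketched as follows; the names and data structures are illustrative, not a description of the actual circuitry. Bit k of each word occupies the same numbered spot on tube k (the forty guests with matching room numbers), and a pair of 5-bit coordinates selects one of the 1,024 word addresses:

    # An illustrative sketch of the memory layout described above: 40 tubes,
    # each a 32 x 32 = 1,024-spot raster, with bit k of word n stored at
    # spot n of tube k.
    WORDS, BITS = 1024, 40
    tubes = [[0] * WORDS for _ in range(BITS)]        # one raster per tube

    def address(x, y):
        """Combine two 5-bit coordinates into one of 1,024 word addresses."""
        assert 0 <= x < 32 and 0 <= y < 32            # 2**5 = 32 values each
        return (x << 5) | y

    def write_word(addr, value):
        """Scatter the 40 bits of one word across the 40 tubes at one spot."""
        for k in range(BITS):
            tubes[k][addr] = (value >> k) & 1

    def read_word(addr):
        """Gather bit k from tube k to reassemble the 40-bit word."""
        return sum(tubes[k][addr] << k for k in range(BITS))

    write_word(address(3, 17), 0b1011)
    assert read_word(address(3, 17)) == 0b1011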

  Since a 10-bit order code, combined with 10 bits specifying a memory address, returned a string of 40 bits, the result was a chain reaction analogous to the two-for-one fission of neutrons within the core of an atomic bomb. All hell broke loose as a result. Random-access memory gave the world of machines access to the powers of numbers—and gave the world of numbers access to the powers of machines.
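
  The two-for-one arithmetic can be sketched in the same illustrative style: a 10-bit order code plus a 10-bit address make a 20-bit instruction, a 40-bit word has room for a pair of them, and every 10-bit address reaches any of the 2^10 = 1,024 forty-bit words in the store:

    # Illustrative unpacking of a 40-bit word into two 20-bit instructions,
    # each a 10-bit order code plus a 10-bit memory address.
    def unpack_word(word):
        halves = [(word >> 20) & 0xFFFFF, word & 0xFFFFF]   # two 20-bit halves
        return [((half >> 10) & 0x3FF, half & 0x3FF)        # (order code, address)
                for half in halves]

    word = (1 << 40) - 1                                    # all 40 bits set
    for order_code, addr in unpack_word(word):
        assert 0 <= order_code < 1024                       # 10-bit order code
        assert 0 <= addr < 1024                             # reaches all 1,024 words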