Turing's Cathedral




  Copyright © 2012 by George Dyson

  All rights reserved. Published in the United States by Pantheon Books, a division of Random House, Inc., New York, and in Canada by Random House of Canada Limited, Toronto.

  Pantheon Books and colophon are registered trademarks of Random House, Inc.

  Library of Congress Cataloging-in-Publication Data

  Dyson, George, [date]

  Turing’s cathedral : the origins of the digital universe/George Dyson.

  p. cm.

  Includes index.

  Summary: “In a revealing account of John von Neumann’s realization of Alan Turing’s Universal Machine, George Dyson vividly illuminates the nature of digital computers, the lives of those who brought them into existence, and how code took over the world.”

  eISBN: 978-0-307-90706-6

  1. Computers—History. 2. Turing machines. 3. Computable functions. 4. Random access memory. 5. Von Neumann, John, 1903–1957. 6. Turing, Alan Mathison, 1912–1954. I. Title.

  QA76.17.D97 2012 004’.09—dc23 2011030265

  www.pantheonbooks.com

  Cover design by Peter Mendelsund

  v3.1

  It was not made for those who sell oil or sardines…

  —G. W. Leibniz

  CONTENTS

  Cover

  Title Page

  Copyright

  Dedication

  Preface

  Acknowledgments

  Principal Characters

  1. 1953

  2. Olden Farm

  3. Veblen’s Circle

  4. Neumann János

  5. MANIAC

  6. Fuld 219

  7. 6J6

  8. V-40

  9. Cyclogenesis

  10. Monte Carlo

  11. Ulam’s Demons

  12. Barricelli’s Universe

  13. Turing’s Cathedral

  14. Engineer’s Dreams

  15. Theory of Self-Reproducing Automata

  16. Mach 9

  17. The Tale of the Big Computer

  18. The Thirty-ninth Step

  Key to Archival Sources

  Notes

  Index

  Illustrations

  About the Author

  Other Books by This Author

  PREFACE

  POINT SOURCE SOLUTION

  I am thinking about something much more important than bombs. I am thinking about computers.

  —John von Neumann, 1946

  There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

  In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
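A quick check of the arithmetic behind those figures. The layout detail below — 40 Williams tubes, each storing a 32 × 32 array of bits, with one bit of every word on each tube — is a well-documented fact about the IAS machine, though not spelled out in the preface itself:

```python
# Arithmetic behind the "32-by-32-by-40-bit nucleus" described above:
# 1,024 words of 40 bits each, one bit of every word per Williams tube.
words = 32 * 32                      # 1,024 addressable 40-bit words
bits_per_word = 40
total_bits = words * bits_per_word   # 40,960 bits
total_bytes = total_bits // 8        # 5,120 bytes

print(f"{total_bits} bits = {total_bytes} bytes = {total_bytes / 1024:.0f} KB")
# -> 40960 bits = 5120 bytes = 5 KB: the "five kilobytes of storage"
```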

  Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
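What "numbers that do things" means in practice can be sketched in a few lines. This is an illustrative miniature, not von Neumann's actual order code: every cell is a plain integer in one shared memory, and nothing but the machine's behavior decides whether a cell is treated as an order or as data.

```python
# A minimal stored-program sketch (not the IAS order code): one memory
# holds orders and data alike as plain integers. An order is encoded
# here as op * 1000 + address; nothing marks a cell as "code" or "data".
memory = [
    1004,     # LOAD  memory[4] into the accumulator
    2005,     # ADD   memory[5]
    3006,     # STORE the accumulator into memory[6]
    0,        # HALT
    2, 3, 0,  # the same kind of numbers, used here as data
]

acc, pc = 0, 0
while True:
    order = memory[pc]               # fetched as an order...
    pc += 1
    op, addr = divmod(order, 1000)
    if op == 0: break                # HALT
    if op == 1: acc = memory[addr]   # LOAD
    if op == 2: acc += memory[addr]  # ADD
    if op == 3: memory[addr] = acc   # STORE

print(memory[6])  # -> 5: the number 2005 acted; the number 3 was acted upon
```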

  Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”1

  As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.

  Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions … it may be justified to treat the original, central, high pressure area as a point.”2 This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
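The report itself is not reproduced here, but the scaling at the heart of a point-source treatment is well known (the Taylor–von Neumann–Sedov blast wave): the shock radius grows as R(t) ≈ (E t²/ρ)^(1/5). The yield and the dimensionless prefactor below are assumptions for illustration, not values from the 1947 report:

```python
# Hedged sketch of the point-source blast-wave scaling:
# R(t) ~ (E * t**2 / rho) ** (1/5), with the prefactor taken as 1.
E = 8.4e13   # assumed yield: roughly 20 kilotons of TNT, in joules
rho = 1.2    # sea-level air density, kg/m^3

def shock_radius(t):
    """Approximate blast-wave radius in meters, t seconds after detonation."""
    return (E * t**2 / rho) ** 0.2

for t in (0.001, 0.01, 0.1):
    print(f"t = {t:5.3f} s  ->  R ~ {shock_radius(t):6.0f} m")
```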

  Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.

  Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
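A small sketch of the contrast the paragraph draws — illustrative only, not a real Turing machine: a tape head must step past every cell before the one it wants, while an address matrix fetches any word in one step by naming its address.

```python
# Sequential tape access versus the random-access "address matrix".
tape = list(range(1024))   # stand-in for 1,024 stored words

def tape_read(n):
    """Reach cell n the way a tape head does: one step at a time."""
    head, steps = 0, 0
    while head < n:
        head += 1
        steps += 1
    return tape[head], steps

def ram_read(n):
    """Reach word n by addressing it directly: one fetch."""
    return tape[n], 1

print(tape_read(1000))   # -> (1000, 1000): a thousand head movements
print(ram_read(1000))    # -> (1000, 1): a single addressed access
```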

  Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.

  Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
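The argument behind that claim is Turing's diagonalization, which can be sketched in a few lines. The function `halts` below is hypothetical, and the sketch exists precisely to show that no real implementation of it is possible:

```python
# Sketch of Turing's diagonal argument.
def halts(program, arg):
    """Hypothetically returns True iff program(arg) eventually halts."""
    ...  # no body can make this work for all programs

def contrary(program):
    """Does the opposite of whatever halts() predicts about
    running a program on its own source."""
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    # predicted to loop forever, so halt immediately

# Consider contrary(contrary). If halts(contrary, contrary) is True,
# contrary loops forever; if False, it halts at once. Either way the
# prediction is wrong, so no general halts() can exist: there is no
# systematic way to tell, by inspection, what a code will do.
```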

  It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.

  ACKNOWLEDGMENTS

  IN THE BEGINNING WAS THE COMMAND LINE

Intuition of truth may not Relish soe much as Truth that is hunted downe.

  —Sir Robert Southwell to William Petty, 1687

  In 1956, at the age of three, I was walking home with my father, physicist Freeman Dyson, from his office at the Institute for Advanced Study in Princeton, New Jersey, when I found a broken fan belt lying in the road. I asked my father what it was. “It’s a piece of the sun,” he said.

  My father was a field theorist, and protégé of Hans Bethe, former wartime leader of the Theoretical Division at Los Alamos, who, when accepting his Nobel Prize for discovering the carbon cycle that fuels the stars, explained that “stars have a life cycle much like animals. They get born, they grow, they go through a definite internal development, and finally they die, to give back the material of which they are made so that new stars may live.”1 To an engineer, fan belts exist between the crankshaft and the water pump. To a physicist, fan belts exist, briefly, in the intervals between stars.

  At the Institute for Advanced Study, more people worked on quantum mechanics than on their own cars. There was one notable exception: Julian Bigelow, who arrived at the Institute, in 1946, as John von Neumann’s chief engineer. Bigelow, who was fluent in physics, mathematics, and electronics, was also a mechanic who could explain, even to a three-year-old, how a fan belt works, why it broke, and whether it came from a Ford or a Chevrolet.

A child of the Depression, Bigelow never threw anything away. The Institute for Advanced Study, occupying the site of the former Olden Farm, owned a large, empty barn, where surplus parts and equipment from the construction of von Neumann’s computer were stored amid bales of hay, spring-tooth harrows, and other remnants of the farm’s working life. I was one of a small band of eight- to ten-year-olds who spent our free time exploring the Institute Woods, and occasionally visited the barn. A few beams of sunlight, entering through the perforated roof, shone through dust raised by pigeons that fluttered away from us overhead.

  Julian’s cache of war surplus electronics had already been scavenged for needed parts. We had no idea what most of it was—but that did not stop us from dismantling anything that would come apart. We knew that Julian Bigelow had built a computer, housed in a building off-limits to children, just as we knew that Robert Oppenheimer, who lived in the manor house belonging to the barn, had built an atomic bomb. On our expeditions into the woods, we ignored birds and mammals, hunting for frogs and turtles that we could capture with our bare hands. It was still the age of reptiles to us. The dinosaurs of computing, in contrast, were warm-blooded, but the relays and vacuum tubes we extracted from their remains had already given up their vital warmth.

  I was left with an inextinguishable curiosity over the relics that had been abandoned in the barn. “Institutes like nations are perhaps happiest if they have no history,” declared Abraham Flexner, the founding director of the Institute for Advanced Study, in 1936. It was thanks to this policy of Dr. Flexner’s, maintained by his successors, including Oppenheimer, regarding the history of the Institute in general, and the history of the Electronic Computer Project in particular, that much of the documentary material behind this book lay secreted for so many years. “I am reasonably confident that there is nothing here that would interest him,” Carl Kaysen, Oppenheimer’s successor, noted in response to an inquiry, in 1968, about records concerning the von Neumann computer project from a professor of electrical engineering at MIT.2

  Thanks to former director Phillip Griffiths, with support from trustees Charles Simonyi and Marina von Neumann Whitman, I was invited to spend the 2002–2003 academic year as Director’s Visitor at the Institute for Advanced Study, and granted access to files that in some cases had not seen the light of day since 1946. Historical Studies–Social Science Librarian Marcia Tucker and Archivist Lisa Coats began working to preserve and organize the surviving records of the Electronic Computer Project, and Kimberly Jacobsen transcribed thousands of pages of documents that are only sparsely sampled here. Through the efforts of current director Peter Goddard, and a gift from Shelby White and the Leon Levy Foundation, a permanent Archives Center has now been established at the IAS. Archivists Christine Di Bella, Erica Mosner, and all the staff at the Institute, especially Linda Cooper, helped in every capacity, and the current trustees, especially Jeffrey Bezos, have lent continuing encouragement and support.

  Many of the surviving eyewitnesses—including Alice Bigelow, Julian Bigelow, Andrew and Kathleen Booth, Raoul Bott, Martin and Virginia Davis, Akrevoe Kondopria Emmanouilides, Gerald and Thelma Estrin, Benoît Mandelbrot, Harris Mayer, Jack Rosenberg, Atle Selberg, Joseph and Margaret Smagorinsky, Françoise Ulam, Nicholas Vonneumann, Willis Ware, and Marina von Neumann Whitman—took time to speak with me. “You’re within about five years of not having a testifiable witness,” Joseph Smagorinsky warned me in 2004.

  In 2003 the Bigelow family allowed me to go through the boxes of papers that Julian had saved. In one box, amid Office of Naval Research technical reports, World War II vacuum tube specification sheets, Bureau of Standards newsletters, and even a maintenance manual for the ENIAC, stamped RESTRICTED, was a sheet of lined paper that had evidently been crumpled up and thrown away, then uncrumpled and saved. It had been turned sideways, and had one line of handwriting across the top of the page, as follows:

  Orders: Let a word (40bd) be 2 orders, each order = C(A) = Command (1–10, 21–30) • Address (11–20, 31–40)

  The use of bd for binary digit dates this piece of paper from the beginning of the von Neumann project, before the abbreviation of binary digit to bit.
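Read literally, the note specifies a 40-bit word holding two 20-bit orders, each a 10-bit command followed by a 10-bit address. A sketch of that decoding follows, assuming that bit 1 in Bigelow's numbering is the most significant bit of the word (an assumption; the note does not say):

```python
# Decoding per the note: a 40-bit word holds two orders; each order is
# a 10-bit command followed by a 10-bit address (positions 1-10/11-20
# and 21-30/31-40 in Bigelow's numbering).
def decode_word(word40):
    """Split a 40-bit word into two (command, address) orders."""
    first = (word40 >> 20) & 0xFFFFF    # bits 1-20
    second = word40 & 0xFFFFF           # bits 21-40
    return [((half >> 10) & 0x3FF,      # command: 10 bits
             half & 0x3FF)              # address: 10 bits
            for half in (first, second)]

word = 0b0000000001_0000000010_0000000011_0000000100
print(decode_word(word))   # -> [(1, 2), (3, 4)]
```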

  “In the beginning,” according to Neal Stephenson, “was the command line.” Thanks to Neal, and many other supporters, especially those individuals and institutions who allowed me into their basements, I spent an inordinate amount of time, over the past eight years, immersed in the layers of documents that were deposited when the digital universe was taking form. From Alex Magoun at RCA to Willis Ware at RAND, and many other keepers of institutional memory in between—including the Annals of the History of Computing and the Charles Babbage Institute’s oral history collection—I am indebted to those who saved records that otherwise might not have been preserved. To a long list of historians and biographers—including William Aspray, Armand Borel, Alice Burks, Flo Conway, Jack Copeland, James Cortada, Martin Davis, Peter Galison, David Alan Grier, Rolf Herken, Andrew Hodges, Norman Macrae, Brian Randell, and Jim Siegelman—I owe more than is acknowledged here. All books owe their existence to previous books, but among the antecedents of this one should be singled out (in chronological order) Beatrice Stern’s History of the Institute for Advanced Study, 1930–1950 (1964), Herman Goldstine’s The Computer from Pascal to von Neumann (1972), Nicholas Metropolis’s History of Computing in the Twentieth Century (1980), Andrew Hodges’s Alan Turing: The Enigma (1983), Rolf Herken’s The Universal Turing Machine: A Half-Century Survey (1988), and William Aspray’s John von Neumann and the Origins of Modern Computing (1990).

Julian Bigelow and his colleagues designed and built the new computer in less time than it took me to write this book. I thank Martin Asher, John Brockman, Stefan McGrath, and Katinka Matson for their patience in allowing this. The Bigelow family, the Institute for Advanced Study, Françoise Ulam, and, especially, Marina von Neumann Whitman provided access to the documents that brought this story to life. Gabriella Bollobás translated a large body of correspondence, interpreting not only the nuances of the Hungarian language, but also the emotional and intellectual context of Budapest at that time. Béla Bollobás, Marion Brodhagen, Freeman Dyson, Joseph Felsenstein, Holly Given, David Alan Grier, Danny Hillis, Verena Huber-Dyson, Jennifer Jacquet, Harris Mayer, and Alvy Ray Smith offered comments on early drafts. Akrevoe Kondopria Emmanouilides, who typed and proofread the Institute for Advanced Study Electronic Computer Project’s progress reports as a teenager in 1946, found errors that might otherwise have been missed.

Finally, thanks to those who sponsored the work that forms the subject of this book. “While old men in congresses and parliaments would debate the allocation of a few thousand dollars, farsighted generals and admirals would not hesitate to divert substantial sums to help oddballs in Princeton, Cambridge, and Los Alamos,” observed Nicholas Metropolis, reviewing the development of computers after World War II.3

  Early computers were built in many places, leaving fossils that remain well preserved. But what, exactly, once everything else was in place, sparked the chain reaction between address matrix and order codes, spawning the digital universe in which we are all now immersed?

  C(A) is all it took.

  PRINCIPAL CHARACTERS

  Katalin (Lili) Alcsuti (1910–1990): John von Neumann’s younger cousin and granddaughter of von Neumann’s maternal grandfather, Jacob Kann (1854–1914).

  Hannes Alfvén (1908–1995): Swedish American magnetohydrodynamicist and author (under pseudonym Olof Johannesson) of The Tale of the Big Computer.

  Frank Aydelotte (1880–1956): Second director of the Institute for Advanced Study (IAS), 1939–1947.

  Louis Bamberger (1855–1944): Newark, New Jersey, department store magnate and founder, with sister Carrie Fuld, of the IAS.

  Nils Aall Barricelli (1912–1993): Norwegian Italian mathematical biologist and viral geneticist; at the IAS in 1953, 1954, and 1956.

  Julian Himely Bigelow (1913–2003): American electronic engineer and collaborator, with Norbert Wiener, on antiaircraft fire control during World War II; chief engineer of the IAS Electronic Computer Project (ECP), 1946–1951.

  Andrew Donald Booth (1918–2009): British physicist, X-ray crystallographer, inventor, and early computer architect; visited the IAS ECP in 1946 and 1947.