  “I didn’t talk to von Neumann very often,” says Ralph Slutz. “It was more like I talked to Bigelow and Bigelow talked to von Neumann. You know, the Cabots speak to the Lodges and the Lodges speak to God.” Conversations with von Neumann were often long-distance calls. “He had a habit of telephoning at any hour of the day or night,” says Goldstine. “Even at 2 in the morning he might telephone and say ‘I see how to do this.’ And then tell you. The main problem with working long distance with von Neumann was that telephone connections weren’t that good in those days, and von Neumann spent most of his time saying ‘Hello!’ So that whenever the line was clear, all we were doing was being busy saying ‘Hello.’ But in spite of these things, we got a lot done by these means.”54

  Von Neumann wanted to know how everything worked, but he left it to the engineers to make it work. “The experimental business wasn’t really for von Neumann,” Goldstine explains. “Once he understood the principle of it, the ghastly details like the fact that you’d have to put by-pass condensers on things, and all sorts of dirty engineering things—that didn’t really interest him. He recognized that these were essential, but it wasn’t his thing. He would not have had the patience to sit there and do it; he would have made a lousy engineer.”55

  According to Bigelow, “Von Neumann had one piece of advice for us: not to originate anything.” This helped put the IAS project in the lead. “One of the reasons our group was successful, and got a big jump on others, was that we set up certain limited objectives, namely that we would not produce any new elementary components,” adds Bigelow. “We would try and use the ones which were available for standard communications purposes. We chose vacuum tubes which were in mass production, and very common types, so that we could hope to get reliable components, and not have to go into component research.”56

  The fundamental, indivisible unit of information is the bit. The fundamental, indivisible unit of digital computation is the transformation of a bit between its two possible forms of existence: as structure (memory) or as sequence (code). This is what a Turing Machine does when reading a mark (or the absence of a mark) on a square of tape, changing its state of mind accordingly, and making (or erasing) a mark somewhere else. To do this at electronic speed requires a binary element that can preserve a given state over time, until, in response to an electronic pulse or some other form of stimulus, it either changes or communicates that state. “Most of the essential elements or ‘Cells’ in the machine are of a binary, or ‘on-off’ nature,” Bigelow and his colleagues explained in their first interim progress report. “Those whose state is determined by their history and are time-stable are memory elements. Elements of which the state is determined essentially by the existing amplitude of a voltage or signal are called ‘gates.’ ”57
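
  Bigelow's two categories map onto a minimal sketch in Python (purely illustrative; the class names and the choice of an AND gate are mine, not the report's): a memory element whose state is determined by its history and persists between pulses, and a gate whose output is determined only by its present inputs.

```python
# Illustrative sketch (not IAS code) of the two kinds of binary "cell"
# described in the first interim progress report.

class MemoryElement:
    """State determined by history: holds a bit until explicitly set."""
    def __init__(self, state: bool = False):
        self.state = state

    def set(self, bit: bool) -> None:
        self.state = bit          # a pulse writes a new state

    def read(self) -> bool:
        return self.state         # time-stable between pulses

def gate(signal_a: bool, signal_b: bool) -> bool:
    """State determined by present inputs only: here, an AND gate."""
    return signal_a and signal_b

# One indivisible step of computation: read stored structure,
# transform it through a gate, store the result as new structure.
cell_in, cell_out = MemoryElement(True), MemoryElement()
cell_out.set(gate(cell_in.read(), True))
print(cell_out.read())  # True
```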

  In 1946, on the eve of the transistor, it was uncertain whether the non-zero probability of error in any individual digital transformation would bring a computation involving millions of transformations to a halt. The ENIAC was the only large-scale precedent. “The mere fact that the ENIAC was and that the ENIAC ran gave me tremendously more confidence that something could be done than would have been the case if there hadn’t been that demonstration of such a large machine running,” says Ralph Slutz.58 But the new machine would be to the ENIAC as the ENIAC was to a desktop calculator. What available fundamental computational element was reliable enough to work?

  The answer was the 6J6, a miniature twin-triode vacuum tube that was produced in enormous numbers during and after World War II. Three-quarters of an inch in diameter and 2 inches in length, with a 7-pin base, the 6J6 drove military communications during the war and the consumer electronics industry that followed. Effectively two tubes in one envelope, a common cathode (pin 7) served two separate plates (pins 1 and 2) and grids (pins 5 and 6). The twin-triode architecture allowed the tube to be used as a “toggle,” with one side or the other in a conducting state, and less than a microsecond required to make the switch. “And that was the word insisted on by Julian Bigelow as being the more accurate word for what the flip-flop does. And he’s right,” says Pomerene. “Flip-flop is not the right word for a bi-stable circuit which stays in whatever state you put it in.” This constituted a far more secure representation of binary data than an element whose state is represented by simply being on or off—where failure is indistinguishable from one of the operational states. As Bigelow later described it, “A binary counter is simply a pair of bistable cells communicating by gates having the connectivity of a Möbius strip.”59
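
  Bigelow's Möbius-strip description can be rendered as a toy sketch (hypothetical code, not a circuit model): two bistable cells connected in a ring, with the second cell's complement gated back into the first. That half-twist in the connectivity is what makes the pair toggle once per clock cycle and divide the clock rate by two.

```python
# Hypothetical software sketch of Bigelow's description; not a circuit
# simulation. A bistable cell stays in whatever state you put it in;
# a gate passes a new state into it only while enabled.

class BistableCell:
    def __init__(self):
        self.q = False

    def latch(self, d: bool, enable: bool) -> None:
        if enable:              # gated: copies its input only when enabled
            self.q = d

def binary_counter_step(master: BistableCell, slave: BistableCell) -> bool:
    """One clock cycle of a two-cell divide-by-two counter.

    The slave's *complement* is fed back to the master: the
    cross-connection with a half-twist that Bigelow likened to
    a Möbius strip.
    """
    master.latch(not slave.q, enable=True)   # first clock phase: master samples
    slave.latch(master.q, enable=True)       # second phase: slave copies
    return slave.q

m, s = BistableCell(), BistableCell()
print([binary_counter_step(m, s) for _ in range(4)])  # [True, False, True, False]
```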

  “If the 6J6, which was the twin triode, had not existed during the war and had not been widely used, I don’t know what we would have used for a tube,” says Willis Ware. Not only did the widespread use of the 6J6 mean that it was available inexpensively, but it was found to be more reliable as well. One of Bigelow’s last assignments at the Statistical Research Group at Columbia had involved the reliability of munitions. “There had been a lot of accidental explosions of rocket propellant units on airplanes in which the explosion would take the wing off a plane,” he explains. “And this would happen in a very rare and erratic fashion. So we had some excellent people in statistics there, including no less than Abraham Wald, who founded sequential analysis while working with our group. Statistical thinking had become a part of my way of thinking about life.” It turned out that the most reliable tubes were those produced in the largest quantities—such as the 6J6. As Bigelow described it, “We learned that tube types sold at premium prices, and claimed to be especially made for long life, were often less reliable in regard to structural failures than ordinary tube types manufactured in larger production lots.”60

  That higher quality did not require higher cost was not readily accepted, especially since IBM, which had used the 6J6 as the computing element in its popular model 604 electronic calculator, had recently established its own experimental tube production plant in Poughkeepsie, New York, to develop special computer-quality tubes at a much higher cost. There was intense debate over whether the choice of the mass-market 6J6 was a mistake. Of the final total of 3,474 tubes in the IAS computer, 1,979 were 6J6s. “The entire computer can be viewed as a big tube test rack,” Bigelow observed.61

  “It was considered essential to know whether such miniature tubes as the 6J6 have radically inferior lives compared to other types, to an extent rendering their use in design a major blunder; and accordingly a crude life-test set up was devised and operated to get some sort of a statistical bound on their reliability,” Bigelow reported at the end of 1946. Four banks of 6J6 tubes, twenty in each bank, for a total of eighty tubes, were installed in a test rack so they were oriented up, down, and in the two horizontal positions (cathode edge-wise and cathode flat). The entire rack was mounted on a vibrating aluminum plate, and the tubes left to run for three thousand hours. “A total of six failed, four within the first few hours, one about 3 days and one after 10 days,” was the final report. “There were four heater failures, one grid short and one seal failure.”62
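
  A back-of-envelope reading of those numbers (my arithmetic; the report gives only the raw counts):

```python
# Back-of-envelope reading of the 1946 life test (illustrative
# arithmetic only; the historical report gives just the raw counts).

tubes, hours, failures = 80, 3000, 6
rate = failures / (tubes * hours)              # failures per tube-hour
print(f"~{rate:.1e} failures per tube-hour")   # ~2.5e-05

# Scaled to roughly 2,000 6J6s (close to the IAS total), the naive
# expectation is about one tube failure every ~20 hours of running:
machine_tubes = 2000
mtbf_hours = 1 / (rate * machine_tubes)
print(f"~{mtbf_hours:.0f} hours between failures")  # ~20
```

  Since four of the six failures came within the first few hours, the steady-state rate was considerably better than this naive average suggests.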

  The problem wasn’t tubes that failed completely—built-in self-diagnostic routines made these easy to identify and replace—it was tubes that either were not up to specification in the first place, or that drifted off specification with age. How could you count on getting correct results? While von Neumann was beginning to formulate, from the top down, the ideas that would develop into his 1951 “Reliable Organizations of Unreliable Elements” and 1952 “Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components,” the technicians at the Institute were facing the same problem, from the bottom up.
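
  The top-down idea von Neumann was working toward can be illustrated schematically (his 1952 multiplexing construction is far more elaborate than this): replicate an unreliable component and take a majority vote, so that the organism becomes more reliable than its parts.

```python
# Schematic illustration of von Neumann's theme, not his actual 1952
# construction: majority voting over unreliable components yields a
# more reliable whole.
import random

def unreliable_not(bit: bool, p_error: float = 0.05) -> bool:
    """A NOT gate that gives the wrong answer with probability p_error."""
    out = not bit
    return (not out) if random.random() < p_error else out

def majority_not(bit: bool) -> bool:
    votes = [unreliable_not(bit) for _ in range(3)]
    return sum(votes) >= 2        # majority of three

trials = 100_000
errors = sum(majority_not(False) is not True for _ in range(trials))
print(errors / trials)  # ~0.007, versus 0.05 for a single gate
```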

  It took an engineer fluent in wartime electronics and wartime ingenuity to solve the problem of building a reliable computer from unreliable war surplus parts. Jack Rosenberg, from New Brunswick, New Jersey, was the first in his family to attend college, entering MIT in 1934 at age sixteen. As a senior in high school he had attended the Century of Progress exhibit in Chicago, and “spent nearly the whole week in what was called the Hall of Science. And I saw there a booth by MIT, and I talked to the man at the booth, and he said MIT is probably the toughest school to get into. So I applied to MIT.”

  Rosenberg started out in mathematics but switched to electrical engineering, graduating at the top of his class with two degrees. “When I went around for interviews in 1939, I saw many of my classmates already working,” he says. “I knew that I was more intelligent than them, but that’s the way it went.” So he took a job as a civilian engineer for the U.S. Army Signal Corps, becoming an officer when the United States joined the war.

  July of 1945 found Rosenberg on board an army troop ship sailing at eight knots across the Pacific to the Philippines, to prepare for the invasion of Japan. “As a radio ham, I spent most of my waking time in the radio room listening to the short wave,” he says. Since the slow transport was a sitting duck, transmission was not allowed. On August 6, 1945, he heard news of the atomic bombing of Hiroshima, followed by news of the bombing of Nagasaki on August 9. “The ship’s troop commander was as startled as I had been,” he says. “He told me to keep listening to the radio. His orders for the invasion had not been changed.” Then the news of Japan’s unconditional surrender arrived. “The bombs had saved our lives,” says Rosenberg, and however difficult von Neumann (and Oppenheimer) proved to be as his employers, he never forgot that.63

  Rosenberg remained in the southern Philippines until April 1946. In the post exchange, he found a copy of Atomic Energy for Military Purposes, a swiftly declassified nontechnical account of the Manhattan Project by Henry Smyth, chairman of the physics department at Princeton University. Upon his discharge from the army, at Fort Dix, New Jersey, in July of 1946—having returned across the Pacific on a turbine-driven steamship at thirty knots—Rosenberg went to Princeton to seek a job in nuclear energy research. He was hired by the Physics Department to work on the instrumentation for the university’s new cyclotron, but, he says, “my enthusiasm lasted about a month.”

  “Early in 1947,” he continues, “I was informed that at the Institute for Advanced Study, a famous scientist was looking for an engineer to develop an electronic machine of a sort no one but he understood.” Rosenberg interviewed with Bigelow and von Neumann, and started work in July. “There was a lot of anti-Semitism in the army. But there wasn’t anti-Semitism with Johnny,” he says.

  “Johnny used to meet with each of us individually about once a week, asking what we had built, how it worked, what problems we had, what symptoms we observed, what causes we had diagnosed,” says Rosenberg. “Each question was precisely the best one based on the information he had uncovered so far. His logic was faultless—he never asked a question that was irrelevant or erroneous. His questions came in rapid-fire order, revealing a mind that was lightning-fast and error-free. In about an hour he led each of us to understand what we had done, what we had encountered, and where to search for the problem’s cause. It was like looking into a very accurate mirror with all unnecessary images eliminated, only the important details left.”64

  When Rosenberg arrived, the problem was how to build a forty-stage shift register, this being at the heart of the machine’s ability to compute. “It was easy to build a two-stage register that worked reliably,” says Rosenberg. “When a third stage was added occasional errors crept in. Adding a fourth stage made the register useless. We discovered that the electrical characteristics of the vacuum tubes were very different from the specifications published for them in tube handbooks, even when the tubes were new.”
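
  An independent-error model (illustrative only; the real culprit, as Rosenberg says, was off-spec tube characteristics rather than random noise) shows how quickly per-stage unreliability compounds across forty stages.

```python
# Illustrative model only: if each stage of a shift register passes a
# bit correctly with probability 1 - p, the chance that a 40-bit word
# shifts through all 40 stages intact collapses quickly as p grows.

def p_word_survives(p_stage_error: float, stages: int = 40, bits: int = 40) -> float:
    p_stage_ok = (1 - p_stage_error) ** bits     # all bits correct at one stage
    return p_stage_ok ** stages                  # ... and at every stage

for p in (1e-6, 1e-4, 1e-2):
    print(p, f"{p_word_survives(p):.3g}")
# 1e-06 -> ~0.998, 1e-04 -> ~0.852, 1e-02 -> ~1e-07
```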

  According to Rosenberg, after further extensive testing, and consultation with the major tube manufacturers, whose response was that “no one else had ever complained about their product, and they had enough customers without us,” von Neumann was informed that “there were no reliable tubes, and no reliable resistors.” His response was that “we would have to learn how to design a reliable 40-stage machine with thousands of unreliable components,” said Rosenberg. And they did.65

  They switched from designing according to the published tube specifications to what is now called “worst-case design”—which, “in recognition of the then current new fashion in women’s clothing, Bigelow called ‘The New Look.’ ” As Ralph Slutz explains it, “we tested a batch of a thousand tubes and took the weakest tube we found and the strongest tube we found, and then allowed an extra 50% safety factor over that.”66
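
  Slutz's recipe amounts to a simple procedure, sketched here hypothetically (padding the observed span by 50 percent is one plausible reading of “an extra 50% safety factor”): measure a parameter across the batch, take the extremes, and widen them before designing.

```python
# Hypothetical sketch of "worst-case design" as Slutz describes it:
# measure a tube parameter over a big batch, take the observed extremes,
# and pad them by a 50% safety factor; the circuit must then work
# anywhere inside the padded range.

def worst_case_limits(measurements: list[float], safety: float = 0.5):
    lo, hi = min(measurements), max(measurements)
    span = hi - lo
    return lo - safety * span, hi + safety * span

# e.g. transconductance readings (invented values, arbitrary units):
batch = [4.8, 5.3, 6.1, 5.0, 5.7, 4.5, 6.4]
print(worst_case_limits(batch))  # (3.55, 7.35): design must tolerate this
```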

  The new design parameters were extended from individual tubes to toggles, gates, standard circuit modules, and finally to full-scale forty-stage registers, which, after tedious debugging, worked. Bigelow also argued, counterintuitively, that the machine’s overall reliability could be improved by speeding it up, noting that “increasing speed may actually increase certainty rather than the reverse.” Unlike mechanical devices, vacuum tubes are weakened by age, not use, and “suffer accidental failures in proportion to their population,” not their operating speed. Optimum reliability can therefore be achieved by operating as few tubes as possible, at maximum speed. “Finally,” noted Bigelow, “intermittent errors are most embarrassing and difficult to detect when the intermittency corresponds roughly to the operating rate.”67
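
  Bigelow's counterintuitive point follows from simple arithmetic (illustrated here with invented numbers): if tubes fail per hour of powered life rather than per operation, the expected failures during a computation are proportional to wall-clock time, and a faster machine finishes before as many tubes have had a chance to die.

```python
# Illustration of Bigelow's argument with made-up numbers: tubes fail
# per hour powered on, not per operation, so finishing sooner means
# fewer expected failures per computation.

def expected_failures(ops: float, ops_per_sec: float,
                      tubes: int, fail_per_tube_hour: float) -> float:
    wall_hours = ops / ops_per_sec / 3600
    return fail_per_tube_hour * tubes * wall_hours

job = 1e9        # operations in the computation (hypothetical)
rate = 2.5e-5    # failures per tube-hour (from the life test above)
print(expected_failures(job, 1e4, 3474, rate))   # slower machine: ~2.4 failures
print(expected_failures(job, 1e5, 3474, rate))   # 10x faster:     ~0.24
```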

  The IAS engineers coaxed collectively acceptable digital behavior out of tubes whose performance would have been largely unacceptable if tested against the specifications for individual tubes. This was achieved by following the same principle that Bigelow and Wiener had developed in their work on the debomber: separate signal from noise at every stage of the process—in this case, at the transfer of every single bit—rather than allowing noise to accumulate along the way. This, as much as the miracle of silicon, is why we have microprocessors that work so well today. The entire digital universe still bears the imprint of the 6J6.
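
  A toy model (not the actual circuitry) makes the principle visible: inject noise at each of forty stages, and compare a chain that lets it accumulate with one that re-standardizes the signal to a clean 0 or 1 at every step.

```python
# Toy model of per-stage signal restoration (not the actual circuitry):
# noise added at every one of 40 stages either accumulates, or is
# squelched at each step by re-deciding the bit against a threshold.
import random

def run_chain(bit_level: float, stages: int = 40, noise: float = 0.1,
              restore: bool = True) -> float:
    level = bit_level
    for _ in range(stages):
        level += random.gauss(0, noise)          # noise injected per stage
        if restore:
            level = 1.0 if level > 0.5 else 0.0  # re-standardize to 0 or 1
    return level

random.seed(1)
print(run_chain(1.0, restore=False))  # drifts far from 1.0: unreliable
print(run_chain(1.0, restore=True))   # 1.0: the bit arrives intact
```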

  “One time we thought it would be a good idea to take the vacuum tubes out of the machine and just run them through a regular testing routine,” remembers James Pomerene, “and you never saw a crummier bunch of tubes in your life!”68

  EIGHT

  V-40

  Far as all such engines must ever be placed at an immeasurable interval below the simplest of Nature’s works, yet, from the vastness of those cycles which even human contrivance in some cases unfolds to our view, we may perhaps be enabled to form a faint estimate of the magnitude of that lowest step in the chain of reasoning, which leads us up to Nature’s God.

  —Charles Babbage, 1837

  PENTAERYTHRITOL TETRANITRATE (PETN) was an important high explosive in World War I, yet its molecular structure remained unknown in World War II. Andrew Booth, whose father invented, among other things, the automotive ignition advance and a stovetop thermocouple designed to power a radio in a household without electricity, was a graduate student at Birmingham University when he was assigned the job of using X-ray crystallography to find this out.

  Booth, who had “alarmed his mother by mending fuses at the age of two,” won an undergraduate mathematical scholarship to Cambridge University in 1937, where he was assigned to pure mathematician G. H. Hardy, a match doomed from the start. Given an ultimatum by Hardy either to quit wasting time on other subjects or to lose his scholarship, Booth, who believed that “if mathematics isn’t useful, it’s not worth doing,” left Cambridge to pursue physics, engineering, and chemistry on his own terms. During an apprenticeship at the Armstrong Siddeley aircraft engine works in Coventry, he improved the design of searchlights and established an X-ray facility for the inspection of engine parts. This led to a graduate scholarship, sponsored by the British Rubber Producers Research Association, to determine the molecular structure of PETN, although this was not made explicit at the time. “I was just given some stuff and asked to find out what the structure was,” he explains. “Which we did, and of course we realized it was what it was by then.”1 In addition to PETN, Booth’s group derived the structure of RDX, a new plastic explosive that would play a key role in the development of atomic bombs.

  By recording the diffraction patterns produced by X rays scattered from a sample of crystalline material, it is possible, if difficult, to infer the pattern of electron densities, and thus the molecular structure, that produced the observed diffraction. Given a known molecular structure, it is easy to predict the pattern of diffraction, but given an observed diffraction pattern, there is no easy way to determine the molecular structure. You start by taking a guess, and then perform the calculations to see if the guess was anywhere near correct. After many iterations, a plausible structure may—or may not—emerge.
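
  The asymmetry can be made concrete with a schematic structure-factor calculation (toy coordinates and scattering strengths invented for illustration; real crystallography adds atomic form factors, symmetry, and thermal corrections). The forward direction is a direct sum over the atoms of the guessed structure; the inverse problem is hard because the detector records only intensities |F|², and the phases are lost.

```python
# Schematic forward calculation for X-ray diffraction (toy structure
# invented for illustration). Given a guessed structure, predicting
# diffraction is a direct sum over atoms; recovering the structure
# from observed intensities is the hard inverse problem, since the
# phase of each F(hkl) is not measured.
import cmath

def structure_factor(atoms, h, k, l):
    """F(hkl) = sum_j f_j * exp(2*pi*i*(h*x_j + k*y_j + l*z_j))."""
    return sum(f * cmath.exp(2j * cmath.pi * (h * x + k * y + l * z))
               for f, x, y, z in atoms)

# toy "structure": (scattering strength, fractional x, y, z)
guess = [(6, 0.0, 0.0, 0.0), (8, 0.25, 0.25, 0.25), (6, 0.5, 0.5, 0.0)]
for hkl in [(1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    F = structure_factor(guess, *hkl)
    print(hkl, round(abs(F) ** 2, 1))   # compare to observed intensities
```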

  “What people did previous to the time when I came along with the computing suggestions was to jiggle things about and hope for the best,” remembers Booth. “And it took a hell of a long time, and usually gave a very bad result.” The physics behind the scattering of X rays was straightforward, but to work backward took brute force. “There is a lot of very nasty calculation,” he says. “For a typical structure like the one I did, there are about four thousand reflections … and you have to calculate the sum over all of the wretched atoms in your structure to calculate this wretched phase angle. It takes a hell of a long time by hand. I had a boy and one girl who were helping me with the calculations. It took us three years.”2

  With the structure of PETN determined, and his PhD in hand, Booth transferred to the main BRPRA laboratories at Welwyn Garden City, near London. Under the supervision of John W. Wilson, he began to build a series of mechanical and electromechanical calculators to speed up the X-ray analysis work. This attracted the attention of crystallographer Desmond Bernal, who was launching a “bio-molecular” laboratory at Birkbeck College to tackle more complex organic molecules that had so far resisted structural analysis. “I was building this special-purpose crystallographic digital computer,” says Booth. “That’s what Bernal was interested in, and that’s why he wanted me to come work with him.” One of the members of Bernal’s group was Rosalind Franklin, later to play a key role in determining the helical structure of DNA.