A review of postings about scientific integrity and intellectual honesty, with observations regarding elite centrism – Part 5: inventions, innovations, and ushering of ‘the new normal’ (i)

(Continued from Part 4)

The electronic computer ranks among the foremost innovations in the history of science and technology. The mathematician John von Neumann, who played important roles in U.S. military science projects during World War II, is often regarded as the “father of computers” for his key participation in the development of ENIAC, the first general-purpose electronic computer, at the Moore School of Electrical Engineering at the University of Pennsylvania, and for his subsequent leading role in spreading the development of computers:

“The Moore School signed a research contract with the Ballistic Research Laboratory (BRL) of the U.S. Army, and in an August 1942 memorandum Mauchly proposed that the school build a high-speed calculator out of vacuum tube technology for the war effort. In 1943, the army granted funds to build the Electronic Numerical Integrator and Computer (ENIAC) to create artillery ballistic tables. Eckert served as chief engineer of a team of fifty engineers and technical staff on the ENIAC project.

Completed in 1945, the ENIAC consisted of forty 9-foot-high cabinets, almost 18,000 vacuum tubes, and many miles of wiring, and weighed 30 tons. …

… While building the ENIAC, Mauchly and Eckert developed the idea of the stored program for their next computer project, where data and program code resided together in memory. The concept allowed computers to be programmed dynamically so that the actual electronics or plugboards did not have to be changed with every program.

… During World War II, von Neumann worked on the Manhattan Project to build the atomic bomb and also lent his wide expertise as a consultant on other defense projects.

After becoming involved in the ENIAC project, von Neumann expanded on the concept of stored programs and laid the theoretical foundations of all modern computers in a 1945 report and through later work. His ideas came to be known as the “von Neumann Architecture,” and von Neumann is often called the “father of computers.” … After the war, von Neumann went back to Princeton and persuaded the Institute for Advanced Study to build their own pioneering digital computer, the IAS (derived from the initials of the institute), which he designed.

Eckert and Mauchly deserve equal credit with von Neumann for their innovations, though von Neumann’s elaboration of their initial ideas and his considerable prestige lent credibility to the budding movement to build electronic computers. …”

(Eric G. Swedin and David L. Ferro, Computers: The Life Story of a Technology, 2005, The Johns Hopkins University Press)
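The stored-program concept described in the quotes above can be illustrated with a toy sketch. The instruction set and memory layout below are invented purely for illustration (no historical machine used them); the point is only that code and data share one memory, so reprogramming means writing new words into memory rather than rewiring plugboards:

```python
# A minimal sketch of the stored-program idea (hypothetical instruction set,
# not any historical machine's): instructions and data live in one memory.

def run(memory):
    """Execute instructions from memory until HALT; memory holds both code and data."""
    pc = 0   # program counter
    acc = 0  # accumulator
    while True:
        op, addr = memory[pc]      # fetch an instruction word from memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]     # a data word, read from the same memory
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# The program occupies cells 0-3; the data occupies cells 4-6, in one memory.
memory = [
    ("LOAD", 4),   # acc = memory[4]
    ("ADD", 5),    # acc += memory[5]
    ("STORE", 6),  # memory[6] = acc
    ("HALT", 0),
    2, 3, 0,       # data: 2, 3, and a result cell
]
result = run(memory)
print(result[6])  # → 5
```

Changing the program – say, subtracting instead of adding – is just a different pattern of words written into the same memory, which is what allowed computers to be programmed dynamically without changing the electronics or plugboards.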

In a February 2013 blog post, I wondered about the prospect, in the mid-1950s, of von Neumann, then stricken with cancer, moving from the Institute for Advanced Study in Princeton, New Jersey, to the University of California; “I bet it was my alma mater UC Berkeley”, I wrote, marvelling at how much of a boost his arrival would have been to Berkeley’s science and the nascent Silicon Valley’s technology:

“In any case, it was a pity for the University of California that John von Neumann died at his prime age, as before his death he had decided to resign from the Institute for Advanced Study in Princeton and move to one of the UC campuses, as also revealed in the book quoted about him:

“When Johnny was in hospital in 1956, with what proved to be his terminal cancer, he wrote to Oppenheimer and explained, although not yet for publication, that he was not in fact going to come back to the IAS. He had privately accepted an offer to be professor at large at the University of California: he would live near one of its campuses (it had not been quite decided which) and proceed with research on the computer and its possible future uses, with considerable commercial sponsorship. We cannot know how much he would then have enriched our lives, with cellular automata, with totally new lines for the computer, with new sorts of mathematics.”

I bet it was my alma mater UC Berkeley John von Neumann was about to move to, given its close collaborations with several National Labs that had nuclear science and weapons researches, and its proximity to what would become Silicon Valley around Stanford University across the Bay as discussed in Part 1.

The scientific and technological history of the Bay Area would have looked very different, so much more – had “Johnny” come to his senses earlier, I sigh.”

(“Guinevere and Lancelot – a metaphor of comedy or tragedy, without Shakespeare but with shocking ends to wonderful lives (Part 2)”, February 28, 2013, Feng Gao’s Posts – Rites of Spring)

What I quoted from in the above was Norman Macrae’s book, John Von Neumann: The Scientific Genius Who Pioneered the Modern Computer, Game Theory, Nuclear Deterrence, and Much More, originally published in 1992.

Von Neumann died soon afterwards, and his intended move has left room for imagination, since the University of California had several major campuses and “it had not been quite decided which”, according to Macrae.

My educated guess of UC Berkeley was based on the close relations Berkeley had with the nearby national laboratories – the Lawrence Berkeley National Lab and the Lawrence Livermore National Lab discussed in Part 4 – in nuclear science and weapons research, of top interest to von Neumann.

For instance, in his involvement in the development of the hydrogen bomb, von Neumann had spent time at the Livermore lab with the physicist Edward Teller, Cold War strategist Herman Kahn, and others:

“Herman Kahn … went to U.C.L.A. and majored in physics. During the war, he served in the Pacific theatre in a non-combat position, then finished his B.S. and entered a Ph.D. program at Cal Tech. He failed to graduate… went to work at RAND. He became involved in the development of the hydrogen bomb, and commuted to the Livermore Laboratory, near Berkeley, where he worked with Edward Teller, John von Neumann, and Hans Bethe.”

(“Fat Man: Herman Kahn and the nuclear age”, by Louis Menand, June 27, 2005, The New Yorker)

On the other hand, UCLA has claimed that von Neumann had planned to move there, before his premature death in February 1957, because of his close association with a computer project at RAND Corporation in Santa Monica in the Los Angeles region:

“1950_____________________________________

At RAND Corporation in Santa Monica, a project to build a von Neumann type machine was closely tracking the ongoing development at the Institute for Advanced Studies in Princeton (von Neumann often came West to consult at RAND and, in fact, his plan to relocate to UCLA was aborted by his death in February 1957).”

(“THE HISTORY OF COMPUTER ENGINEERING & COMPUTER SCIENCE AT UCLA”, by Gerald Estrin, Computer Science Department, UCLA Engineering)

As cited, the RAND computer project closely followed von Neumann’s computer design at the IAS in Princeton.

RAND was the Cold War think-tank from which John Nash, whose Princeton Ph.D. thesis idea in game theory had once been dismissed by von Neumann, was expelled in 1954 after a police arrest for homosexual activity in a public restroom – here with more details than the previous quote in Part 2:

“That August… He spent hours at a time walking on the sand or along the promenade in Palisades Park, watching the bodybuilders on Muscle Beach…

One morning at the very end of the month, the head of RAND’s security detail got a call from the Santa Monica police station, which, as it happened, wasn’t far from RAND’s new headquarters on the far side of Main. It seemed that two cops in vice, one decoy and one arresting officer named John Otto Mattson, had picked up a young guy in a men’s bathroom in Palisades Park in the very early morning. … The man, who looked to be in his mid-twenties, claimed that he was a mathematician employed by RAND. Was he?

… Nash had a top-secret security clearance. He’d been picked up in a “police trap.” …

Nash was not the first RAND employee to be caught in one of the Santa Monica police traps. Muscle Beach, between the Santa Monica pier and the little beach community of Venice, was a magnet for bodybuilders and the biggest homosexual pickup scene in the Malibu Bay area. …”

(Sylvia Nasar, A Beautiful Mind, 1998, Simon & Schuster)

Unlike John Nash, a libidinous young gay man incessantly roaming public beaches and parks and thus a misfit for RAND’s security sensitivity, John von Neumann was a leading Cold War brain of the think-tank, alongside Herman Kahn:

“… RAND is a civilian think tank … described by Fortune in 1951 as “the Air Force’s big-brain-buying venture”, where brilliant academics pondered nuclear war and the new theory of games. …

Nothing like the RAND of the early 1950s has existed before or since. It was the original think tank, a strange hybrid of which the unique mission was to apply rational analysis and the latest quantitative methods to the problem of how to use the terrifying new nuclear weaponry to forestall war with Russia – or to win a war if deterrence failed. The people of RAND were there to think the unthinkable, in Herman Kahn’s famous phrase. … And Kahn and von Neumann, RAND’s most celebrated thinkers, were among the alleged models for Dr. Strangelove. … RAND had its roots in World War II, when the American military, for the first time in its history, had recruited legions of scientists, mathematicians, and economists and used them to help win the war. …”

(Sylvia Nasar, 1998, Simon & Schuster)

Beyond RAND or Livermore, von Neumann’s advocacy for the use of nuclear weapons was famous, rare among scientists, and framed in the mindset of “world government” – as in Part 2 a political ideal Nash was also drawn to and attempted to campaign for:

“After the Axis had been destroyed, Von Neumann urged that the U.S. immediately build even more powerful atomic weapons and use them before the Soviets could develop nuclear weapons of their own. It was not an emotional crusade. Von Neumann, like others, had coldly reasoned that the world had grown too small to permit nations to conduct their affairs independently of one another. He held that world government was inevitable—and the sooner the better. But he also believed it could never be established while Soviet Communism dominated half of the globe. A famous Von Neumann observation at that time: “With the Russians it is not a question of whether but when.” A hard-boiled strategist, he was one of the few scientists to advocate preventive war, and in 1950 he was remarking, “If you say why not bomb them tomorrow, I say why not today? If you say today at 5 o’clock, I say why not 1 o’clock?”

(“Passing of a Great Mind: John von Neumann, a Brilliant, Jovial Mathematician, Was a Prodigious Servant of Science and His Country”, by Clay Blair, Jr., February 25, 1957, Volume 42, Number 8, Life)

So for von Neumann, world government needed to be achieved through war, even nuclear war, rather than through the kind of peace movement Nash wished for but failed to start.

Von Neumann’s contributions as a military technology consultant and a Cold War adviser were immense:

“His fellow scientists… knew that during World War II at Los Alamos Von Neumann’s development of the idea of implosion speeded up the making of the atomic bomb by at least a full year. His later work with electronic computers quickened U.S. development of the H-bomb by months. The chief designer of the H-bomb, Physicist Edward Teller, once said with wry humor that Von Neumann was “one of those rare mathematicians who could descend to the level of the physicists.” …

… The principal adviser to the U.S. Air Force on nuclear weapons, Von Neumann was the most influential scientific force behind the U.S. decision to embark on accelerated production of intercontinental ballistic missiles. …”

(Clay Blair, Jr., February 25, 1957, Volume 42, Number 8, Life)

Without von Neumann’s ambitious push, the United States would have fallen behind the Soviet Union in the development of intercontinental ballistic missiles (ICBM), namely long-range strategic nuclear missiles:

“In the early 1950s, the champion of strategic bombers in the United States was the famous, truculent, imperious Gen. Curtis LeMay, the chief of the Strategic Air Command, who, during the last months of World War II, had tried to break Japan’s will and avert the necessity of an American invasion by dropping 150,000 tons of firebombs on Japanese cities. …

In the Pentagon of the 1950s, LeMay was “king of the mountain,” as one colleague put it, known for pulverizing those few men who tried to stand in his way. …

Lacking LeMay’s blinders, Bennie Schriever realized that the Soviets planned to rest their future defense not on bombers but on intercontinental ballistic missiles capable of striking the United States with only 15 minutes of advance warning. The Kremlin was also fast improving batteries of surface-to-air missiles that could knock LeMay’s beloved bombers out of the sky. …

Schriever’s new way of thinking began in 1953, when he was still a colonel. During a briefing on intermediate-range bombers at Maxwell Air Force Base in Alabama, he had a fateful conversation with the legendary refugee scientists Edward Teller and John von Neumann. They predicted that by 1960, the United States would be creating hydrogen bombs so lightweight that missiles could carry them. The following year, Schriever, by then a general, was asked to supervise, on highest priority, the creation of some kind of ICBM force. …”

(“Missile Defense”, by Michael Beschloss, October 1, 2009, The New York Times)

But as discussed earlier, while in hospital for cancer treatment in 1956, von Neumann expressed the wish to move to California to conduct computer research.

The computer developed at RAND in association with von Neumann was named JOHNNIAC in his honor, and was one of the most utilized computers of that early generation:

“JOHNNIAC (circa 1952-1966)

The JOHNNIAC (John von Neumann Integrator and Automatic Computer) was a product of the RAND Corporation. It was yet another machine based on the Princeton Institute IAS architecture. The JOHNNIAC was named in von Neumann’s honor, although it seems that von Neumann disapproved. JOHNNIAC is arguably the longest-lived of the early computers. It was in use almost continuously from the end of the Korean War, until finally shut down on 11 February 1966 after logging over 50,000 operational hours. After two “rescues” from the scrap heap, the machine currently resides at the Computer History Museum.”

(Marshall William McMurran, ACHIEVING ACCURACY: A Legacy of Computers and Missiles, 2008, Xlibris Corporation)

UCLA’s claim that it and RAND were von Neumann’s choice in 1956 is thus consistent with Macrae’s account of von Neumann’s wish to concentrate on “research on the computer and its possible future uses, with considerable commercial sponsorship” – von Neumann’s prior association with RAND and JOHNNIAC made UCLA, rather than UC Berkeley, the more likely choice.

In that vein I would guess anew that, with von Neumann’s guidance, RAND could have started a ‘Computer Beach’ in Santa Monica. It could have resembled Silicon Valley’s start in a Palo Alto garage by the Hewlett-Packard Company’s founders under the mentorship of Frederick Terman, their Stanford University professor:

“The Rise of Silicon Valley

In 1939, with the encouragement of their professor and mentor, Frederick Terman, Stanford alumni David Packard and William Hewlett established a little electronics company in a Palo Alto garage. That garage would later be dubbed “the Birthplace of Silicon Valley.””

(“History of Stanford”, Stanford University)

There would have been plenty of time for von Neumann and RAND to build a computer industry, as RAND was much more than a garage and Hewlett-Packard’s first computer would come out only 10 years later, in 1966 – the year of JOHNNIAC’s retirement.

(“Hewlett-Packard”, 2008, Silicon Valley Historical Association)

My imagined Computer Beach versus the nascent Silicon Valley is one scenario; UCLA versus UC Berkeley was another factor von Neumann likely weighed.

Stricken with cancer, von Neumann likely came to think of his involvement in the atomic bomb development as an occupational hazard, as I discussed in my February 2013 blog post, again quoting from Macrae:

“After his intimate participations in advanced military researches during World War II and afterwards, including in the development of the nuclear bomb, John von Neumann died of cancer in 1957 at only 53, and there has been a question whether his premature death had been work-related:

“It is plausible that in 1955 the then-fifty-one-year-old Johnny’s cancer sprang from his attendance at the 1946 Bikini nuclear tests. The safety precautions at Bikini were based on calculations that were meant to keep any observer’s exposure to radiation well below what had given Japanese at Hiroshima even a 1 percent extra risk of cancer. But Johnny, like some other great scientists, underestimated risks at that time. He was startled when radiation probably caused the cancer and death in 1954 at age fifty-three of his great friend Fermi, whose 1930s experiments with nuclear bombardment in Italy were not accompanied by proper precautions. Soon after a Soviet nuclear test in 1953 Sakharov and Vyacheslav Malyshev walked near the site to assess its results. Sakharov ascribed Malyshev’s death from leukemia in 1954, and possibly his own terminal illness thirty-five years later, to that walk.”

So von Neumann and some other great scientists in the nuclear bomb development may have “underestimated” the health risks, and he and Enrico Fermi, who had discovered the nuclear chain reaction, both died at the age of 53.

Hmm, I doubt that a mathematician of von Neumann’s caliber would have incorrectly calculated the cancer risks from radiation.”

(February 28, 2013, Feng Gao’s Posts – Rites of Spring)

I doubted in that blog post, and still do, that a mathematician and scientist of John von Neumann’s caliber would have underestimated his risk of getting cancer from nuclear radiation. But the facts remain that both von Neumann and Enrico Fermi, whose discovery of the nuclear chain reaction made the atomic bomb possible, died of cancer at the same age of 53.

Moreover, as I noted in that blog post, previously quoted in Part 4, former UC Berkeley physicist Robert Oppenheimer, leader of the atomic bomb development and the IAS director to whom von Neumann confided his planned move to UC, later also died of cancer – 10 years older, at the age of 63:

“The physicist Robert Oppenheimer, the director of IAS at Princeton with whom von Neumann discussed his pending move in 1956, had hailed from UC Berkeley to become “father of the atomic bomb”, leading the development of nuclear bombs at Los Alamos National Lab founded by him in northern New Mexico; Oppenheimer later also died of cancer, at the age of 63.”

(February 28, 2013, Feng Gao’s Posts – Rites of Spring)

So it is possible that in 1956 cancer prompted von Neumann’s final decision to stay away from UC Berkeley and the nearby nuclear labs.

But von Neumann was also a professor of mathematics, at Princeton’s Institute for Advanced Study and, had he moved to California, prospectively at UCLA or UC Berkeley, the two oldest primary UC campuses as in Part 4. Academic factors could also have swayed his decision.

At UCLA there was a computer, SWAC (Standards Western Automatic Computer), developed at the Institute for Numerical Analysis sponsored by the National Bureau of Standards and funded by the Office of Naval Research:

“1947____________________________________________

The Institute for Numerical Analysis was set up on the UCLA campus under sponsorship of the National Bureau of Standards and with funding from the Office of Naval Research. The primary function of INA was “to conduct research and training in the types of mathematics pertinent to the efficient exploitation and further development of high-speed automatic digital computing machinery.” INA attracted a stream of internationally recognized applied mathematicians. Harry Huskey completed the SWAC (Standards Western Automatic Computer) development project in 1950, and it became one of the very few places where modern numerical experiments could be conducted. The SWAC provided a testing ground for computer engineers, programmers and applied mathematicians. …”

(Gerald Estrin, Computer Science Department, UCLA Engineering)

As quoted, upon its 1950 completion the SWAC computer at UCLA became “one of the very few places where modern numerical experiments could be conducted”.

The National Bureau of Standards wanted computers for practical needs, and so SWAC and its Eastern sibling SEAC were quickly built, completed even before von Neumann’s IAS computer in Princeton:

“The SEAC (Standards Eastern Automatic Computer) and the SWAC (Standards Western Automatic Computer) were built by the National Bureau of Standards in Washington and Los Angeles respectively. Both of these computers were designed to be completed rapidly in order to satisfy the computing needs of the National Bureau of Standards until either the IAS machine or the UNIVAC was completed. … In May of 1950 the SEAC became the first post-ENIAC American computer to be fully operational. … SWAC was completed by July 1950…”

(Louis A. Girifalco, Dynamics of Technological Change, 1991, Van Nostrand Reinhold)

Von Neumann, on the other hand, was the leader of a larger computer-building movement at universities and scientific institutions in the U.S. and internationally, distributing his computer design plans far and wide:

“At Cambridge, computer development was led by Maurice Wilkes, who constituted the staff of the “University Mathematical Laboratory,” which was founded in 1937 to tend the university’s Differential Analyzer. In May of 1946, Wilkes read von Neumann’s “First Draft on the EDVAC,” which he obtained from L. J. Comrie who had just visited the United States. Wilkes was invited to the Moore School 1946 summer lectures and returned home determined to build an electronic computer. … He called his machine the EDSAC (Electronic Delay Storage Automatic Calculator). …

EDSAC also led to the first computer to be used for commercial data processing. …

In spite of these great achievements, the future course of digital computers was largely determined in the United States. The machines built immediately after the ENIAC that set this course were the EDVAC (1952), the IAS machine of von Neumann (1951), the SEAC (1950) and SWAC (1950) of the Bureau of Standards, the ERA 1101 (1950), the UNIVAC (1951), and Whirlwind (1951). Although the EDVAC was the last of this group to become operational, its design was well known and had a profound influence on all post-ENIAC machines, as has been already noted.

The Institute for Advanced Study machine was funded by the Army through the efforts of von Neumann and Goldstine. Von Neumann’s great prestige and widespread contacts ensured that the IAS machine would be widely used. In fact, von Neumann’s original plan was to distribute design plans to a number of institutions as they were developed so that the other organizations could rapidly build copies. A number of copies, with some alterations and improvements, were actually built at various institutions, including the Rand Corporation, the Los Alamos National Laboratory, the Argonne National Laboratory, and the University of Illinois. All of these were paid for by the United States government. Von Neumann’s machines were not limited to the United States. Several versions were built abroad, including the SILLIAC in Australia.”

(Louis A. Girifalco, 1991, Van Nostrand Reinhold)

The Institute for Advanced Study’s list of historical computers influenced by von Neumann is longer, and includes international ones such as those in Stockholm, Moscow, Munich and Sydney:

“Differences over individual contributions and patents divided the group at the Moore School. In keeping with the spirit of academic enquiry, von Neumann was determined that advances be kept in the public domain. ECP progress reports were widely disseminated. As a consequence, the project had widespread influence. Copies of the IAS machine appeared nationally: AVIDAC at Argonne National Laboratory; ILLIAC at the University of Illinois; JOHNNIAC at RAND Corporation; MANIAC at Los Alamos Scientific Laboratory; ORACLE at Oak Ridge National Laboratory; ORDVAC at Aberdeen Proving Grounds; and internationally: BESK in Stockholm; BESM in Moscow; DASK in Denmark; PERM in Munich; SILLIAC in Sydney; and WEIZAC in Rehovoth, to mention a few. …”

(“Electronic Computer Project”, Institute for Advanced Study)

In the lists in the above two quotes, the U.S. institutions building computers, many following von Neumann’s design, included the National Bureau of Standards’ SWAC at UCLA, RAND’s JOHNNIAC, and the MANIAC at the Los Alamos national lab, but none at UC Berkeley or the nearby Lawrence Berkeley and Lawrence Livermore national labs.

With UCLA having a computer built by the National Bureau of Standards, wasn’t UC Berkeley in an obviously inferior position when von Neumann cast his eyes on moving to the University of California to focus on computer research and development?

Yes and no.

UC Berkeley was building CALDIC, the first computer built in the San Francisco Bay Area of Northern California and the first by a university on the West Coast, although compared to those in the Los Angeles region it was a modest one, credited mostly with educational training and completed only in 1954:

“In the immediate aftermath of World War II, the nascent West Coast computer industry was concentrated in Los Angeles. A number of Southern California aerospace firms received military funding to develop computers, many of which were meant to support aircraft design and research (Eckdahl, Reed, and Sarkissian, 2003; Norberg, 1976). The CALDIC (California Digital Computer) built at UC Berkeley in 1954 was the first computer developed in the Bay Area and the first computer developed at a West Coast university. Supported by the Office of Naval Research, in 1948 Professors Paul Morton (EE), Leland Cunningham (astronomy), and Richard Lehmer (mathematics) began building the CALDIC, which was meant to be an intermediate-size computer (Hoagland, 2010: 15). Like many university-developed computers during this period, the CALDIC was not commercialized, nor were any patents issued on the results of the work. Instead, the project’s main contribution to the local economy was the graduate students supported by the project, several of whom later became industry leaders.

For example, Albert Hoagland, Roy Houg, and Louis Stevens worked on CALDIC’s magnetic data storage system and on graduation joined the newly formed IBM research laboratory that had been established in San Jose in 1956 (Flamm, 1988: 20ff). … IBM’s San Jose Laboratory soon became a global center for digital magnetic storage innovation. … Another CALDIC PhD student, Douglas Engelbart, went to the Stanford Research Institute and developed some of the cornerstones of personal computing such as the mouse, “windowed” user interfaces, and hypertext (Bardini, 2000). Students trained through the CALDIC project thus emerged as leading industrial researchers in the Bay Area computer industry of the 1960s.”

(Martin Kenney and David C. Mowery, eds., Public Universities and Regional Growth: Insights from the University of California, 2014, Stanford Business Books)

But according to Douglas Engelbart, cited above as a Berkeley Ph.D. student in the CALDIC project, that computer wasn’t actually completed by the time of his Ph.D. graduation in 1955; Berkeley wasn’t receptive to a creative pioneer like him on its faculty, and he soon left:

“… After completing his B.S. in Electrical Engineering in 1948, he settled contentedly on the San Francisco peninsula as an electrical engineer at NACA Ames Laboratory (forerunner of NASA).

He began to envision people sitting in front of cathode-ray-tube displays, “flying around” in an information space where they could formulate and portray their thoughts in ways that could better harness their sensory, perceptual and cognitive capabilities which had heretofore gone untapped. And they would communicate and collectively organize their ideas with incredible speed and flexibility. So he applied to the graduate program in electrical engineering at the University of California, Berkeley, and off he went to launch his crusade. At that time, there was no computer science department and the closest working computer was probably on the eastern side of the country, with MIT’s Project Whirlwind. Berkeley did have a serious R&D program developing a general-purpose digital computer, the CalDiC, but it remained unfinished throughout his time there.

He obtained his Ph.D. in 1955, along with a half dozen patents in “bi-stable gaseous plasma digital devices,” and stayed on at Berkeley as an acting assistant professor. Within a year, however, he was tipped off by a colleague that if he kept talking about his “wild ideas” he’d be an acting assistant professor forever. So he ventured back down into what is now Silicon Valley, in search of more suitable employment.

He settled on a research position at Stanford Research Institute, now SRI International, in 1957. …”

(“A Lifetime Pursuit”, by Christina Engelbart, Doug Engelbart Institute)

Before settling on a career at Stanford Research Institute beginning in 1957, Engelbart also approached both Stanford University and Hewlett-Packard – where the future Silicon Valley had begun – and learned that neither had plans for computer development:

“Were you working with computers at Ames in 1951?

No! The closest computer was somewhere on the east coast.

I left Ames to attend the University of California and received a Ph.D., and taught there for a year. …

So I contacted the Dean of the School of Engineering at Stanford, just across the bay from Berkeley, and received a nice letter that said something like, “Dear Dr. Engelbart. Thank you for your interest in Stanford. Unfortunately, our School of Engineering is a small department, and we have chosen to focus only on those areas which we feel offer real potential. Since computers are only useful to service entities, we have no interest in developing a focus in them. Best of luck, etc.”

Wow – I can’t believe they were so myopic! It’s hard to believe that an institution so closely allied with the birth of Silicon Valley could have missed that one…

They weren’t alone! I also spoke with David Packard (of Hewlett-Packard). We had a great conversation, and I was all set to work for them. Then, as I was driving home from the interview, a question forced its way into my mind. About a quarter of the way home, I stopped and called the vice president of engineering at HP I was going to work for, and asked, “I assume HP is planning on going into digital instruments and digital computers, and I’ll get a chance to work in those areas, right?” And he replied that they didn’t think there was much potential there, so the answer was no.”

(“Doug Engelbart: Father of the Mouse”, by Andrew Maisel, editor-in-chief, SuperKids Educational Software Review)

Engelbart’s early experiences of discouragement were a remarkable contrast to his subsequent pioneering work and achievements, which later in 1997 won him the A. M. Turing Award – computer science’s highest honor as mentioned in Part 3:

“… He is not a computer scientist, but an engineer by training and an inventor by choice. His numerous technological innovations (including the computer mouse, hypertext, and the split screen interface) were crucial to the development of personal computing and the Internet. …”

(“DOUGLAS ENGELBART”, A. M. Turing Award, Association for Computing Machinery)

Amazingly, an engineer with pioneering ideas for computers was ignored by the fledgling Silicon Valley’s leading engineering centers, Hewlett-Packard, Stanford and Berkeley, yet went on to success, receiving the computer science community’s top honor.

During the 1950s in Southern California, particularly around Los Angeles, military-oriented aerospace technology companies were very active in computer research, receiving substantial funding from the U.S. military:

“One other pocket of activity, in historical hindsight, looms in importance as a transporter of computer technology from laboratory to market. Located on the West Coast of the United States and tied closely to the aerospace industry in Southern California, which, in turn, was very dependent on government contracts, this activity focused on scientific and engineering computing. The design of aircraft inherently required extensive mathematical calculations, as did applications such as missile guidance. Early efforts (late 1940s) were primarily housed at Northrop Aircraft and to a lesser extent at Raytheon. Both had projects funded by the U.S. government: Northrop for its Snark missile and Raytheon for a naval control processor, for example. Northrop worked with an instrument supplier (Hewlett-Packard) on early digital projects. Then, in 1950, a group of Northrop engineers formed their own computer company called Computer Research Corporation (CRC). Like ERA, it had a military sponsor, the U.S. Air Force, for which it built various computers in the first half of the 1950s.”

(James W. Cortada, The Computer in the United States: From Laboratory to Market, 1930 to 1960, 1993, M.E. Sharpe)

As quoted, while Southern California’s aerospace companies were building computers, Northern California’s Hewlett-Packard was only “an instrument supplier” for some of them.

In fact, when the future leading computer maker IBM first entered the electronic computer market in the early 1950s with its IBM 701 machine, there were very few buyers, and half of them were companies and organizations using the SWAC computer at the INA at UCLA:

“By 1951 the demands for computational assistance were so great that it was difficult for the Computational Unit to fulfill its obligations. Accordingly, a new unit of INA, called the Mathematical Services Unit, was formed under the supervision of Harry Huskey. It was funded by the United States Air Force. One of the purposes of this unit was to encourage Federal Government contractors to learn how to use electronic computers. Accordingly, computational services using SWAC were made available to them. Many of these contractors made use of this service. Effectively, the NBS offer to these contractors was to augment their contracts by providing free computational services of a type which was not as yet available elsewhere. It is interesting to note that when IBM announced that they would build a “Defense Calculator” if they get 12 orders, 6 of the 12 orders came from INA’s computer customers. The “Defense Calculator” became the IBM 701 – their entry into the “Electronic Computer Age.””

(Magnus R. Hestenes and John Todd, Mathematicians Learning to Use Computers: The Institute for Numerical Analysis UCLA 1947-1954, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)

So in 1956 when von Neumann was planning to move to California to focus on computer research, the future Silicon Valley region in Northern California, be it academically or industrially in the computer field, paled in comparison to the Los Angeles region in Southern California.

Moreover, von Neumann’s role as the U.S. Air Force’s leading scientific adviser on ICBMs helped bring additional military aerospace presence to the Los Angeles region:

“In July 1955, along with von Neumann and others, Schriever had an audience with President Eisenhower in the West Wing. He explained not only the paramount importance of ICBMs and the “radical” new organization he had established near Los Angeles to develop them, but also why he had not handed the project over to commercial aircraft contractors…

“Most impressive!” Ike declared. … Eisenhower secretly ordered the Pentagon to build ICBMs with “maximum urgency.” That same summer, Schriever learned from intelligence sources how little time they had: the Soviets were already testing ­intermediate-range ballistic missiles.”

(Michael Beschloss, October 1, 2009, The New York Times)

But von Neumann was also a great mathematician, and so the specifics of mathematical computing and numerical analysis could also have been factors in his choice between UCLA and UC Berkeley.

UC Berkeley had mathematics professor Derrick H. Lehmer, cited in an earlier quote as one of the leaders of the CALDIC computer project.

I know that might not sound like much to someone of von Neumann’s ambitions. But recall, as previously quoted in Part 4, that Berkeley math professor Derrick H. Lehmer became the director of UCLA’s Institute for Numerical Analysis at the time of the Loyalty Oath controversy in the early 1950s, when he joined 3 other Berkeley math professors, among 29 tenured Berkeley professors and 2 at UCLA, in expressing objection to McCarthyism:

“Wanting to show proof of loyalty, Robert Gordon Sproul, then President of the University of California, proposed the Loyalty Oath which would have all professors declare they were not and never had been communists.

Some 29 tenured professors from UC Berkeley and two from UCLA (one of whom later became a UC President) refused to sign. …

The Regents of the time mandated that all professors had to sign, or be fired. In the Mathematics Department, three professors refused: John Kelley, Hans Lewy, and Pauline Sperry. Another professor, D.H. Lehmer, attempted to avoid signing by taking a leave of absence to take a federal job at UCLA as Director at the Institute for Numerical Analysis. …”

(“Loyalty Oath Controversy: Interview with Leon Henkin”, Fall 2000, Vol. VII, No. 1, Berkeley Mathematics Newsletter)

Derrick Lehmer’s former directorship at the INA at UCLA was a strong credential in the computing field and thus an encouraging factor; John von Neumann himself had been a distinguished visitor at INA:

“INA attracted many distinguished visitors such as, John von Neumann, Solomon Lefschetz, Edward Teller, Norbert Wiener, and many others, …”

(Magnus R. Hestenes and John Todd, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)

But I have to concede that the contrast of 29 Berkeley professors refusing to sign the loyalty oath against UCLA’s mere 2 refusals, unfortunately, ran counter to von Neumann’s Cold War strategizing at RAND and his close collaboration with nuclear weapons scientists like Edward Teller, also a distinguished visitor at INA as cited above.

Lehmer returned to Berkeley in 1953, and in 1954 the National Bureau of Standards ended the Institute for Numerical Analysis because of a U.S. Department of Defense decision, leaving the SWAC computer to UCLA:

“THE PERIOD SUMMER 1953 THROUGH SPRING 1954

D. H. Lehmer returned to the University of California at Berkeley in August and C. B. Tompkins took over the Directorship of INA. …

The National Bureau of Standards was a co-sponsor with the American Mathematical Society of a Symposium on Numerical Analysis held at Santa Monica City College, August 26-28. John H. Curtiss was the chairman of the organizing committee. The symposium was entitled “American Mathematical Society Sixth Symposium in Applied Mathematics: Numerical Analysis.” …

… A large number of the participants had been associated with NBS and INA as visiting scientists. NBS and INA were represented by C. B. Tompkins, Olga Taussky-Todd, Emma Lehmer, M. R. Hestenes, T. S. Motzkin, and W. R. Wasow. …

David Saxon returned to his position in the Department of Physics at UCLA. He had a distinguished career in research and in administration. In 1975 he became President of the University of California. In 1983 he was appointed Chairman of the Corporation of Massachusetts Institute of Technology.

We now begin the final period of existence of INA. …

The decision of Secretary of Defense, Charles E. Wilson, to no longer permit a non-DOD Government agency to serve as administrator of projects carried out at a university but supported entirely, or in large part, by DOD funds, caused the National Bureau of Standards to give up its administration of INA by June 30, 1954. The University of California was invited to take over this administration. The university was not in a position to take over all sections of INA. However, UCLA agreed to take over the administration of the research group, the SWAC and its maintenance, and the Library. … The question of faculty status of INA members was to be dealt with after the takeover had been accomplished. The new organization was to be called Numerical Analysis Research (NAR). …”

(Magnus R. Hestenes and John Todd, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)

As quoted, the INA at UCLA had been under the administration of the National Bureau of Standards; then U.S. Secretary of Defense Charles E. Wilson decided that a non-defense government agency should no longer manage projects with substantial defense funding, and so the INA, as administered by the NBS and funded by the ONR, had to end.

Nevertheless, I note that the physicist David Saxon mentioned above was one of the only 2 UCLA professors who refused to sign the McCarthy-era loyalty oath, one of whom, as quoted earlier, later became a UC president: Saxon was that future president of the University of California, which encompasses UCLA, UC Berkeley and many other campuses.

(“David S. Saxon, 85; Physicist Forced Out in McCarthy Era Later Led UC System in a Time of Tight Budgets”, by Elaine Woo, December 9, 2005, Los Angeles Times; and, “IN MEMORIAM: David Stephen Saxon: President Emeritus University of California: 1920 – 2005”, by Richard C. Atkinson, The University of California)

And I am struck by the contrast between the title of the above-quoted 1991 book by Magnus R. Hestenes and John Todd on the history of NBS’s INA at UCLA in 1947-1954, “Mathematicians Learning to Use Computers”, and the title of a New York Times article discussed in Part 4 about the 1986 International Congress of Mathematicians held at UC Berkeley, which featured my Ph.D. adviser Stephen Smale as the leading plenary speaker: “MATHEMATICIANS FINALLY LOG ON”.

Mathematicians were “learning to use computers” in the 1940s and 1950s, and yet it took them until the 1980s to “finally log on”!

A timeline, in the context of the facts reviewed earlier, seemed to be: in the late 1940s and early 1950s mathematicians were “learning to use computers” at the Institute for Numerical Analysis run by the National Bureau of Standards at UCLA, until 1954 when the institute was terminated by a Pentagon decision; in 1956 John von Neumann, a leading mathematician and the “father of computers”, was deciding which UC campus to move to for computer research and chose UCLA, which no longer held a dominant lead over UC Berkeley but was near his Cold War think tank RAND, which had also followed his computer design; in 1957 von Neumann died of cancer at the early age of 53; and then it took another 3 decades for mathematicians to “finally log on” to computers.

I know my timeline appears to stretch the facts, namely in suggesting that the death of one mathematician, however great von Neumann was, could have had such a devastating impact on the history of mathematicians’ acquaintance with computers.

But there were other intriguing and tell-tale facts hinting at a similar timeline.

Within a few years of the world’s leading gathering of mathematicians in 1986 at Berkeley, as mentioned in Part 4, several Berkeley mathematicians in the computational fields who influenced my study there, Richard Karp, Stephen Smale, Andrew Majda, William Kahan, and also the University of Wisconsin’s Carl de Boor, all mentioned in Part 4, received the John von Neumann Lecture honor of the Society for Industrial and Applied Mathematics:

“The John von Neumann lecturers:

  • 1986 Jacques-Louis Lions
  • 1987 Richard M. Karp
  • 1988 Germund G. Dahlquist
  • 1989 Stephen Smale
  • 1990 Andrew J. Majda
  • 1992 R. Tyrrell Rockafellar
  • 1994 Martin D. Kruskal
  • 1996 Carl de Boor
  • 1997 William (Velvel) Kahan

…”

(“The John von Neumann Lecture”, Society for Industrial and Applied Mathematics)

Not only did these Berkeley professors receive the John von Neumann Lecture honor starting in 1987, but, as far as I know, none of the prior recipients in SIAM’s list cited above, which starts from 1960 and includes mathematicians, physicists and other scientists, had been from Berkeley.

So, just as The New York Times said mathematicians “finally log on” at the 1986 ICM held at UC Berkeley, the UC Berkeley mathematicians whose work facilitated it finally received an honor named for John von Neumann; now imagine if von Neumann himself had been living and leading in the missing decades!

Recall an anecdote told in Part 4, that in the fall of 1983, when I was contemplating doing Ph.D. study with the mathematician Smale or the numerical analyst Andrew Majda, Majda commented to me that Smale “knows nothing about numerical analysis”; I note here that Majda had in fact been a professor at UCLA before Berkeley, and was moving to Princeton, where von Neumann had famously worked.

But despite Majda’s opinion, SIAM awarded Smale the John von Neumann Lecture honor one year before it honored Majda, illustrating that the contributions to the computational fields by more purely math-inclined mathematicians like Stephen Smale were appreciated by the applied mathematics community.

Back in August 1953, when the National Bureau of Standards and the Institute for Numerical Analysis hosted in Santa Monica an event titled “American Mathematical Society Sixth Symposium in Applied Mathematics: Numerical Analysis”, as earlier quoted from the book by Hestenes and Todd, it was a first for computing: the AMS Symposia in Applied Mathematics had begun in the late 1940s, and this 6th symposium was the first devoted to computational issues.

(“Proceedings of Symposia in Applied Mathematics”, AMS eBook Collections, American Mathematical Society)

In the summer of 1953 John von Neumann was in fact the outgoing AMS president, serving 1951-1953. At the time, von Neumann’s nuclear science expertise was giving him increasingly prominent national responsibilities: he became a General Advisory Committee member of the U.S. Atomic Energy Commission in 1952, and a member of the Technical Advisory Panel on Atomic Energy in 1953.

(Herman H. Goldstine, The Computer from Pascal to von Neumann, 1972, Princeton University Press)

The INA’s closure in 1954 was a major setback to the mathematical computing field, but John von Neumann was moving to the top in the nuclear arena, appointed a U.S. Atomic Energy Commissioner by President Dwight Eisenhower; he was on that job for only 6 months before a diagnosis that he had cancer:

“In October 1954 Eisenhower appointed Von Neumann to the Atomic Energy Commission. Von Neumann accepted, although the Air Force and the senators who confirmed him insisted that he retain his chairmanship of the Air Force ballistic missile panel.

Von Neumann had been on the new job only six months when the pain first struck in the left shoulder. After two examinations, the physicians at Bethesda Naval Hospital suspected cancer. Within a month Von Neumann was wheeled into surgery at the New England Deaconess Hospital in Boston. A leading pathologist, Dr. Shields Warren, examined the biopsy tissue and confirmed that the pain was a secondary cancer. Doctors began to race to discover the primary locations. Several weeks later they found it in the prostate. Von Neumann, they agreed, did not have long to live.”

(Clay Blair, Jr., February 25, 1957, Volume 42, Number 8, Life)

When the INA was closing, most of its scientists and engineers left for jobs elsewhere, including in industry and at RAND, while a few went to UC faculty jobs:

“At this time, a Numerical Analysis section was set up at NBS in Washington with John Todd as Chief and with, on a smaller scale, a mission similar to that of INA. …

The engineers resigned on November 1, 1953 and accepted positions with the Magnavox Corporation. …

By June 30, 1954, various members of INA had accepted positions in industry and in various departments of universities. For example, B. Handy, A. D. Hestenes, M. Howard, and E. C. Yowell were employed by National Cash Register. S. Marks and A. Rosenthal went to the Rand Corporation. …

During his leave of absence from INA, Lanczos was employed by North American Aviation as a specialist in computing. In 1954 at the invitation of Eamon de Valera, who was at that time Prime Minister of the Republic of Ireland, Lanczos accepted the post of Senior Professor in the School of Theoretical Physics of the Dublin Institute for Advanced Studies. …

In 1954 Harry Huskey accepted a faculty position at UC-Berkeley, where he continued to make significant contributions in the computer field. In 1967 he moved to UC-Santa Cruz to serve as Professor of Computer and Information Science. There he set up the UCSC Computer Center and served as its Director from 1967-1977. Internationally, he was in great demand as a consultant to various computer centers, e.g., centers in India, Pakistan, Burma, Brazil, and Jordan. …

Charles B. Tompkins became a member of the Department of Mathematics at UCLA. He was in charge of the NAR Project … He continued to make the computing facility available to all interested faculty and students. …”

(Magnus R. Hestenes and John Todd, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)

The above quote notes that the National Bureau of Standards had not had a numerical analysis section in Washington until its INA in Los Angeles was shutting down.

Another key fact cited above was that Harry Huskey, quoted earlier as leader of the SWAC computer development and leader of computer training for U.S. government contractors, moved to UC Berkeley in 1954.

Huskey had been an original member of the first electronic computer project, the ENIAC, and met von Neumann there, although, as he later recalled, he and his fellow ENIAC engineers did not have a high opinion of von Neumann because the latter did not pay attention to details:

“… I heard that there were projects at the electrical engineering department of the Moore School at the university, and I applied for part time work. Since their work was classified they couldn’t tell me what they were doing, so I had no idea what I would be doing. When finally clearance came I was shown the ENIAC and I’ve worked in computers ever since.

… The von Neumann report was not helpful, in my opinion. So I think the answer– well, we had general meetings in which von Neumann participated. And I think the people who were actually working on the project took the feeling that, “Well, he doesn’t worry about the details. He waves his hand.” That sort of position. …”

(“Oral History of Harry Huskey”, interview by William Aspray, February 7, 2006, Computer History Museum)

So later in 1956, when von Neumann was weighing whether to move to UCLA or UC Berkeley, a key computer development leader who had moved from the INA at UCLA to Berkeley was not so positive about him.

In 1954 it was Derrick Lehmer, together with Paul Morton, the two cited earlier as leaders of the CALDIC computer project, who offered Huskey his Berkeley professorship in both the mathematics and electrical engineering departments:

“The fact that INA was a project under the Bureau of Standards caused it to be terminated as a Bureau project. The SWAC computer was given to the Engineering Department of UCLA, and the mathematical research part of INA was set up as a project in the Math Department of UCLA. And so that ended that phase of things. I had gone on leave to Wayne University with Jacobson, with the charter to set up a computer center there, and so I spent the year working on that, and when I came back to the Bureau, all this other stuff had happened. So the question was, what is the future? And at that point, Lehmer and Paul Morton at Berkeley offered me a position, so I took that. It was an associate professorship.

It was half math and half EE, and so on July 1st of that year, I moved to Berkeley. That’s about the whole story.”

(interview by William Aspray, February 7, 2006, Computer History Museum)

As mentioned earlier, Berkeley’s CALDIC computer project was reportedly completed in 1954, yet Douglas Engelbart later said it was still unfinished when he graduated in 1955. So it is possible that Huskey’s arrival helped finish it.

Like Huskey’s, Lehmer’s association with the computer and with von Neumann had come early: in 1945-1946 Lehmer was a member of the Computations Committee planning ENIAC’s use at the U.S. Army’s Ballistic Research Laboratory at the Aberdeen Proving Ground:

“… A Computations Committee had been established in 1945 to provide a group of experts to plan for the arrival of ENIAC at the BRL and to see that it would be applied productively. The members of the Computations Committee included the mathematicians Haskell Curry, Franz Alt, and Derrick Lehmer and the astronomer Leland Cunningham. All of them had come to Aberdeen during the war to assist with the BRL’s computational work, and they retained a connection with the lab for several years afterward—some as employees, others as frequent visitors. …”

(Thomas Haigh, Mark Priestley and Crispin Rope, ENIAC in Action: Making and Remaking the Modern Computer, 2016, The MIT Press)

The Aberdeen Proving Ground, as previously mentioned in a quote in Part 4, was a military research facility where Berkeley math professor Hans Lewy had worked during World War II; Lewy was later one of Lehmer’s fellow objectors to the McCarthy-era UC loyalty oath.

Lehmer had strong historical credentials for overseeing ENIAC computing; he had been a pioneer in building electro-mechanical computing devices:

“Lehmer made contributions to many parts of number theory, but he was especially interested in relevant numerical calculations. He was unsurpassed in this field. …

While still an undergraduate, Lehmer realized that it would be helpful to have a mechanical device for combining linear congruences, and at various times, he supervised the construction of several such machines. These special-purpose computers, known as sieves, were particularly useful in factoring large numbers. The first model, constructed in 1927, used 19 bicycle chains. …

In 1932, an improved sieve was constructed and displayed at the 1933 World’s Fair in Chicago. Here, instead of bicycle chains, disk gears with various numbers of teeth were used, with holes opposite each tooth. For a given problem, the unwanted holes were plugged, and a photoelectric cell was used to stop the machine when open holes were lined up. …

Lehmer was a pioneer in the development of modern computing machines and in their use in the solution of scientific problems, particularly those arising in number theory. In 1945-46 he was called to the Ballistic Research Laboratory of the Aberdeen Proving Ground to prepare that laboratory for the installation of the ENIAC, the first general-purpose electronic computer. He observed the completion of that computer in Philadelphia and took part in its testing in Aberdeen. …”

(“Derrick H. Lehmer, Mathematics: Berkeley: 1905-1991 Professor Emeritus”, John L. Kelley, Raphael M. Robinson, Abraham H. Taub, and P. Emery Thomas, 1991, University of California)

I note that the Lehmer sieves built in the 1920s and 1930s, as explained above, could only do specific mathematical computations, unlike the later ENIAC, the first “general-purpose” electronic computer.
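To make the sieve idea concrete, here is a minimal software analogue of my own devising; the function and the example congruence conditions are illustrative assumptions, not Lehmer’s actual problems. Each “wheel” is a modulus paired with its set of open holes, i.e., acceptable residues, and a number is reported when every wheel lines up on an open hole.

```python
# A minimal software analogue of a Lehmer sieve (illustrative sketch).
# Each "wheel" is a modulus paired with its set of open holes
# (acceptable residues); a number passes when every wheel lines up.
def lehmer_sieve(wheels, limit):
    """wheels: dict mapping modulus -> set of acceptable residues."""
    return [n for n in range(limit)
            if all(n % m in residues for m, residues in wheels.items())]

# Hypothetical example: numbers below 50 that are quadratic residues
# mod 3, 5 and 7 at once, the kind of simultaneous congruence
# condition useful when factoring large numbers.
wheels = {m: {n * n % m for n in range(m)} for m in (3, 5, 7)}
hits = lehmer_sieve(wheels, 50)
```

In the physical machines, the bicycle chains and disk gears played the role of the `wheels` dictionary, and the photoelectric cell played the role of the `all(...)` test that stops the machine when open holes line up.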

As a mathematician, Lehmer’s passion was computation for number theory, even on ENIAC, as in the following anecdote of a July 4 holiday weekend that he and his wife Emma chose to spend on the machine:

“Lehmer’s Holiday Computations

Another well-documented calculation from 1946 was carried out by the Berkeley number theorist Derrick Lehmer. Lehmer spent the year 1945-46 at the Ballistic Research Lab as a member of a group helping to plan for ENIAC’s use. He experimented with the machine by running “little problems” when it was otherwise not in use. …

As Derrick Lehmer later recounted, he and his family descended on ENIAC over the July 4 weekend, a weekend during which very little work is done in the United States. (Lehmer’s wife, Emma, was a noted mathematician who did much of the computational work required to get ENIAC’s output from this visit into publishable form.) With help from John Mauchly, they were allowed to “pull everything off the machine” and set up their own problem.

Lehmer credited Mauchly with the idea of implementing a sieve on ENIAC. Lehmer’s program, as partially reconstructed by the historians Maarten Bullynck and Liesbeth de Mol, made use of ENIAC’s ability to perform several parts of a computation at once. In the reconstruction, fourteen accumulators were used to simultaneously test a single number against different prime numbers. Lehmer’s paper does not provide enough information to make it certain that his original implementation exploited that technique, but in discussing the computation he later complained that ENIAC “was a highly parallel machine, before von Neumann spoiled it.” …”

(Thomas Haigh, Mark Priestley and Crispin Rope, 2016, The MIT Press)
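The hardware flavor of that reconstruction, counters stepping in lockstep rather than numbers being divided, can be sketched as follows. This is my own loose analogy, not the reconstructed ENIAC program itself, and the function name and hole sets are assumptions for illustration.

```python
# Loose analogy of the reconstructed approach: one counter ("accumulator")
# per prime advances in lockstep with the candidate number, so each step
# tests the candidate against all primes at once, without any division.
def stepping_sieve(primes, open_holes, limit):
    counters = {p: 0 for p in primes}  # invariant: counters[p] == n mod p
    hits = []
    for n in range(limit):
        if all(counters[p] in open_holes[p] for p in primes):
            hits.append(n)
        for p in primes:               # each counter wraps around at p
            counters[p] = (counters[p] + 1) % p
    return hits
```

The result is identical to checking `n % p` directly for each prime; the point of the stepping counters is that they need only addition and comparison, which suited both the mechanical sieves and ENIAC’s accumulators working in parallel.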

While doing research in parallel computation beginning in the mid-to-late 1980s, I became familiar with the term “the von Neumann bottleneck”, coined by IBM computer scientist John Backus and related to Lehmer’s complaint in the above quote:

“… What is a von Neumann computer? When von Neumann and others conceived it over thirty years ago, it was an elegant, practical, and unifying idea that simplified a number of engineering and programming problems that existed then. Although the conditions that produced its architecture have changed radically, we nevertheless still identify the notion of “computer” with this thirty year old concept.

In its simplest form a von Neumann computer has three parts: a central processing unit (or CPU), a store, and a connecting tube that can transmit a single word between the CPU and the store (and send an address to the store). I propose to call this tube the von Neumann bottleneck. …

… Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. …”

(“Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs”, by John Backus, 1977 ACM Turing Award Lecture, Association for Computing Machinery)
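Backus’s “word-at-a-time thinking” can be caricatured in a short sketch of my own, not in Backus’s FP notation: the first function below shuttles one word of each operand at a time through the “tube” between store and CPU, while the second states the same inner product over the operands as whole values.

```python
# Word-at-a-time thinking: each loop step pulls one word of each operand
# through the "tube" between store and CPU and updates a running total.
def inner_product_word_at_a_time(a, b):
    total = 0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

# Whole-value thinking: the inner product stated as one expression over
# the operands as units, closer in spirit to Backus's functional style.
def inner_product_bulk(a, b):
    return sum(x * y for x, y in zip(a, b))
```

Both compute the same number; the difference Backus emphasized is conceptual, in how the architecture’s one-word tube leads the programmer to think.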

Whatever the limitations of the von Neumann computer design, back in the 1940s and 1950s the kind of mathematical delicacies Lehmer enjoyed on the early computers was the opposite of the norm; the norm was serious military research led by von Neumann, especially for atomic bomb development:

“But ENIAC’s more profound contributions to advances in military science and technology came with Cold War work that would have been prohibitively expensive to attempt by hand. ENIAC simulated explosions of atomic and hydrogen bombs, airflow at supersonic speeds, and designs for nuclear reactors. With the considerable assistance of John von Neumann, it established the digital computer as a vital tool within the emerging military-industrial-academic complex carrying out cutting-edge research and development work during the early years of the Cold War. A few years later, IBM launched its first commercial computer, the Model 701, as the “defense calculator” and sold it almost exclusively to defense contractors. The United States Government even managed the delivery queue for IBM, making sure that computers were dispatched first to the firms doing the most important work.”

(Thomas Haigh, Mark Priestley and Crispin Rope, 2016, The MIT Press)

As the above quote indicates, in the 1940s-1950s military priorities ranked highest among the “cutting-edge research and development work”, and the allocation of computer use was centrally managed by the U.S. government, be it for academic, industrial or commercial usage.

From this perspective, one could call it U.S. government generosity that for 7 years, 1947-1954, mathematicians got to go to the Institute for Numerical Analysis at UCLA to learn to use the SWAC computer run by the National Bureau of Standards, in 2 of those years even under the directorship of Derrick Lehmer, an objector to the UC Loyalty Oath, before Secretary of Defense Charles Wilson pulled the plug in 1954.

The historian of science Liesbeth De Mol has done a comparison showing the contrasting mathematical focuses of Lehmer and von Neumann, i.e., Lehmer’s pure mathematics interests versus von Neumann’s applied mathematics ambitions.

De Mol wrote of Derrick Lehmer the mathematician:

“Derrick H. Lehmer (1905-1991) was born into number theory. His father, Derrick N. Lehmer, was a number-theorist, known for his factor table up to 10,000,000 and his stencil sheets to find factors of large numbers. …

Throughout Lehmer’s papers one finds numerous statements about the experimental character of mathematics and more specifically number theory, which he regarded as a kind of observational science. It is exactly in this context that one should understand Lehmer’s interest in computers. He regarded them as instruments to experimentally study mathematics … Already as a young boy, Lehmer began to design and build small special-purpose machines, known as sieves, to assist him in his number-theoretical work.

When World War II began, Lehmer “got involved into war work mostly having to do with the analysis of bombing…”. He built a special-purpose machine, a “bombing analyzer [which] was a combination of the digital and the analog device. […] I demonstrated it in Washington one time at the Pentagon. […] This thing was Army Ordnance, I guess. …”. Just after the war Lehmer was called upon by the Ballistic Research Laboratories (Aberdeen Proving Ground) to become a member of the ‘Computations Committee’, which was assembled to prepare for utilizing the ENIAC after its completion …”

(“Doing Mathematics on the ENIAC. Von Neumann’s and Lehmer’s different visions”, by Liesbeth De Mol, in E. Wilhelmus, I. Witzke (eds.), Mathematical practice and development throughout History, 2008, Logos Verlag)

In short, Lehmer was an experimentally and computationally oriented pure mathematician, who also proved his abilities in his wartime work for the military.

De Mol wrote of John von Neumann the mathematician:

“John von Neumann is far more famous than D.H. Lehmer, not in the least because the hardware of computers nowadays is still referred to as ‘the von Neumann architecture’. He was a mathematician by education and made major contributions in many different fields, including: mathematical logic, set theory, economics and game theory, quantum mechanics, hydrodynamics, computer science,…

Von Neumann’s acquaintance with the field of mathematical logic had a major influence on his work on computers. …

It was not his interest in logic, however, that triggered his interest in the subject. … Ulam explains why von Neumann got interested in computers …:

It must have been in 1938 that I first had discussions with von Neumann about problems in mathematical physics, and the first I remember were when he was very curious about the problem of mathematical treatment of turbulence in hydrodynamics. […] He was fascinated by the role of Reynolds number, a dimensionless number, a pure number because it is the ratio of two forces, the inertial one and the viscous […] [von Neumann] […] wanted to find an explanation or at least a way to understand this very puzzling large number. […] I remember that in our discussions von Neumann realized that the known analytical methods, the method of mathematical analysis, even in their most advanced forms, were not powerful enough to give any hope of obtaining solutions in closed form. This was perhaps one of the origins of his desire to try to devise methods of very fast numerical computations, a more humble way of proceeding. Proceeding by “brute force” is considered by some to be more lowbrow. […] I remember also discussions about the possibilities of predicting the weather at first only locally, and soon after that, about how to calculate the circulation of meteorological phenomena around the globe.

Von Neumann got particularly interested in computers for doing numerical calculations in the context of theoretical physics and thus understood, quite early, that fast computing machines could be very useful in the context of applied mathematics.”

(Liesbeth De Mol, in E. Wilhelmus, I. Witzke (eds.), Mathematical practice and development throughout History, 2008, Logos Verlag)

As De Mol described, von Neumann was trained as a pure mathematician but was, more importantly, an applied mathematician ambitious for real-world applications.

It was von Neumann’s applied-mathematics research in fluid dynamics that led to his participation in the development of the atomic bomb, work for which he began searching for available computing power, actively surveying the state-of-the-art calculating machines of the day, as De Mol described:

“In 1943, during World War II, von Neumann was invited to join the Manhattan project – the project to develop the atomic bomb – because of his work on fluid dynamics. He soon realized that the problems he was working on involved a lot of computational work which might take years to complete. He submitted a request for help, and in 1944 he was presented a list of people he could visit. He visited Howard Aiken and saw his Harvard Mark I (ASCC) calculator. He knew about the electromechanical relay computers of George Stibitz, and about the work by Jan Schilt at the Watson Scientific Computing Laboratory at Columbia University. These machines however were still relatively slow to solve the problems von Neumann was working on. …”

(Liesbeth De Mol, in E. Wilhelmus, I. Witzke (eds.), Mathematical practice and development throughout History, 2008, Logos Verlag)

So, even taking into account that the INA’s 1954 closure had reduced UCLA’s strength in computing before von Neumann decided in 1956 between UCLA and Berkeley, the two former INA computational mathematicians who had returned or moved to Berkeley, Lehmer and Huskey, were not compatible types for von Neumann.

In the spirit of my review in Part 4 of some Berkeley mathematicians, I would like to view the contrast between Lehmer and von Neumann – articulated by Liesbeth De Mol – as an older-generation precursor to the contrast between Stephen Smale and Alexander Chorin in the 1970s and 1980s. Smale’s anti-war politics was more outspoken and higher-profile than Lehmer’s expression of objection to McCarthyism, whereas Chorin – instrumental in forming a faculty group in numerical analysis specializing in fluid dynamics, affiliated with the Lawrence Berkeley national lab and funded by military research agencies – was probably not quite of von Neumann’s caliber.

Chorin’s former Ph.D. adviser Peter Lax, of New York University’s Courant Institute of Mathematical Sciences, had in fact become a protégé of von Neumann’s while still a teenager before university; later, during the Manhattan Project, Lax worked at the Los Alamos national lab, where through von Neumann’s introduction he got his start in the subject of fluid dynamics shock waves – the subject Andrew Majda introduced me to in the fall of 1983, as in Part 4.

(“NYU’s Peter Lax Wins ‘Nobel Prize of Mathematics’”, by Gary Shapiro, March 23, 2005, The New York Sun)

While the Polish-born Jewish Chorin, as an incoming NYU Ph.D. student, was initially mistaken by his adviser Peter Lax for a Hungarian compatriot – a tale told in Part 4 – von Neumann had been the unmistakable Hungarian Jewish genius, the only Hungarian genius according to Nobel laureate Eugene Wigner:

“… Five of Hungary’s six Nobel Prize winners were Jews born between 1875 and 1905, and one was asked why Hungary in his generation had brought forth so many geniuses. Nobel laureate Wigner replied that he did not understand the question. Hungary in that time had produced only one genius, Johnny von Neumann.”

(Norman MacRae, John Von Neumann: The Scientific Genius Who Pioneered the Modern Computer, Game Theory, Nuclear Deterrence, and Much More, 1992, Pantheon Books)

As my review so far has shown, when von Neumann planned a move to California in 1956, if his intent was to focus on computer research as described in Norman MacRae’s book, rather than on nuclear science, then the Los Angeles region around UCLA was stronger in that respect, and more conducive to his interests politically, industrially and academically, than UC Berkeley and the nascent Silicon Valley.

But 1956, tantalizingly, was also the year when things began to happen in favor of the future Silicon Valley.

One of those happenings was IBM’s establishment of a research laboratory in San Jose, the future Silicon Valley’s largest city, as quoted earlier; several Berkeley CALDIC computer project students started their industry-leading careers there, working on digital magnetic storage systems.

I understand that such computer peripherals might not be much for a prominent computer pioneer and ambitious scientific leader like John von Neumann. But IBM had held, and would continue to hold, von Neumann in high regard.

Following von Neumann’s death, Herman Goldstine, his former collaborator on the ENIAC project and his deputy on the Princeton IAS computer project, became the founding director of the Mathematical Sciences Department at IBM’s central research organization, the Thomas J. Watson Research Center in New York state:

“… long before the Eniac was running it was obvious it had several major design defects. The gargantuan machine, weighing 30 tons and containing 18,000 electronic tubes, took several days to program and could store just 20 numbers. A study group for an improved machine, to be called the Edvac (Electronic Discrete Variable Automatic Computer), was established, consisting of Goldstine, Mauchly, J. Presper Eckert (Eniac’s principal engineer) and Arthur Burks (a mathematical logician). The group was shortly joined by John von Neumann.

In June 1945, von Neumann wrote the seminal Edvac Report, whose wide circulation established the new computing paradigm and ultimately the worldwide computer industry. Von Neumann’s sole authorship of the report, and his towering reputation as America’s leading mathematician, completely overshadowed the contributions of the others in the group, causing deep resentment in Eckert and Mauchly.

At the end of the war the group broke up because of these tensions. Eckert and Mauchly formed the computer company that eventually became today’s Unisys Corporation, while von Neumann, Goldstine and Burks went to the Institute for Advanced Study (IAS), Princeton University, to build a computer in an academic setting. Goldstine was appointed assistant director of the computer project, and director from 1954. In addition he co-wrote with von Neumann a set of reports, Planning and Coding of Problems for an Electronic Computing Instrument (1947) that established many early ideas in computer programming.

The IAS computer was an important design influence on the early computers of IBM, for whom von Neumann was a consultant. In 1958, following von Neumann’s death and the termination of the IAS computer project, Goldstine became the founding director of the Mathematical Sciences Department at IBM’s Watson Research Center in Yorktown Heights, New York.”

(“Herman Goldstine: Co-inventor of the modern computer and historian of its development”, by Martin Campbell-Kelly, July 4, 2004, The Independent)

As described, the leading architects of the original electronic computer ENIAC, Presper Eckert and John Mauchly, subsequently took an entrepreneurial route, forming a commercial company to further computer development, whereas von Neumann led Goldstine and several others in starting the IAS computer project at the Institute for Advanced Study in Princeton – a project that not only led to the proliferation of computer development in academic and scientific institutions, as discussed earlier, but also had an important design influence on IBM computers, with von Neumann himself a consultant for IBM.
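The stored-program paradigm that the Edvac Report codified – keeping instructions and data in one shared memory, so a machine can be re-programmed without rewiring – can be illustrated with a toy interpreter. The opcodes and memory layout below are invented for illustration only, not any historical instruction set:

```python
# A toy stored-program machine: instructions and data share one memory,
# so a program is loaded (or even modified) like any other data.
# Each instruction is an invented (opcode, operand_address) pair
# occupying one memory cell.

def run(memory):
    acc = 0   # single accumulator register
    pc = 0    # program counter: an address into the SAME memory
    while True:
        op, addr = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[addr]        # fetch a data word
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc        # write a data word back
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; data occupies cells 4-6.
memory = [
    ("LOAD", 4),    # cell 0: acc <- memory[4]
    ("ADD", 5),     # cell 1: acc <- acc + memory[5]
    ("STORE", 6),   # cell 2: memory[6] <- acc
    ("HALT", 0),    # cell 3: stop
    20, 22, 0,      # cells 4-6: data
]
print(run(memory)[6])  # prints 42
```

Because the program sits in the same memory as its data, loading a new program is just writing new values into memory – the flexibility that made plugboard rewiring unnecessary, as the passage above describes.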

Von Neumann had not been a founding member of the ENIAC project; it was Goldstine who had started the project on behalf of the U.S. Army, and then invited von Neumann’s participation in 1944:

“While there are challengers for the title of “first computer,” the dedication of ENIAC on Feb. 15, 1946, is widely accepted as the day the Information Age began. And like the Declaration of Independence in Philadelphia 170 years before, it declared a revolution.

Dr. Goldstine — now 82 and executive officer of the American Philosophical Society in Philadelphia — recalls arriving at Aberdeen in 1942 as a newly commissioned lieutenant in the Army Air Corps. He had just been pulled out of his squadron when the Army realized that it had better uses for a Ph.D. mathematician from the University of Chicago.

At Aberdeen, Lieutenant Goldstine was given the mission of speeding up the calculation of firing tables needed for accurate artillery and the charts needed for bombing runs. At the time, the necessary math was done by a group of young women using mechanical desk calculators. The system wasn’t working.

In the process of consulting with university experts, Lieutenant Goldstine met a 32-year-old physicist named John W. Mauchly, who outlined his idea for an all-electronic digital computer that could perform computations 1,000 times faster than a human.

Lieutenant Goldstine was intrigued, so he took the idea back to his boss, Lt. Col. Paul Gillon. He gave the project both his approval and its name — Electronic Numerical Integrator and Computer.

ENIAC was designed and built at the Moore School by a team led by Dr. Mauchly and J. Presper Eckert, an engineer in his early 20s. The newly promoted Captain Goldstine ran interference with the Army brass and contributed his own considerable expertise, says Paul Deitz, a civilian official at Aberdeen who is an unofficial historian of the ENIAC project.

In 1944, soon after the first part of ENIAC was completed, Dr. Goldstine had a chance meeting at the Aberdeen train station with John L. von Neumann, one of the leading mathematicians of his day and an adviser to the Ballistic Research Laboratory at the proving ground.

Dr. Goldstine recalls that when he told Dr. von Neumann about the ENIAC project, “he suddenly became galvanized.” It turned out that Dr. von Neumann had been working on a project in Los Alamos, N.M., that required high-power computing.”

(“Computer age had clumsy start Electronic era: Born 50 years ago, the ancestor of today’s PCs and calculators was slow, unreliable and weighed 30 tons”, by Michael Dresser, February 12, 1996, The Baltimore Sun)

Clearly, had von Neumann gone to the San Francisco Bay Area in the mid-to-late 1950s, the newly founded IBM San Jose research laboratory would have been privileged to receive his advice.

The presence of national-level nuclear science, top West Coast universities with growing interest in computers, and IBM’s arrival in the Bay Area could have given von Neumann another chance at pioneering computer research – as an alternative to the more active, military-funded industrial computer activities in the Los Angeles region, where von Neumann also had RAND and its JOHNNIAC.

Twenty-one years later, views about von Neumann’s computer design began to change, and it was an IBM San Jose Research Laboratory scientist, the John Backus quoted earlier, who put forth the term “the von Neumann bottleneck” in his 1977 Turing Award lecture – a lecture that referenced von Neumann a whopping 90-plus times. Love him or hate him!

(John Backus, 1977 ACM Turing Award Lecture, Association for Computing Machinery)
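Backus’s complaint was that conventional languages mirror the machine: they move one word at a time between processor and memory, and the programmer’s thinking follows that traffic. A minimal sketch of the contrast he drew – written here in Python rather than in Backus’s FP notation – puts the inner product in word-at-a-time style next to a whole-value, applicative style:

```python
def inner_product_word_at_a_time(a, b):
    """Imperative, von Neumann style: one element moves per step,
    with explicit indexing and a mutable accumulator."""
    total = 0
    for i in range(len(a)):
        total += a[i] * b[i]   # each iteration: fetch, multiply, store
    return total

def inner_product_functional(a, b):
    """Whole-value style in the spirit of Backus's FP: compose
    operations on entire sequences, with no explicit indexing."""
    return sum(x * y for x, y in zip(a, b))

print(inner_product_word_at_a_time([1, 2, 3], [6, 5, 4]))  # prints 28
print(inner_product_functional([1, 2, 3], [6, 5, 4]))      # prints 28
```

Backus’s FP language expressed the second version as a composition of operators with no variables at all; the point of his lecture was that the algorithm, rather than the word-by-word memory traffic, should be the unit of thought.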

Within academia, the termination of the Institute for Numerical Analysis at UCLA in 1954, when the National Bureau of Standards gave up its management role due to the Pentagon’s objection, was a watershed event in the history of the computing field.

Harry Huskey, the SWAC computer project leader and computer training leader at INA who subsequently moved to Berkeley, later blamed the INA’s end on McCarthyism targeting the NBS:

“… some company made an additive to add to batteries that was supposed to extend their life, and the Bureau of Standards was given the job of testing it. So they tested it and decided that it didn’t do any good at all, and reported this. The guy that manufactured it contacted his congressman and said whatever, and that ended up with the Commerce Department appointing a committee, the Kelly Committee, I think it was, to review what the Bureau of Standards was doing, and this is also tied up with McCarthy. McCarthy was witch-hunting, you know, and I think the– well, they’re almost independent, but anyway, the McCarthy business caused the Bureau to fire a number of people, starting at the top. Ed Condon was fired. The next director, Alan Astin I think, was forced to resign or fired, or something. In the math division, John Curtiss was fired.

The whole Bureau operated with a good fraction of its budget coming from projects that were financed by other government agencies, and almost all of that was wiped out. If the Navy had a project going on, they would transfer it back to the Navy, and that sort of thing, so there was a real cutback in operation.

The fact that INA was a project under the Bureau of Standards caused it to be terminated as a Bureau project. …”

(interview by William Aspray, February 7, 2006, Computer History Museum)

I wouldn’t be surprised if the INA’s demise had to do with McCarthyism, given that in the summer of 1954, just after its closure, John Nash was arrested for public homosexual activity in nearby Santa Monica and expelled from RAND.

On the other hand, from the opposite viewpoint, the end of the NBS’s broad management role in scientific research may have reflected a Pentagon objective of getting a ‘bang for their buck’, i.e., focusing funding on research directly relevant to the U.S. military.

Historically in the United States, substantial government support for scientific research had begun only with the coming of World War II:

“… During the Great Depression … a Science Advisory Board was created by executive order to advise the President. However, the board’s attempts to establish a basic research program in universities did not succeed.

The most significant step toward a durable relationship between government and science came in 1940. The war raging in Europe presented an opportunity for scientific work to affect a conflict. The leaders of the scientific community began to lobby for the creation of a government agency that would mobilize U.S. scientists for the country’s inevitable entry into the war. As a result, President Roosevelt created the National Defense Research Committee (NDRC) under the chairmanship of Dr. Vannevar Bush. Bush was a former Dean of Engineering at MIT and was later the president of the Carnegie Institution in Washington. …

A major landmark in the progress of governmental support for science in the United States turned out to be the creation of the expanded Office of Scientific Research and Development (OSRD), under Vannevar Bush. This initiated a structure under which U.S. scientists were brought into war efforts through a contract mechanism, while leaving them free to pursue their creative work. …”

(Jagdish Chandra and Stephen M. Robinson, An Uneasy Alliance: The Mathematics Research Center At the University of Wisconsin, 1956-1987, 2005, Society for Industrial and Applied Mathematics)

As in the above history, the U.S. government’s funding of scientific research arose primarily from World War II preparation and came in the form of contracts, which did not prohibit scientists from pursuing other scientific and creative interests. Mobilization of the scientific community was led by the National Defense Research Committee (NDRC), headed by former MIT Dean of Engineering Vannevar Bush; later, the expanded Office of Scientific Research and Development (OSRD), also under Bush, became the contracting agency for wartime scientific research.

The ENIAC discussed earlier was a prominent example of military research and development by academic scientists: the development of the first general-purpose electronic computer was directly initiated, funded and supervised by the Army, but carried out at a university; after its completion, the leading developers were free to move on to start their own company, or to build computers in academia.

After World War II, the Navy’s Office of Naval Research became the primary science funding agency before the founding of the National Science Foundation – with the exceptions of medical research funded by the National Institutes of Health, and nuclear science research funded by the Atomic Energy Commission:

“In 1946, the Office of Naval Research (ONR) was created to plan, foster, and encourage scientific research and to provide within the Department of the Navy a single office which by contract or otherwise was able to sponsor, obtain, and coordinate innovative research of general interest to all sectors of the Navy. … By and large, the naval authorities believed that most of the basic research carried out under ONR’s auspices should be published in the normal way. This policy allayed many fears in the academic and scientific community. The office began to take on the role that was envisaged for the yet-to-be-established National Science Foundation (NSF).

The National Institute of Health (NIH), established in 1930 and generously funded by OSRD during the war, became a major focus of government support for medical research in the universities. The Atomic Energy Commission (AEC) was established in 1946, and this agency forged close links with universities by contracting research work to them and by building up the university-associated laboratories that it had inherited from the Manhattan Project. …”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

Despite the original recommendation by Vannevar Bush, the U.S. government’s leading science adviser, the NSF founded in 1950 did not include defense research in its charter; and the Army and Air Force proceeded to establish their own research agencies:

“When the NSF was eventually established in 1950, defense research was excluded from its terms of reference. In the initial recommendation, Dr. Bush had envisioned defense research as one of the organizational component of NSF’s charter. As a consequence, the Department of the Army, and subsequently the Air Force, established their own offices of research. The Department of the Army’s Office of Ordnance Research was established in June 1951 on the campus of Duke University.”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

As quoted above, Vannevar Bush, the former MIT Dean of Engineering, had envisioned the National Science Foundation as having defense research as an organizational component.

Bush had outlined his vision in a July 1945 report to President Harry Truman, in which the proposed “National Research Foundation” would include a “Division of National Defense” alongside other divisions such as a “Division of Medical Research” and a “Division of Natural Sciences”.

(“Science The Endless Frontier: A Report to the President by Vannevar Bush, Director of the Office of Scientific Research and Development, July 1945”, National Science Foundation)

The actual outcome – an NSF without a defense research branch – was positive for someone like Stephen Smale, who in the 1960s faced the unpleasant prospect, as in Part 2, that his anti-war activism risked his NSF grant eligibility; fortunately, the NSF did not need to defer to the Pentagon.

From this angle, the National Bureau of Standards’ loss of its management role for defense-funded research projects was in line with the separation of the NSF from defense research, although some might reason that, when it came to administering national standards for technology, there should be as few exceptions as possible.

But as pointed out by Harry Huskey, quoted earlier, the NBS’s loss in 1953 came as a result of McCarthyism-type politics. There was a public scandal, Congressional hearings and the firing of key NBS leaders; at the recommendation of the Congressional Kelly Committee, the Pentagon transferred all weaponry research away from the NBS:

“… The Battery AD-X2 controversy, on the other hand, was serious indeed. It caused the firing of the Bureau’s director, followed eventually by full reinstatement; prompted the investigation of the Bureau by two high-level committees and brought about dramatic changes in its programs; provoked a furor in the whole scientific community and led a large number of the Bureau staff to threaten resignation; resulted in six days of hearings before a Senate select committee; made the Bureau and its director front-page news for months; brought about the resignation of an assistant secretary of commerce; and (in part) caused the transfer of 2000 persons from the Bureau to newly formed military laboratories.

It can be fairly said that no other single report has had as great an effect on the history of the Bureau as the “Kelly Committee Report,” as it is commonly known. …

… Hence it recommended the “transfer of weaponry projects to the Department of Defense,” but recommended “continued use of the Bureau by Department of Defense and Atomic Energy Commission for non-weaponry science and technical aid.” Following these recommendations, on September 27, 1953, four ordnance divisions, totaling 2000 persons—1600 in three divisions at the Harry Diamond Ordnance Laboratory in Washington, and 400 at the Missile Development Division in Corona, California—were transferred to Army Ordnance and Naval Ordnance respectively, although all operations remained at their respective sites. …”

(Elio Passaglia with Karma A. Beal, A Unique Institution: The National Bureau of Standards, 1950-1969, 1999, National Institute of Standards and Technology, U.S. Department of Commerce)

As the Kelly Committee stated in the passage quoted above, research in “non-weaponry science and technical aid” for the Department of Defense could continue within the NBS.

Obviously, most of the mathematical research and computer training at the Institute for Numerical Analysis, funded by the Office of Naval Research as mentioned earlier, was “non-weaponry science” and so should have been able to continue. But as quoted earlier, Secretary of Defense Charles Wilson decided to end NBS management of all defense agency-funded projects, including the INA.

The U.S. Army understood the importance of academic scientific research, as seen in the fact that its Office of Ordnance Research was first founded on the campus of Duke University, following the establishment of the NSF independent of the Pentagon, as quoted earlier.

The end of the INA marked a point after which the Army moved directly into initiating and supervising university-based mathematical research.

Led by Lieutenant General James M. Gavin, Lieutenant General Arthur Trudeau and Brigadier General Chester Clark, the Army proceeded to form its own mathematics research center in academia, with a focus on relevance to the interests of the Army:

“Army general officers such as Lt. Gen. Arthur Trudeau, Lt. Gen. James M. Gavin, and Brig. Gen. Chester Clark, and other officers such as Lt. Col. Ivan R. Hershner, recognized early in the 1950s that the Army is a major user of the fruits of research in mathematics, no matter what the source is. … these enlightened officers and other members of the Army establishment were successful in convincing the Army to establish a center of mathematical expertise at an academic institution.

In preparation for this crucial decision, the Mathematics Advisory Panel of the Army, a precursor group to the Army Mathematics Advisory Group (AMAG) and the Army Mathematics Steering Committee (AMSC), conducted a survey of the uses of mathematics in Army activities and combined that with a census of its mathematically trained personnel and its expenditures for mathematical investigation. …

… The Advisory Panel made two recommendations. First, it advised that the Army establish for itself a mathematics research center at an academic institution. The key aspects of the work statement were to conduct basic research in selected areas of mathematics relevant to the interests of the Army, to provide educational and training courses to the Army on current mathematical methods, and to be available for consulting on mathematical problems encountered by Army scientists and engineers. It was to carry on research in four areas…:

  • Numerical analysis. This was broadly understood as the adaptation of mathematics to high-speed computation, to include the use of electronic computing machines, the formulation of mathematical problems for exploration by such computers, and hence the broadening of the field in which such computers could be used. This area was originally intended to include “the engineering physics of high-speed computers,” presumably what is now referred to as computer architecture and computer engineering, though unfortunately very little was in fact done at MRC in those areas.
  • Statistics and the theory of probability.
  • Applied mathematics, including ordinary and partial differential equations as well as physical mathematics with emphasis on fluid mechanics, elasticity, plasticity, electro-dynamics, electrical networks, wave guidance, and propagation. 
  • Operations research, including such subfields as linear and nonlinear programming, game theory, decision theory, information theory, and optimization.

Second, the Advisory Panel recommended that it be recognized and established as a continuing body, with the assignment to inform itself about new mathematical developments and to keep itself informed of the Army’s needs in and uses of mathematics, to supervise activities of this kind, and to facilitate the interchange of relevant information between activities. Initially, this was a committee of about twenty-five, including four from academic institutions. The rest represented various Army activities.”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

In the above historical account, the military’s interests in the founding of an Army mathematics research center can be seen in the Army experts’ overwhelming dominance over the academics on the advisory panel.

Army experts led by Lieutenant Colonel Ivan R. Hershner, the University of Vermont’s mathematics department chairman, visited 26 universities that showed some interest, including, “Brown, Columbia, University of Chicago, Duke, California Institute of Technology, Harvard, the University of Illinois, the University of Michigan, MIT, New York University, the University of North Carolina, UCLA, UC-Berkeley, Stanford, the University of Wisconsin, and the University of Virginia”; out of 21 university proposals submitted, the University of Wisconsin was chosen:

“Towards the realization of the first recommendation, the chief of research and development of the Army appointed Ivan R. Hershner (then the chair of the Mathematics Department at the University of Vermont) to head an effort to explore with various universities and research groups their possible interest in this center. Letters were sent to over fifty U.S. institutions of higher learning. Based on the level of interest expressed, this small group of experts visited twenty-six universities …

This process resulted in twenty-one formal proposals. A technical advisory committee of Army scientists evaluated these proposals … The Army had offered to provide a state-of-the-art computer, but it expected that the selected university would supply suitable physical space to house the center. …

The decision to establish the Mathematics Research Center at the University of Wisconsin was announced on November 16, 1955, by Lt. Gen. James M. Gavin, chief of research and development of the U.S. Army. …

The first contract for MRC’s operation was signed on April 25, 1956, and the university designated Professor Rudolph E. Langer as MRC’s first director. ….”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

The Army Mathematics Research Center began operating in 1956, the same year IBM opened its San Jose research lab.

It is interesting that, following a nationwide search, the Army’s final choice of academic host for its mathematics research center was the leading university in the home state of then-Senator Joseph McCarthy.

That might be coincidental, but it wasn’t isolated. The Army’s Office of Ordnance Research had been founded in 1951, as quoted earlier, at Duke University – which happened to be the alma mater of the high-profile, staunchly anti-Communist Senator Richard Nixon of Southern California, soon to be U.S. Vice President, whose political ties to North Carolina were intimate even within the Senate, as recalled by future Senator Jesse Helms, then an assistant to Senator Willis Smith of North Carolina:

“Serving as administrative assistant to a United States Senator is a true learning experience. …

In 1951 there were ninety-six U.S. Senators representing the then forty-eight states…

One of those ninety-six Senators back then was a delightful young Republican Senator from California named Richard M. Nixon. I was impressed by his intellect and his genuine interest in working with people who shared conservative principles without concern for their party tag. Senator Nixon had a solid North Carolina connection because he was a graduate of the law school at Duke University. As I mentioned, Senator Smith had been on the university’s board of trustees for some time … There were many visits to Senator Smith’s office by then President of Duke, Arthur Hollis Edens, and Senator Nixon often stopped by to greet Dr. Edens. The Duke connection as fellow alumni helped establish a solid friendship between Senator Nixon and Senator Smith.

The assignment of office space had put Mr. Nixon’s offices between the offices of Senator Smith and North Carolina’s senior Senator, Clyde R. Hoey, on one corner of the third floor of the Russell Senate Office Building. …”

(Jesse Helms, Here’s where I Stand: A Memoir, 2005, Random House)

As illustrated, much thought had been given to the assignment of office locations in a Congressional building – let alone the location of an Army central research center.

The Army Mathematics Research Center at the University of Wisconsin-Madison began its life less than two years after the termination of the Institute for Numerical Analysis at UCLA, and in the same year, 1956, in which John von Neumann, former president of the American Mathematical Society and a top adviser to the U.S. military, was hospitalized for cancer treatment and made the decision to move to the University of California.

Von Neumann soon died, in February 1957 at the age of 53.

Shortly afterward, in May 1957, McCarthy died suddenly at only 48.

Prior to that, in the early summer of 1954 – just as the INA was closing – McCarthy’s ongoing Senate committee hearings hunting for Communists in the U.S. government were foiled by the Army, after he tried to target President Dwight D. Eisenhower, a former Army general:

“…Often, the information McCarthy used came from FBI files, which were full of rumor and third-hand accounts.

The McCarthy era began on February 9, 1950 when the obscure Republican senator from Wisconsin gave a speech to 275 members of the local Republican women’s club at the McClure Hotel in Wheeling, West Virginia.

“While I cannot take the time to name all the men in the State Department who have been named as members of the Communist Party and members of a spy ring, I have here in my hand a list of 205—a list of names that were known to the secretary of State and who, nevertheless, are still working and shaping policy of the State Department,” McCarthy said…

McCarthy eventually made the mistake of turning his sights on President Dwight D. Eisenhower. A former Army general who had led allied forces to victory during World War II, Eisenhower was as American as apple pie.

As McCarthy began accusing Eisenhower of being soft on Communists, Hoover realized he would have to distance himself from the senator. Just before what became known as the Army-McCarthy hearings started on April 22, 1954, Hoover ordered the bureau to cease helping him. …

During the hearings, McCarthy failed to substantiate his claims that the Communists had penetrated the Army, which had hired a shrewd Boston lawyer, Joseph Welch, to represent it. McCarthy noted that Fred Fischer, a young lawyer in Welch’s firm, had been a member while at Harvard Law School of the National Lawyers Guild, described by the attorney general as the “legal mouthpiece of the Communist Party.” Supreme Court Justice Arthur J. Goldberg had also been a member of the group.

Upon hearing this accusation, Welch responded, “Until this moment, senator, I think I never really gauged your cruelty or recklessness.” When McCarthy continued to hound Fischer, Welch said, “Have you no sense of decency, sir, at long last? Have you left no sense of decency?”

After two months, the hearings were over, and so was McCarthy’s career. Watching the hearings on television, millions of Americans had seen how he bullied witnesses and what an unsavory character he was. Behind the scenes, Eisenhower pushed fellow Republicans to censure McCarthy.

In August 1954, a Senate committee was formed to investigate the senator. …

On December 2, 1954, the Senate voted 67 to 22 to censure him. After that, when he rose to speak, senators left the Senate chamber. Reporters no longer attended his press conferences. On May 2, 1957, McCarthy died at the age of forty-eight of acute hepatitis, widely believed to be a result of his alcoholism…”

(“The Real Story on Joe McCarthy”, by Ronald Kessler, April 7, 2008, Newsmax)

Under the Army’s supervision, the Mathematics Research Center at Wisconsin-Madison excelled. A clear sign that the MRC viewed itself as inheriting the mantle of the INA at UCLA was that J. Barkley Rosser, an early director of the INA, became the second director of the MRC in 1963:

“In 1949 he was asked to become the Director of Research at a newly created Institute for Numerical Analysis, located at UCLA and sponsored by the National Bureau of Standards. At this early stage in modern electronic computing, Rosser was successful in drawing together a stellar group of mathematicians whose ultimate impact on the future of computing was memorable. He also saw that the computer held great promise for pure mathematics; one example was a project aimed at finding high precision values for the zeros of the Riemann zeta-function. While the final publication was delayed until 1969, this was among the earliest computational evidence supporting a famous conjecture of Riemann connected with properties of the prime numbers.

With the Institute functioning, Rosser returned to Cornell. In 1953-54 he received a joint Guggenheim-Fulbright fellowship which he spent in Europe, writing a book on modern logic. However, because able scientific administrators are rare, he also continued to receive requests to fill such posts, serving on many panels and committees connected with the Space Program and related projects, as well as other scientific organizations and research centers. Among these: Director of the Institute for Defense Analysis, Chairman of the Mathematics Division of the NRC, and Chairman of the Conference Board of the Mathematical Sciences.

In 1963 he moved permanently from Cornell to Wisconsin, to become the Director of the Mathematical Research Center, replacing the first Director, Rudolph Langer, who had chosen to retire. The presence of two longtime Princeton friends, Joe Hirshfelder and Steve Kleene, was an added incentive for Rosser. The MRC operated under a contract from the Department of the Army…”

(“Memorial Resolution of the Faculty of the University of Wisconsin-Madison: On the Death of Emeritus Professor J. Barkley Rosser”, March 5, 1990, University of Wisconsin Madison)

From INA directorship in 1949 to directorship of the Institute for Defense Analysis, chairmanship of the National Research Council’s mathematics division, and then directorship of the Army mathematics research center, the mathematician J. Barkley Rosser took on several important management positions affiliated with the U.S. government and the defense establishment. As a result, as told in the quote above, some of his own mathematical research did not reach publication for two decades, until 1969.

Interestingly, that particular piece of Rosser’s research was the use of the computer at the INA – the SWAC, as mentioned earlier – to calculate the zeros of the Riemann zeta-function, i.e., to provide evidence for the Riemann Hypothesis, the famous pure-mathematics problem whose unsuccessful pursuit by John Nash in 1958 contributed to Nash’s mental instability, as told in Part 2.
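Rosser’s computation can be appreciated with a modern sketch. A standard way to exhibit zeros on the critical line is to track sign changes of the real-valued Riemann–Siegel function Z(t); the minimal Python below is my own illustration, not the SWAC code – the function names, the series-acceleration choice, and the asymptotic form of theta are all assumptions made for the sketch. It locates the first nontrivial zero near t ≈ 14.1347:

```python
import cmath
import math

def eta(s, terms=200, avg=40):
    """Dirichlet eta function via its alternating series, accelerated
    by repeated averaging of partial sums (an Euler-transform trick)."""
    partial = []
    total = 0 + 0j
    for n in range(1, terms + 1):
        total += (-1) ** (n - 1) * n ** (-s)
        partial.append(total)
    for _ in range(avg):
        partial = [(a + b) / 2 for a, b in zip(partial, partial[1:])]
    return partial[-1]

def zeta_critical(t):
    """zeta(1/2 + it), using zeta(s) = eta(s) / (1 - 2**(1 - s))."""
    s = complex(0.5, t)
    return eta(s) / (1 - 2 ** (1 - s))

def Z(t):
    """Riemann-Siegel Z(t): real-valued, with the same zeros as zeta on
    the critical line; theta(t) from its standard asymptotic expansion."""
    theta = t / 2 * math.log(t / (2 * math.pi)) - t / 2 - math.pi / 8 + 1 / (48 * t)
    return (cmath.exp(1j * theta) * zeta_critical(t)).real

def first_zero(lo=14.0, hi=14.2, iters=60):
    """Bisect a sign change of Z to pin down a zero of zeta."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if Z(lo) * Z(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

root = first_zero()  # first nontrivial zero: t = 14.134725...
```

Rosser’s SWAC-era work, of course, had to accomplish the analogous computation with a few kilobytes of memory and hand-coded arithmetic, which is part of why publication took so long.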

That, I suspect, was the rite of manhood in Professor Rosser’s occupation – be it mathematics applied to the Army’s interests or mathematics as difficult as the Riemann Hypothesis.

The year after Rosser’s publication of his computing work on the Riemann Hypothesis, i.e., in 1970 as in Part 4, the MRC under his directorship was the target of the most powerful U.S. domestic terror bombing up to that point, which killed a physicist, Robert Fassnacht.

The bombing was a part of anti-war protests persisting over the years against the Army-affiliated math center, with some protestors advocating for “A People’s Math Research Center”:

“During the years of protest against the war and against MRC, many persons wrote documents, pro or con, about the center’s activities in support of the Army. Among all of these, the one that stands out as probably the most comprehensive single presentation of the case against the center is a booklet called THE AMRC Papers, produced in 1973 by a group calling itself the Madison Collective of Science for the People. …

The booklet is organized in four parts, whose titles are

  • How AMRC Helps the Army
  • How AMRC Works
  • AMRC’s Relationship with the University of Wisconsin
  • An Alternative: A People’s Math Research Center

The part of most interest here is the first, which includes four chapters on specific areas in which it is alleged that MRC helped the Army. The titles of these chapters are Counterinsurgency, Chemical & Biological Warfare, Missile, and Conventional Weapons. … Indeed, many of the descriptions reported in these four chapters are taken directly from the reports of the center itself, and others from documents produced by military agencies. … this booklet was being sold in Madison at a time when some Army scientists responsible for oversight of the MRC contract were in town. Mindful of the difficulty they frequently encountered in persuading other Army officials that mathematical research was doing anything of real use to the Army, the scientists went out on the street and bought 40 copies of the booklet because it made such powerful arguments that MRC was in fact of great benefit in advancing the Army’s programs!”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

Mina Rees, a mathematician who had held management positions in U.S. government research agencies – including with the applied mathematics panel of the National Defense Research Committee and as head of the mathematics division at the Office of Naval Research – and who had played a key role in starting the NBS-sponsored INA at UCLA, expressed her strong opposition to the Army’s plan of direct involvement in an academic math research center:

“I think now rather with amusement of our feeling that the West Coast was somewhat underprivileged in — in this kind of development, but we did go to major universities all over the United States and it was after the visits to various places and an assessment of the degree of interest and the degree of involvement that the various universities were willing to undertake that we decided that the University of California at Los Angeles had the best chance of doing a – the kind of thing that we saw as needed, and I would think that we spent at least a year making that decision.

Yes. I think that was an outcome of discussions between John Curtiss and me, and one reason that we chose Southern California was that we thought that that was the place where we could get people to do that. Now the – what is it called – the Institute at Wisconsin – the Army Research – Mathematics Research Institute which had its troubles during the students’ uprisings some years later built on the same concept and tried to exploit the same attractiveness at having Army work done in a university. I was strongly opposed to that at that time, and I had no foresight- I don’t claim any foresight of what was going to happen later – but it just did not seem to me the right way to go about that problem, but it did seem to be the right way to go about the development of solid mathematics.”

(“Interviewee: Mina Rees (1902-1997) Interviewer: Henry Tropp”, September 14, 1972, Computer Oral History Collection, 1969-1973, 1977, Smithsonian National Museum of American History)

There was “solid mathematics” done at the Army MRC at Wisconsin-Madison despite her strong opposition to the setup, as Rees admitted in the interview quoted above.

Moreover, the solid mathematics applied not only to the military’s interests but also to civilian industry.

Recall, as previously quoted in Part 4, the significant achievements of Wisconsin-Madison mathematics professor Carl de Boor – SIAM’s 1996 John von Neumann Lecturer as cited earlier – in the development of the theory and applications of spline functions, which became “indispensible tools” in computer-aided design and in auto and airplane manufacturing, among other industrial fields:

“… Splines were introduced in the 40’s (by the late I.J. Schoenberg of Wisconsin) as a means for approximating discrete data by curves. Their practical application was delayed almost twenty years until computers became powerful enough to handle the requisite computations. Since then they have become indispensible tools in computer-aided design and manufacture (cars and airplanes, in particular), in the production of printer’s typesets, in automated cartography… Carl is the worldwide leader and authority in the theory and applications of spline functions. … Carl has made Wisconsin-Madison a major international center in approximation theory and numerical analysis…”

(“Van Vleck Notes: Dedications, Honors and Awards …”, Fall 1997, Department of Mathematics, University of Wisconsin)
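Schoenberg’s splines are piecewise polynomials joined smoothly at “knot” points, which is what makes them so suitable for fitting the discrete data of a car body or a wing profile. As a small illustration of the idea – this is the standard textbook natural-cubic-spline construction, my own sketch rather than Schoenberg’s or de Boor’s algorithms, and every name in it is chosen for this example:

```python
import bisect
import math

def natural_cubic_spline(xs, ys):
    """Build a natural cubic spline through the points (xs[i], ys[i]).
    'Natural' means zero second derivative at both end knots.
    Returns a callable that evaluates the spline at any x."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    alpha = [0.0] * (n + 1)
    for i in range(1, n):
        alpha[i] = 3 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    # Forward sweep of the tridiagonal solve for the curvature coefficients.
    l = [1.0] * (n + 1)
    mu = [0.0] * (n + 1)
    z = [0.0] * (n + 1)
    for i in range(1, n):
        l[i] = 2 * (xs[i + 1] - xs[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]
    # Back-substitution: cubic coefficients on each knot interval.
    c = [0.0] * (n + 1)
    b = [0.0] * n
    d = [0.0] * n
    for j in range(n - 1, -1, -1):
        c[j] = z[j] - mu[j] * c[j + 1]
        b[j] = (ys[j + 1] - ys[j]) / h[j] - h[j] * (c[j + 1] + 2 * c[j]) / 3
        d[j] = (c[j + 1] - c[j]) / (3 * h[j])

    def evaluate(x):
        j = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 1)
        dx = x - xs[j]
        return ys[j] + b[j] * dx + c[j] * dx ** 2 + d[j] * dx ** 3

    return evaluate

# Example: fit sin at five knots on [0, pi] and evaluate between knots.
knots = [i * math.pi / 4 for i in range(5)]
spline = natural_cubic_spline(knots, [math.sin(x) for x in knots])
error = abs(spline(1.0) - math.sin(1.0))  # small for this knot spacing
```

The tridiagonal system makes the fit cost linear in the number of knots – one reason splines became practical for design work once computers could handle the arithmetic, as the quote above notes.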

I. J. Schoenberg, mentioned above as the founder of the mathematical theory of spline functions, had done some of his early work at the INA amidst a host of other researchers – including Derrick Lehmer, J. Barkley Rosser and David Saxon, mentioned earlier – each pursuing subjects of their own interest:

“THE PERIOD SUMMER 1951 THROUGH SPRING 1952

Research in the Mathematical Theory of Program Planning was carried enthusiastically by Motzkin, Agmon, Blumenthal, Gaddum, Schoenberg, and Walsh. During July and August a joint seminar was held with Rand on “Linear inequalities and related topics.” Invited speakers from outside were: A. W. Tucker, R. W. Shepherd, J. M. Danskin, S. Karlin, and R. E. Bellman.

Studies in numerical integration of ordinary and partial differential equations were pursued vigorously by Agmon, Bers, Fichera, and Wasow. … Rosser investigated the problem of computing low moments of normal order statistics. …

… Schoenberg pursued his theory of splines, a theory that has many useful applications.

Lehmer developed a practical method for obtaining the so-called Kloosterman Sums and investigated their properties. A series of tests for primality of Mersenne numbers were made on the SWAC, using a code sent in by R. M. Robinson of UC-Berkeley. …

Studies in theoretical physics were carried out by Saxon in cooperation with members of the Physics Department and other departments at UCLA …”

(Magnus R. Hestenes and John Todd, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)
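The Mersenne-number primality tests run on the SWAC were applications of the Lucas–Lehmer test, which decides whether M_p = 2^p − 1 is prime using just p − 2 modular squarings. A minimal modern rendering in Python – my own sketch, not Robinson’s SWAC code:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: for odd prime p, M_p = 2**p - 1 is prime
    iff s_(p-2) == 0 (mod M_p), where s_0 = 4 and s_(k+1) = s_k**2 - 2."""
    if p == 2:
        return True  # M_2 = 3 is prime; the s-sequence applies from p = 3 on
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

def is_prime(n):
    """Trial division; the Lucas-Lehmer test presumes a prime exponent."""
    if n < 2:
        return False
    return all(n % q for q in range(2, int(n ** 0.5) + 1))

exponents = [p for p in range(2, 130) if is_prime(p) and lucas_lehmer(p)]
# exponents == [2, 3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]
```

Robinson’s 1952 SWAC runs re-verified the classically known exponents up through 127 and went on to discover five new Mersenne primes that year, beginning with p = 521 – an early, concrete payoff of the INA’s computing capability for pure mathematics.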

But it wasn’t until the mid-1960s at the MRC at Wisconsin-Madison that research in spline functions theory, “a theory that has many useful applications” as quoted above, flourished:

“Work on splines at MRC started in 1965 under the leadership of two permanent members, I. J. Schoenberg and T. N. E. Greville. The work evolved into a separate area in 1966 and continued for years thereafter. In fact, it probably is the case that spline functions are one of the best recognized of the mathematical advances that MRC brought about. …

… The contrast between the sustained success of the spline function subarea (benefiting from the continuous attention and organizational work of Schoenberg and later of Carl de Boor) and the sporadic nature of the other numerical analysis work provides a striking example of the importance of influential continuing staff in the development and sustenance of a research area.”

(Jagdish Chandra and Stephen M. Robinson, 2005, Society for Industrial and Applied Mathematics)

As quoted, spline functions became one of MRC’s “best recognized” successful research advances, whereas other numerical analysis work was “sporadic” in nature.

The direct funding, by U.S. defense research agencies, of mathematical research applicable to their interests continued into and through the 1980s, as can be seen from my own experience applying for graduate study in the U.S. and then studying for my mathematics Ph.D. at Berkeley, summarized here from previous discussions in Part 4:

  • when I graduated in 1982 from Sun Yat-sen University in China, Prof. Yuesheng Li, who had supervised my undergraduate thesis in spline function theory, suggested that I go to the MRC at Wisconsin-Madison to study with Carl de Boor, whose industry-applied research had been funded by the U.S. Army;
  • when I chose UC Berkeley, Prof. Li suggested that I study with Alexander Chorin, whose ground-breaking research in computational fluid dynamics had been funded by the U.S. Navy;
  • partly on the advice of Tosio Kato at Berkeley, I chose Stephen Smale, a prominent pure mathematician and former anti-war movement leader, to be my Ph.D. adviser, whose research had been funded by the National Science Foundation;
  • Smale’s ambitious work to develop mathematical theories for numerical analysis was consistently dismissed by Berkeley numerical analysts, especially by Chorin, and Smale’s claims of his work being in applied mathematics were not accepted by those aligned with the numerical analysts.

From an industry point of view, the dominance of military influences in the early development of computers could be partly due to the ineptitude, or ineffectiveness, of the civilian sector, as seen in Berkeley Ph.D. and Silicon Valley pioneer Douglas Engelbart’s experience in the mid-1950s with Hewlett-Packard, discussed earlier.

IBM, which in 1956 started a research laboratory in San Jose as discussed earlier, hadn’t done that well, either:

“… IBM’s president from 1914 to 1956, Thomas J. Watson, Sr., had failed to recognize growing scientific and engineering demand for high-speed computing and visualized only a small market for the new electronic machines. Only under the patriotic cover of IBM’s support for the Korean War effort and through the leadership of Thomas J. Watson, Jr., did the firm manufacture its first computer, the Defense Calculator—IBM Model 701. The eighteen machines produced were oriented toward scientific use, with limited input/output equipment, and were all placed at government installations or with defense contractors. …”

(David O. Whitten and Bessie E. Whitten, eds., Manufacturing: A Historiographical and Bibliographical Guide, 1990, Greenwood Press)

As with the invention of the first electronic computer, the ENIAC, war mobilization played a key role in the start of IBM’s computer manufacturing during the Korean War era – despite the International Business Machines Corporation’s decades-long history in a closely related industrial field.

It is also interesting that the start of the IBM San Jose research lab coincided with the end of Thomas J. Watson, Sr.’s more than four-decade-long reign at IBM, in 1956 as quoted above.

Watson, who had adopted for IBM the alluring slogan, “World peace through world trade”, passed the reins to his son Thomas J. Watson, Jr., a month before his death in June 1956.

(“Thomas J. Watson: CEO 1914 – 1956”, International Business Machines Corporation)

Another industrial company was more eager than IBM.

I have quoted in Part 4 from a 2011 blog post about Prof. Li in 1982 stressing to me the benefits of Carl de Boor’s General Motors connection:

“When I applied for graduate study in the United States Professor Li seriously recommended the U. S. Army Mathematics Research Center at the University of Wisconsin, Madison – Dr. Carl de Boor there and his General Motors connection were Professor Li’s favorite …”

(“Team Canada female athletes disqualified from Commonwealth silver medal, jailed Chinese democracy activist awarded with Nobel peace prize, and others in between (Part 3) – when violence and motive are subtle and pervasive”, March 29, 2011, Feng Gao’s Blog – Reflections on Events of Interest)

I had no knowledge of the specifics of Professor de Boor’s GM connection.

But there is something worth noting about Charles Wilson, President Eisenhower’s Secretary of Defense, who in 1953 dumped the National Bureau of Standards from the management of defense agency-funded projects.

Wilson had been promoted from the presidency of General Motors:

“Running on an anti-New Deal, pro-business, anti-corruption, anti-Communism platform, and featuring a pledge to end the Korean conflict, the Republican Eisenhower-Nixon ticket rode roughshod over the Stevenson-Sparkman Democrats, winning the White House as well as both houses of Congress. A changed philosophy of Government had been installed in Washington, one best exemplified by the nomination as secretary of defense of Charles (“Engine Charlie”) Wilson, president of General Motors, whose statement, “What’s good for the country is good for General Motors and vice versa,” was added to the lexicon of the Nation’s political history.”

(Elio Passaglia with Karma A. Beal, 1999, National Institute of Standards and Technology, U.S. Department of Commerce)

Yup, what was good for the Army’s interests was probably good for General Motors, and “Engine Charlie” Wilson had more of that drive than Thomas Watson, Sr.

By 1981-1982 as I was applying to U.S. graduate schools and had discussions with Prof. Li, there was a General Motors senior executive with a prominent mathematical computing link in the family.

Marina von Neumann Whitman, General Motors vice president and chief economist beginning in 1979, was the daughter of the late “father of computers” who had spread his computer-building ‘gospel’ around academia and scientific institutions; she had been the first woman ever to serve on the White House Council of Economic Advisers, appointed in 1972 by President Richard Nixon, Eisenhower’s former vice president:

“Whitman’s father, John von Neumann, is known for inventing Game Theory, pioneering developments in computer science and contributing to the Manhattan Project, among other achievements.

“This was a force to contend with,” Whitman said. “He was a wonderful father, but he put a lot of pressure on me to always be on the top of everything.”

Still, it’s safe to say she’s escaped her father’s shadow. She was the first woman to be appointed to the president’s Council of Economic Advisers in 1972, by President Richard Nixon. Whitman also served as vice president and chief economist of General Motors from 1979 to 1985 and group vice president for public affairs from 1985 to 1992.”

(“Marina von Neumann Whitman to read from new memoir ‘The Martian’s Daughter’”, by John Bohn, October 2, 2012, The Michigan Daily)

General Motors’ recognition of von Neumann Whitman’s talents was only logical, considering that in the 1950s Secretary of Defense Charles Wilson, the former GM president, had benefited greatly from her father’s advice, even at his hospital bedside in his last year of life:

“… At Walter Reed, where he was moved early last spring, an Air Force officer, Lieut. Colonel Vincent Ford, worked full time assisting him. Eight airmen, all cleared for top secret material, were assigned to help on a 24-hour basis. His work for the Air Force and other government departments continued. Cabinet members and military officials continually came for his advice, and on one occasion Secretary of Defense Charles Wilson, Air Force Secretary Donald Quarles and most of the top Air Force brass gathered in Von Neumann’s suite to consult his judgment while there was still time. …”

(Clay Blair, Jr., February 25, 1957, Volume 42, Number 8, Life)

Ironically, Marina von Neumann’s earlier experiences entering the real world included being turned down for a job prospect at IBM, where her late father had been a consultant, and being invited to apply and then rejected for Ph.D. study at Princeton University, where her father had been famous – in both cases for rather unusual personal reasons:

“She remembers one contentious exchange after Whitman told her father that she planned to get married upon graduating college.

“He had a fit,” Whitman said. “He thought that this would be the death knell for any professional ambitions I might have. And in the 1950s, he was statistically right, but he was wrong about me.”

Using two distinct anecdotes, Whitman’s second focus in the book is how society has changed during her lifetime. In the first, she describes how she was turned down for a prospective job opportunity at IBM because the recruiter saw she was engaged to be married.

The second anecdote discusses Whitman’s application to Princeton University for a Ph.D. in economics; the economics department invited her to apply, yet turned her down for a simple reason.

“I went to see the president (of Princeton). And what the conversation boiled down to was, ‘I’m so sorry, Mrs. Whitman, we can accept a student of your caliber, but we just don’t have enough ladies’ rooms.’ ”

(John Bohn, October 2, 2012, The Michigan Daily)

Marina Whitman is now a professor of business administration and public policy at the Gerald R. Ford School of Public Policy, the University of Michigan, Ann Arbor.

(“‘The Martian’s Daughter’ by Marina von Neumann Whitman”, October 2, 2012, Gerald R. Ford School of Public Policy, University of Michigan)

(Part 5 continues in (ii))
