A review of postings about scientific integrity and intellectual honesty, with observations regarding elite centrism – Part 5: inventions, innovations, and ushering of ‘the new normal’ (iii)

(Continued from Part 5 (ii))

When the Harvard University computer pioneer Howard Aiken, inventor of the Harvard Mark I, the world’s first “automatic digital calculator”, built in the early 1940s by IBM, took early retirement from his professorship in 1961 intending to become a businessman in industry, he and Cuthbert Hurd, director of IBM’s Electronic Data Processing Machines Division, likely had discussions about starting “the first microcomputer computer company in the world”, with design work already being done by an assistant director of engineering at Lockheed Missiles and Space Company in Sunnyvale, California, where Aiken was a regular consultant. But as in Part 5 (ii), their collaboration plan did not materialize: Aiken founded his own Howard Aiken Industries Incorporated in New York and Florida, specializing in buying ailing companies, fixing them and then selling them, while Hurd left IBM in 1962 to become board chairman of the first independent computer software company, the Computer Usage Company.

As reviewed in Part 5 (ii), Aiken had been the leading academic computer-pioneer rival to John von Neumann, but he never attained the prominence of von Neumann, who has been regarded as the “father of computers” for his role in the development of the first general-purpose electronic computer, ENIAC, built in the mid-1940s at the University of Pennsylvania, for his advocacy of the “stored program” computer design, and for his subsequent leadership of an ambitious computer-building movement in academia and scientific institutions.

From this perspective, as I have commented in Part 5 (ii), in 1961-1962 Aiken missed a second chance to attain some sort of “father” status for a new generation of computers, namely microcomputers, and perhaps the chance to share some of the glory of founding Silicon Valley with distinguished figures such as Stanford University engineering dean and provost Frederick Terman, who in the 1930s had mentored the founders of the Hewlett-Packard Company, and the 8 young scientists and engineers who in 1957 rebelled against the difficult management style of their mentor, the 1956 Nobel Physics Prize laureate William Shockley, and founded the Fairchild Semiconductor Corporation.

The 1939 founding of Hewlett-Packard by Stanford University graduates William Hewlett and David Packard in a garage in Palo Alto has since been recognized as the birth of Silicon Valley, as noted in Part 5 (ii).

A co-winner of the 1956 Nobel Physics Prize for his role in inventing the transistor, William Shockley had in late 1955 started the Shockley Semiconductor Laboratory in Mountain View near Stanford, marking the arrival of the semiconductor industry that would give Silicon Valley its name; but it was the 1957 rebellion of the 8 disciples at the Shockley Semiconductor Lab, Robert Noyce, Gordon Moore, Julius Blank, Victor Grinich, Jean Hoerni, Eugene Kleiner, Jay Last and Sheldon Roberts, and their founding of Fairchild Semiconductor that led to that industry’s success, as previously quoted in Part 5 (ii):

“In September 1955 William Shockley and Arnold Beckman agreed to found the Shockley Semiconductor Laboratory as a Division of Beckman Instruments … Shockley rented a building … in Mountain View… attracted extremely capable engineers and scientists, including Gordon Moore and Robert Noyce, Julius Blank, who learned about and developed technologies and processes related to silicon and diffusion while working there. In December 1956 Shockley shared the Nobel Prize in Physics for inventing the transistor, but his staff was becoming disenchanted with his difficult management style. They also felt the company should pursue more immediate opportunities for producing silicon transistors rather than the distant promise of a challenging four-layer p-n-p-n diode he had conceived at Bell Labs for telephone switching applications.

After unsuccessfully asking Beckman to hire a new manager, eight Shockley employees – including Moore and Noyce plus Julius Blank, Victor Grinich, Jean Hoerni, Eugene Kleiner, Jay Last and Sheldon Roberts – resigned in September 1957 and founded the Fairchild Semiconductor Corporation in Palo Alto. Many other employees, from technicians to PhDs, soon followed. Over the next decade, Fairchild grew into one of the most important and innovative companies in the semiconductor industry, laying the technological and cultural foundations of Silicon Valley while spinning off dozens of new high-tech start-ups, including Advanced Micro Devices (AMD) and Intel. …”

(“1956: Silicon Comes to Silicon Valley”, The Silicon Engine, Computer History Museum)

As quoted, Shockley’s 8 disciples had become “disenchanted with his difficult management style” and also wanted “more immediate opportunities for producing silicon transistors”.

In 1956 von Neumann, by this time a U.S. Atomic Energy Commissioner and a leading scientific adviser to the U.S. Air Force on nuclear weapons and nuclear missiles development, was in hospital for cancer treatment and made a decision to move from the Institute for Advanced Study in Princeton to the University of California but, as in Part 5 (ii), the move did not materialize as von Neumann soon died in February 1957 – in the year Fairchild Semiconductor was founded.

Had von Neumann moved, it would likely have been to UCLA in Southern California rather than UC Berkeley in Northern California, as reviewed in Part 5 (ii): with Southern Californian military aerospace companies active in computer development, and the Santa Monica-based Cold War think-tank RAND Corporation having him as a leading strategist and having built the JOHNNIAC computer named for him, von Neumann and RAND could have started a ‘Computer Beach’ there – at a time when Northern California’s Silicon Valley-founding Hewlett-Packard did not yet have computer development in its vision.

A few years later, in 1961, Aiken founded his new company, Howard Aiken Industries, named for himself much as William Shockley had named the Shockley Semiconductor Laboratory; but while Aiken’s was ambitiously broader in its industry scope, it was neither in computer development – the field of Aiken’s academic prominence – nor in Silicon Valley.

Hurd later appeared to blame Aiken’s interest in getting rich for their, or perhaps just Hurd’s, not moving forward with their plan to start a microcomputer company, as previously quoted in Part 5 (ii):

“… Hurd said that he had never discussed this matter with Aiken, but that on two or three occasions when Aiken was in California, where he was a regular consultant for the Lockheed Missile and Space Division, the two of them had “talked at great length about organizing a company.” “If we had done it and if it had been successful,” Hurd mused, “it would have been the first microcomputer computer company in the world.” Hurd told me that “an Assistant Director of Engineering at Lockheed . . . was doing the design work,” and that “Howard, along with that man and me” would form the new company. Aiken, Hurd continued, “wanted me to help raise the money.” They “never followed through” with this plan. “I thought that maybe he wanted to be rich,” Hurd concluded, “and was thinking about starting the company for that reason.””

(I. Bernard Cohen, Howard Aiken: Portrait of a Computer Pioneer, 2000, The MIT Press)

So Aiken “wanted to be rich” and was “thinking about starting the company for that reason”. But what if a scientist wanted to put innovative work into commercial industrial production and needed to start a company for this purpose?

In 1957 the 8 young men who abandoned William Shockley wanted exactly that: “more immediate opportunities for producing silicon transistors”, as quoted earlier.

Unfortunately, the “Shockley Eight” found it very difficult to get financing, and their venture nearly failed: their rebellion against their Nobel laureate mentor made them unacceptable to the investment firms that might otherwise have taken a chance on them – until eventually their case was brought to the attention of a wealthy and prominent playboy businessman, Sherman Fairchild:

“Most of the transistor entrepreneurs had been backed by family money or other private capital resources. Arthur Rock at Hayden, Stone soon came to appreciate why. Every company he approached on behalf of the group of eight turned the idea down flat, without even asking to meet the men involved. Some firms may have found the pith of the letter–please give a million dollars to a group of men between the ages of 28 and 32 who think they are great and cannot abide working for a Nobel Prize winner–unpalatable. …

Undaunted, Bud Coyle mentioned the scientists to playboy-millionaire-inventor Sherman Fairchild. A meticulous man in his sixties, Fairchild was a bon vivant who frequented New York’s posh 21 Club and wore “a fresh pretty girl every few days like a new boutonniere,” according to Fortune. …

Sherman Fairchild was not involved with day-to-day operations at his companies, but he suggested to the senior management at Fairchild Camera and Instrument that it might explore the prospects of the West Coast technologists. …”

(Leslie Berlin, The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley, 2005, Oxford University Press)

Thus was born Fairchild Semiconductor, a new company bearing the name of a famous playboy businessman who invested $1.5 million – twice the amount requested by the 8 Shockley mutineers – on the condition that he could later choose to acquire full ownership for a pre-agreed $3 million:

“Fairchild readily put up $1.5 million to start the new company—about twice what the eight founders had originally thought necessary—in return for an option deal. If the company turned out to be successful, he would be able to buy it outright for $3 million.”

(Walter Isaacson, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, 2014, Simon and Schuster)

Sherman Fairchild was a good match as an investor for the Shockley Eight, because he himself was a scientific inventor, though not in the semiconductor field, and was the largest shareholder of IBM through family inheritance:

“It was a fine match. Fairchild, the owner of Fairchild Camera and Instrument, was an inventor, playboy, entrepreneur, and the largest single stockholder in IBM, which his father had cofounded. A great tinkerer, as a Harvard freshman he invented the first synchronized camera and flash. He went on to develop aerial photography, radar cameras, specialized airplanes, methods to illuminate tennis courts, high-speed tape recorders, lithotypes for printing newspapers, color engraving machines, and a wind-resistant match. In the process, he added a second fortune to his inheritance, and he was joyful spending it as he had been making it. …”

(Walter Isaacson, 2014, Simon and Schuster)

Like Aiken, Fairchild had attended Harvard, but he never earned a degree from any of the universities he attended, which also included the University of Arizona and Columbia University.

(Frank and Suanne Woodring, Fairchild Aircraft, 2007, Arcadia Publishing)

In contrast to what Fairchild – a businessman and IBM’s largest shareholder living a playboy lifestyle – would readily do, namely finance the start of a new technology company, wasn’t Cuthbert Hurd, an IBM executive himself, too idealistic, as reviewed in Part 5 (ii), or simply too harsh about Aiken’s interest in getting rich, when the first major semiconductor company of the fledgling Silicon Valley had already been founded in the name of someone famously wealthier, more debaucherous and less connected to that company’s technological work?

Absolutely so, especially when Sherman Fairchild and his father George were legendary IBM figures of the kind of bosses Hurd liked to talk about, as in Part 5 (i), namely Thomas Watson father and son:

“… His father had preceded Tom Watson as the chief executive of the company that would become International Business Machines, and thanks to the vagaries of inheritance (Tom Watson had several children, while George Fairchild had only Sherman), he was the largest shareholder in IBM.”

(Leslie Berlin, 2005, Oxford University Press)

As the executive committee chairman on IBM’s board of directors, Sherman Fairchild immediately helped the new Fairchild Semiconductor get its first industry sales order – from IBM:

“Fairchild Semiconductor reached an important milestone on 2 March 1958, when it received a purchase order from IBM Owego accepting Fairchild’s quote to provide 100 core driver transistors at the price of $150 each. The firm had booked its first sale. With IBM’s purchase order, Fairchild Semiconductor instantly gained a measure of credibility in the electronics industry. IBM, a notoriously selective and demanding customer, had chosen to buy devices from the start-up. Helping Fairchild Semiconductor to secure this order from IBM was, reportedly, a timely visit by Sherman Fairchild (the founder of Fairchild Camera and Instrument and its majority owner) and Richard Hodgson to IBM’s president, Thomas Watson Jr. Managers at IBM Owego had concerns about Fairchild’s production capabilities and financial soundness. To overcome these reservations, Sherman Fairchild—who was also IBM’s largest individual shareholder and who chaired the executive committee of IBM’s board of directors—met with Watson and asked him to buy silicon transistors from the new venture. Fairchild Semiconductor received its order from IBM shortly thereafter.”

(Christophe Lécuyer and David C. Brock, with foreword by Jay Last, Makers of the Microchip: A Documentary History of Fairchild Semiconductor, 2010, The MIT Press)

With the IBM management, of which Cuthbert Hurd was a part, so well accustomed to a wealthy son of an IBM co-founder wielding influence on the company’s board, how ‘impure’ in comparison was the early computer pioneer Howard Aiken’s goal of getting rich?

Given that Fairchild, an inventor and businessman, could easily help young scientists start a new technology company that would lay “the technological and cultural foundations of Silicon Valley”, as quoted earlier, and given that Shockley, a scientist inventor, had been able to start a new technology company, in my opinion Aiken, a scientist inventor turned businessman with a similar ambition, should have been given the chance to try.

But as much as Hurd’s academic and institutional career perspectives may have biased him against Aiken’s interest in money, as reviewed in Part 5 (ii), Hurd likely would have needed IBM’s support to co-launch a new computer company with Aiken.

Therefore I have to wonder whether IBM, by the 1960s, still harboured trepidation about Aiken’s ambition: as in Part 5 (ii), Thomas Watson, Jr.’s late father and predecessor as IBM president had in 1944 witnessed the inclination of Harvard and Aiken to claim all the Mark I credit for themselves.

The Shockley Eight who founded Fairchild Semiconductor were viewed by William Shockley as the “Traitorous Eight”, but Shockley’s own difficult management style and orthodox scientific focus not only triggered their departure but also led to his company’s eventual failure. Spending the rest of his career as a Stanford professor, Shockley further descended into openly “racist” scholarship and became detested by many:

“Dubbed “the traitorous eight,” Noyce and his posse set up shop just down the road from Shockley on the outskirts of Palo Alto. Shockley Semiconductor never recovered. Six years later, Shockley gave up and joined the faculty of Stanford. His paranoia deepened, and he became obsessed with his notion that blacks were genetically inferior in terms of IQ and should be discouraged from having children. The genius who conceptualized the transistor and brought people to the promised land of Silicon Valley became a pariah who could not give a lecture without facing hecklers.”

(Walter Isaacson, 2014, Simon and Schuster)

Free of the difficult man as their boss, the Shockley Eight managed to start their new company at exactly the right time; 3 days after Fairchild Semiconductor’s founding on October 1, 1957, the Soviet Union successfully launched the world’s first satellite, Sputnik, spurring a fierce scientific and technological race in the United States to keep up with its Cold War nemesis:

“The traitorous eight who formed Fairchild Semiconductor, by contrast, turned out to be the right people at the right place at the right time. The demand for transistors was growing because of the pocket radios that Pat Haggerty had launched at Texas Instruments, and it was about to skyrocket even higher; on October 4, 1957, just three days after Fairchild Semiconductor was formed, the Russians launched the Sputnik satellite and set off a space race with the United States. The civilian space program, along with the military program to build ballistic missiles, propelled the demand for both computers and transistors. It also helped assure that the development of these two technologies became linked. Because computers had to be made small enough to fit into a rocket’s nose cone, it was imperative to find ways to cram hundreds and then thousands of transistors into tiny devices.”

(Walter Isaacson, 2014, Simon and Schuster)

In 1959 at Fairchild Semiconductor, Robert Noyce became a co-inventor of the integrated circuit, independently of Jack Kilby of Texas Instruments:

“When the transistor was invented in 1947 it was considered a revolution. Small, fast, reliable and effective, it quickly replaced the vacuum tube. …

With the small and effective transistor at their hands, electrical engineers of the 50s saw the possibilities of constructing far more advanced circuits than before. However, as the complexity of the circuits grew, problems started arising.

When building a circuit, it is very important that all connections are intact. …

Another problem was the size of the circuits. …

In the summer of 1958 Jack Kilby at Texas Instruments found a solution to this problem. …

… Kilby presented his new idea to his superiors. He was allowed to build a test version of his circuit. In September 1958, he had his first integrated circuit ready. It was tested and it worked perfectly!

Although the first integrated circuit was pretty crude and had some problems, the idea was groundbreaking. …

Robert Noyce came up with his own idea for the integrated circuit. He did it half a year later than Jack Kilby. Noyce’s circuit solved several practical problems that Kilby’s circuit had, mainly the problem of interconnecting all the components on the chip. … This made the integrated circuit more suitable for mass production. …”

(“The History of the Integrated Circuit”, May 5, 2003, Nobelprize.org)

I note that Robert Noyce’s milestone status as co-inventor of the integrated circuit matched that of his former mentor William Shockley as co-inventor of the transistor.

In 2000 when Kilby learned he was to be awarded the Nobel Physics Prize for inventing the integrated circuit, he immediately mentioned Noyce, who had died 10 years earlier:

“When Kilby was told that he had won the Nobel Prize in 2000, ten years after Noyce had died, among the first things he did was praise Noyce. “I’m sorry he’s not still alive,” he told reporters. “If he were, I suspect we’d share this prize.” ….”

(Walter Isaacson, 2014, Simon and Schuster)

Ironically, Noyce had died of a heart attack, on June 3, 1990 at the age of 62, in Austin, Texas – the home state of Texas Instruments, Fairchild Semiconductor’s industry rival – where he was leading an American research consortium competing with the Japanese in the semiconductor industry:

““He was considered the mayor of Silicon Valley,” said Jim Jarrett, a spokesman for Intel. A founder of the Semiconductor Industry Association in 1975, Dr. Noyce was frequently in Washington to lobby on behalf of semiconductor manufacturers.

At the time of his death, Dr. Noyce was the president and chief executive of Sematech Inc., a research consortium in Austin that was organized by 14 corporations in an attempt to help the American computer industry catch up with the Japanese in semiconductor manufacturing technology.”

(“An Inventor of the Microchip, Robert N. Noyce, Dies at 62”, by Constance L. Hays, June 4, 1990, The New York Times)

The immediate boost the integrated circuit invention provided, in the early 1960s, was not to the commercial computer industry but to the U.S. military’s nuclear missiles development:

“The first major market for microchips was the military. In 1962 the Strategic Air Command designed a new land-based missile, the Minuteman II, that would each require two thousand microchips just for its onboard guidance system. Texas Instruments won the right to be the primary supplier. By 1965 seven Minutemen were being built each week, and the Navy was also buying microchips for its submarine-launched missile, the Polaris. With a coordinated astuteness not often found among military procurement bureaucracies, the designs of the microchips were standardized. Westinghouse and RCA began supplying them as well. So the price soon plummeted, until microchips were cost-effective for consumer products and not just missiles.”

(Walter Isaacson, 2014, Simon and Schuster)

As shown in the above quote, Texas Instruments quickly reaped big benefits from selling to the military’s missile programs.

In comparison, Fairchild Semiconductor carefully kept its distance from directly supplying military projects, instead gaining a major role supplying the poster child of the U.S. civilian space program, the Apollo missions to the moon:

“Fairchild also sold chips to weapons makers, but it was more cautious than its competitors about working with the military. In the traditional military relationship, a contractor worked hand in glove with uniformed officers, who not only managed procurement but also dictated and fiddled with design. Noyce believed such partnerships stifled innovation: “The direction of the research was being determined by people less competent in seeing where it ought to go.” He insisted that Fairchild fund the development of its chips using its own money so that it kept control of the process. If the product was good, he believed, military contractors would buy it. And they did.

America’s civilian space program was the next big booster for microchip production. In May 1961 President John F. Kennedy declared, “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” The Apollo program, as it became known, needed a guidance computer that could fit into a nose cone. So it was designed from scratch to use the most powerful microchips that could be made. The seventy-five Apollo Guidance Computers that were built ended up containing five thousand microchips apiece, all identical, and Fairchild landed the contract to supply them. The program beat Kennedy’s deadline by just a few months; in July 1969 Neil Armstrong set foot on the moon. By that time the Apollo program had bought more than a million microchips.”

(Walter Isaacson, 2014, Simon and Schuster)

For the Apollo moon landing missions, Fairchild’s microchips achieved a major milestone by beating out IBM computers, which did not use integrated circuits, to be chosen for the onboard Apollo Guidance Computer, thanks to the perseverance of engineers at the Massachusetts Institute of Technology:

“… One of the most interesting examples of these decisions concerned the Apollo Guidance and Navigation system, controlled by the Apollo Guidance Computer. Due to size, weight, and power constraints, the Command and Lunar Modules would each carry only one computer, which had to work. What was more, the designers of the computer, at the MIT Instrumentation Laboratory, decided to build the computer using the newly-invented integrated circuit, or silicon “chip” as we now know it. … MIT’s decision did not go unchallenged. Early in the Apollo program, NASA contracted with AT&T to provide technical and managerial assistance for select technical issues. AT&T in turn established Bellcomm, an entity that carried out these analyses. In late 1962, Bellcomm recommended that IBM, not MIT, supply the computers for the Apollo Command and Lunar Modules. … Bellcomm’s recommendation was due in part to IBM’s role as supplier of the computer that guided the Saturn V rocket into Earth orbit and then to a lunar trajectory. The IBM Launch Vehicle Digital Computer did not use integrated circuits, but rather a more conservative circuit developed at IBM called “Unit Logic Device.” What was more, the circuits in the computer were installed in threes—so called “Triple Modular Redundancy” so that a failure of a single circuit would be “outvoted” by the other two.

The engineers at the MIT Instrumentation Lab mounted a vigorous defense of their design and were able to persuade NASA to not use the IBM computers in the Command and Lunar Module. … The Lab worked closely with Fairchild Semiconductor, the California company where the integrated circuit was invented, to ensure reliability. Chips were tested under rigorous conditions of temperature, vibration, contamination, and so on. If a chip failed these tests, the entire lot from which it came from was discarded. … Although Fairchild was offering a line of chips that could be used to make a computer, MIT chose only one type, giving them an ability to test it more thoroughly and to allow the manufacturer to build up more experience making them reliably. No Apollo Guidance Computer, on either the Command or Lunar Modules, ever experienced a hardware failure during a mission.

MIT did not entirely prevail, however, as NASA specified that primary navigation for Apollo would be conducted from Houston, using its array of large mainframe computers (supplied by IBM), with the on-board system as a secondary. The wisdom of that decision was proven during Apollo 13 when the Command Module’s power was lost. In other missions, the on-board computers and navigation systems worked perfectly and worked more in tandem with Houston than as a backup. It also functioned reliably during the burns of the Service Module engine behind the Moon, when there was no communication with Houston. …”

(“Apollo Guidance Computer and the First Silicon Chips”, by Paul Ceruzzi, October 14, 2015, Smithsonian National Air and Space Museum)

In the end, the Apollo Guidance Computer microchips designed by Fairchild Semiconductor were mass-manufactured by Philco, a company based in Philadelphia; nonetheless, the growth of Fairchild Semiconductor and the growth of Silicon Valley benefited significantly from the Apollo space program:

“… Grumman Aerospace, the builder of the Lunar Module, insisted that a small back-up controller be installed in case of a computer failure. Grumman envisioned this “Abort Guidance System” (AGS) as a modest controller intended only to get the crew off the Moon quickly and into Lunar Orbit, where they would be rescued by the Command Module pilot. As finally supplied by TRW Inc., it grew into a general-purpose computer of its own, with its own display and keyboard. Like the Apollo Guidance Computer, it also used integrated circuits. It was tested successfully during the Apollo 10 mission, but it was never needed.

… The area of Santa Clara County, where Fairchild and its competitors were located, began going by the name “Silicon Valley” by the end of the decade. The Apollo contract was not the sole reason for the transformation of the Valley, but it was a major factor. In truth, Fairchild ended up not being the main supplier of Apollo chips after all. Their design was licensed to Philco of suburban Philadelphia, which supplied the thousands of integrated circuits used in all the Apollo Guidance Computers. And because the Abort Guidance System was specified a year or two after the Apollo Guidance Computer, its designers were able to take advantage of newer circuit designs, not from Fairchild but from one of its Silicon Valley competitors, Signetics. …”

(Ceruzzi, October 14, 2015, Smithsonian National Air and Space Museum)

As summarized in the above two quotes, the Apollo program’s use of computers consisted of: 1) IBM mainframe computers in the Houston control center; 2) MIT-developed Apollo Guidance Computers using Fairchild-designed integrated-circuit chips manufactured by Philadelphia-based Philco, onboard the Command and Lunar Modules; and 3) TRW-developed computers, with microchips designed by another Silicon Valley semiconductor company Signetics, for the emergency Abort Guidance System.

As in Part 5 (ii), Philco was a company that produced transistors and computers for the military and for the commercial market, a company where Saul Rosen, a University of Pennsylvania Ph.D. and former Wayne State University professor, had worked before becoming one of the 2 initial founding members of Purdue University’s computer science department – the first such department of any U.S. university.

Also as in Part 5 (ii), Sam Conte, Purdue computer science department’s founding chairman and Rosen’s former Wayne State colleague, had worked at the military aerospace company Space Technology Laboratories; STL was a part of TRW, which was the leading contractor for the U.S. Air Force’s intercontinental ballistic missiles development.

(“Former TRW Space Park, now Northrop Grumman, designated as historic site for electronics and aerospace work”, by John Keller, December 18, 2011, Military & Aerospace Electronics)

The fact that 2 of the technology companies where these 2 founding members of Purdue’s computer science department had worked later played key roles in the Apollo space program indicates that Purdue, in hiring the 2 former Wayne State professors in 1962 to establish the first academic computer science department in the United States, had insight into the computer industry.

In fact, Fairchild Semiconductor kept its distance from directly supplying military as well as other government contracts. In the company’s first 2 years, only 35% of sales were direct government purchases even though, in 1960 for instance, 80% of its transistors and 100% of its integrated circuits ended up in military use; by 1963, less than 10% of its business was contracted directly with the government; and as also quoted earlier, Fairchild Semiconductor did not use government funding for its research and development, even though other companies used government money as their primary source of R&D funding:

“Ever since Fairchild’s inception, the focus on innovation had led the company to reject most direct government contract work. Of course, Noyce knew that without the government—specifically, the Department of Defense—Fairchild Semiconductor would not exist. In the company’s first two years, direct government purchases accounted for 35 percent of Fairchild Semiconductor’s sales, and well over half of the company’s products eventually found their way into government hands. The multimillion-dollar Minuteman contract for transistors cemented the company’s success, and the vast majority of Fairchild’s other early customers were aerospace firms buying products to use in their own government contract work. In 1960, 80 percent of Fairchild’s transistors went to military uses, and fully 100 percent of the company’s early integrated circuits were used in defense functions as well. The company worked closely with military contractors in designing and building its products. …

Though Noyce welcomed the government as a customer and appreciated that federal mandates—such as one issued in April 1964, that required all televisions be equipped with UHF tuners, a law that effectively forced the introduction of transistors into every television in the United States—could benefit Fairchild Semiconductor, he believed there was something “almost unethical” about using government contract money to fund R&D projects. “Government funding of R&D has a deadening effect upon the incentives of the people,” he explained to a visitor in 1964. “They know that [their work] is for the government, that it is supported by government dollars, that there is a lot of waste. This is not the way to get creative, innovative work done.” …

… “A young organization, especially in the electronics industry has to be fast moving,” he explained in 1964. “It runs into problems with the unilateral direction mandated by government work.” By this point, the company was relatively well established, and Noyce reminisced, “We were a hard, young, hungry group. [Our attitude was] ‘We don’t give a damn what [money] you have [to offer], buddy. We’re going to do this ourselves.’” Gordon Moore shared Noyce’s beliefs. Consequently, while other firms in the early 1960s used government contracting as the primary source of R&D funding, less than 10 percent of business at Fairchild was contracted directly by the government in 1963. “And we like it that way,” Noyce hastened to tell a reporter.”

(Leslie Berlin, 2005, Oxford University Press)

As quoted earlier from a book by Walter Isaacson, in the early 1960s the U.S. military’s strategic nuclear missiles that immediately benefited from the invention of the integrated circuit were the land-based Minuteman II and the submarine-based Polaris.

Polaris was developed and produced by Lockheed Missiles and Space Company, founded in 1956 in Sunnyvale in the nascent Silicon Valley region, as cited in a quote in Part 5 (ii) – a company for which Howard Aiken was a regular consultant for over a decade until 1973, the year of his death.

But Aiken’s relationship with Lockheed had started before the creation of this missiles and space branch, with his consulting for the military aircraft manufacturer Lockheed Corporation in Los Angeles, according to an interview of him by Henry Tropp and I. Bernard Cohen in February 1973 – just over 2 weeks before his death, as in Part 5 (ii):

“TROPP:

Well I’ve run across some interesting unpublished documents, many of them in another environment that’s related, in terms of what happened in the computer revolution, and had similar occurrences to Harvard, because people were starting clean. That’s the aircraft industry primarily in the Los Angeles area. Communications were different in that period. The East Coast had its computing environment and the West Coast tended to be separate and distinct and they almost grew up by themselves. But there were links back and forth and one of the questions that I was going to ask you was who were some of the people from that aerospace industry who visited the Harvard Computational Lab?

AIKEN:

The first man that I can think of is Louis Ridenhauser, with whom I was very closely associated. Louis was the Vice-President of Lockheed and he almost clubbed the Board of Directors of Lockheed into starting electronics machinery. And I was associated with him in that venture and was a Lockheed consultant for many years. I was always hopping out there to Los Angeles for a week nearly every month.

Then, I was in and out of Los Angeles for the Bureau of Standards operation.

TROPP:

That’s right. That was at UCLA. Did people come from Northrop and from Hughes in the late forties? Did any of that group come to Harvard?

AIKEN:

I saw Lehmer very frequently.

TROPP:

How about Harry Huskey? Did he come?

AIKEN:

Yes, yes. He was at our place. In fact, he attended one of these symposia.”

(“Interviewee: Howard Aiken (1900-1973) Interviewers: Henry Tropp and I.B. Cohen”, February 26-27, 1973, Smithsonian National Museum of American History)

Per his recollection above, Aiken consulted for Lockheed in close association with then Lockheed vice president Louis Ridenhauser, who convinced the Lockheed board of directors to start “electronics machinery” and was a very important aerospace industry figure among the visitors to Aiken’s Harvard Computation Lab. Aiken went to Lockheed in Los Angeles for a week of consulting nearly every month, and was then in and out of Los Angeles for “the Bureau of Standards operation” at UCLA, where he saw Lehmer very frequently; Harry Huskey also visited his Harvard lab.

As in Parts 5 (i) & (ii), UC Berkeley computational mathematicians Derrick Lehmer and Harry Huskey had been with the Institute for Numerical Analysis, located at UCLA and managed by the National Bureau of Standards, where Lehmer was the director in the early 1950s and Huskey led the development of its SWAC computer completed in 1950; the INA was terminated in 1954 due to McCarthyism-type politics, Lehmer returned to Berkeley in August 1953 and in 1954 recruited Huskey to UC Berkeley.

By this timeline, Aiken first started as a Lockheed consultant before the INA’s closure in 1954, helping Lockheed get into “electronics machinery” development.

I would not doubt that Howard Aiken frequented the INA as recalled in his 1973 interview; however, the comprehensive 1991 book by Magnus R. Hestenes and John Todd on INA history, extensively quoted in Parts 5 (i) & (ii), contains only one reference to Aiken, and in a negative light – about his opposition to the NBS’s development of computers, as recalled by Harry Huskey (Aiken name underline emphasis added):

“The success of the ENIAC had excited mathematicians and other scientists to the possibilities now opening before them. … Government agencies, quick to see the potentials of an electronic computer, were eager to acquire one. However, the field was new, there was no background of experience… Therefore, government agencies were glad to ask the NBS to assist them in negotiating with computer companies. In early 1948, the Bureau had begun negotiating with the Eckert-Mauchly Computer Corporation and the Raytheon Corporation, and later with Engineering Research Associates.

The computers were slow in being developed. New techniques were being tried and often they did not work as well, or as soon, as had been first thought, or hoped. The personnel of the Applied Mathematics Laboratories became impatient with this slow development, and decided that they could build one faster with the help of the Electronics Laboratory at the Bureau. Also, it had become clear that in order to be able to judge effectively the probability of a new technique working they would need more “hands-on” expertise. Dr. Edward Cannon and the author convinced Dr. Curtiss that this “gamble” was worth trying, and Dr. Mina Rees of the Office of Naval Research backed them up. This was in spite of the advice of a committee, consisting of Dr. George Stibitz, Dr. John von Neumann, and Dr. Howard Aiken, which had been asked by Dr. Curtiss to consider the Bureau’s role in the computer field. Their advice had been that the NBS shouldn’t really work on computers, but should confine its work to improving components.

In May 1948, the decision was made at the Executive Council to build a machine for the Bureau’s own use in Washington. … at the October 1948 meeting of the Executive Council it was decided that the Bureau should build a second computer at the Institute for Numerical Analysis, which had by now been located in a reconverted temporary building on the campus of the University of California at Los Angeles. This machine was to be built under the direction of the author, who had joined Curtiss’s group in January 1948. He had spent the previous year at the National Physical Laboratory in Teddington, England, working under Alan Turing with James Wilkinson and others on the Automatic Computing Engine (ACE) project. He had been offered the job there on the recommendation of Professor Douglas Hartree, whom he had met while working on the ENIAC project.”

(“The SWAC: The National Bureau of Standards Western Automatic Computer”, by Harry D. Huskey, in Magnus R. Hestenes and John Todd, Mathematicians Learning to Use Computers: The Institute for Numerical Analysis UCLA 1947-1954, 1991, National Institute of Standards and Technology, U.S. Department of Commerce)

As reviewed in Part 5 (i), the NBS built 2 computers in 1950, the SEAC computer in Washington, D.C. and the SWAC computer at the INA at UCLA; both were electronic computers influenced by John von Neumann’s design, including the “stored program” concept he advocated. The INA operation was overseen by John Curtiss, cited above, who was head of the NBS’s applied mathematics division; Mina Rees, cited above as supportive of the NBS’s interest in computer development, was the head of the mathematics division at the U.S. Navy’s Office of Naval Research, as in Part 5 (ii).

And as Huskey recalled in the quote above, to get into computer development the NBS had to go against the advice of a committee of 3 prominent computer pioneers, including Aiken and von Neumann.

Of the 3 committee members, Howard Aiken and George Stibitz were part of a powerful scientific establishment, led by the U.S. government’s leading science adviser Vannevar Bush discussed in Parts 5 (i) & (ii), that had strongly opposed the development of the first general-purpose electronic computer ENIAC during World War II – the project the other committee member, von Neumann, had played a key role in – and continued to oppose later electronic computer projects:

“The decision to build the ENIAC resulted from the Army’s pressing wartime needs. The established scientific community within the government’s Office of Scientific Research and Development, headed by Vannevar Bush, fiercely opposed the Eckert-Mauchly project. At issue were both the digital design of their proposal (an advanced analog differential analyzer was being built at MIT at the time) and their proposed use of electronic circuit elements. Samuel H. Caldwell, Bush’s MIT colleague, wrote, “The reliability of electronic equipment required great improvement before it could be used with confidence for computation purposes.” Stibitz, at the Bell labs, expressed similar sentiments and suggested using electromechanical relay technology.

After the war, when Eckert and Mauchly turned to more advanced designs, opposition from established figures in computing continued. Howard Aiken, who had struggled at Harvard before the war to seek support for constructing a large electromechanical calculator, opposed their projects, …

Well before ENIAC was finished, Eckert and Mauchly’s group began thinking about a new and improved machine. …

By then, an important addition to the Moore School team had been made. The famous mathematician John von Neumann had learned of the existence of the project and had immediately begun regular visits. Von Neumann, a superb mathematician, had become a powerful voice in the scientific establishment that ran the U.S. war effort’s research and development program. …

Von Neumann’s presence stimulated a more formal and rigorous approach to the design of the successor machine, dubbed the Electronic Discrete Variable Automatic Computer (EDVAC). …”

(Kenneth Flamm, Creating the Computer: Government, Industry and High Technology, 1988, Brookings Institution)

As described above, the powerful U.S. scientific establishment’s leading conservative views were very much like Aiken’s reviewed in Part 5 (ii), namely that electronic components were unreliable and that electromechanical relay machines of the Aiken type should be the choice instead.

Even decades later, in the 1973 interview shortly before his unexpected death, Aiken continued to make the same point, that electronic computers built with vacuum tubes were “absolutely worthless” because they were unreliable, asserting that electronic computers became successful really only after transistors had become available:

“AIKEN:

Yes. You see, a computing machine is not reliable. It’s worthless, absolutely worthless. You can’t trust the results. And that was the reason that we checked everything that we did. Even today, people don’t go to all lengths to check what they did. … Well, we were conscious that we had ________ and the ENIAC didn’t. We had program facilities which they didn’t. We had input and output which they didn’t bother to worry about. … So that we used to say, “What’s all this speed for? What does it accomplish? We get there sooner.”

And yet, you know, I really question if electronic computation would ever have become a great success had it not been for the transistor.”

(February 26-27, 1973, Smithsonian National Museum of American History)

So, even assuming that in 1948 John von Neumann was sympathetic toward the NBS’s intent to develop computers, his view was unfortunately in the minority on the committee making the recommendation – with Howard Aiken and George Stibitz staunch opponents – just as his view had been in the minority among the leading circle of the U.S. scientific establishment in its attitude toward the ENIAC project.

Nonetheless, as the U.S. Army had done in starting the ENIAC project during World War II, in 1948 the NBS decided against the expert committee’s negative advice and started the SEAC computer project in Washington, D.C. and the SWAC computer project at the INA at UCLA.

As in Part 5 (i), von Neumann was among the “distinguished visitors” at the INA at UCLA as reported in the book of Hestenes and Todd; but that detailed INA history account published in 1991 made no other mention of Aiken, i.e., other than his being on the committee opposing NBS computer development – despite his own claim in the 1973 interview that he had frequented the INA.

While I do not doubt the validity of Aiken’s claim, it apparently has been ignored by the former INA members who told their stories in the 1991 book. Given Aiken’s prominence in the computing field, nearly rivaling von Neumann’s, I would infer that he frequented Los Angeles as a computer consultant for Lockheed, as recalled in his 1973 interview, and while there also visited the INA in an informal capacity, and that for this reason his visits were not mentioned in the more recent book on INA history.

But now a related question arises: with his frequent consulting trips to Lockheed for many years since the early 1950s, what contributions did Howard Aiken actually make to Lockheed in electronic computer development – or “electronics machinery” as he called it in the 1973 interview quoted earlier?

The answer may be that, at least during his consultancy in the 1950s, Aiken did not help Lockheed develop any electronic computer.

Firstly, given Aiken’s conservatism, i.e., his view of electronic computers as “absolutely worthless”, he was slow to embrace the vacuum tube technology for electronic computers even though, as in Part 5 (ii), his 1930s Harvard Ph.D. research had been in vacuum tubes.

Secondly, as quoted earlier from Nobelprize.org on the history of the integrated circuit, the transistor had only been invented in 1947, and despite the story previously cited in Part 5 (ii) that Aiken used some transistors with Mark III and Mark IV, in his 1973 interview Aiken seemed to deny it, stating that he had never had anything to do with the transistor, and that the quality of the first transistors his laboratory purchased had been very bad:

“COHEN:

But this was still vacuum tube.

AIKEN:

Mark III was. You see, I never had anything to do with the transistor.

AIKEN:

The first transistors we purchased in the computer laboratory were purchased in France._____________________________________, and they were yea big, and they looked like little plastic things with three wires sticking out of them. That happens to be exactly what they were. They sold those damn things in a number of plastic molds with three wires sticking out of the thing; they had no circuit capability whatsoever at all. And like everybody else, we didn’t find that out until we paid for it.

Then I wrote to the United States Chamber of Commerce in Paris to complain, and several other people must have done the same thing because I got a letter back, saying, “Well, these people are no longer in business.” But they ripped quite a few people with their technique.”

(February 26-27, 1973, Smithsonian National Museum of American History)

Thirdly, in Part 5 (i) I have reviewed the computer development activity of the late 1940s and early 1950s at aerospace companies in Southern California, and Lockheed is not mentioned in that history, as previously quoted in Part 5 (i):

“One other pocket of activity, in historical hindsight, looms in importance as a transporter of computer technology from laboratory to market. Located on the West Coast of the United States and tied closely to the aerospace industry in Southern California, which, in turn, was very dependent on government contracts, this activity focused on scientific and engineering computing. The design of aircraft inherently required extensive mathematical calculations, as did applications such as missile guidance. Early efforts (late 1940s) were primarily housed at Northrop Aircraft and to a lesser extent at Raytheon. Both had projects funded by the U.S. government: Northrop for its Snark missile and Raytheon for a naval control processor, for example. Northrop worked with an instrument supplier (Hewlett-Packard) on early digital projects. Then, in 1950, a group of Northrop engineers formed their own computer company called Computer Research Corporation (CRC). Like ERA, it had a military sponsor the U.S. Air Force for which it built various computers in the first half of the 1950s.”

(James W. Cortada, The Computer in the United States: From Laboratory to Market, 1930 to 1960, 1993, M.E. Sharpe)

As quoted, aircraft design and missile guidance were two applications that “required extensive mathematical calculations” and would benefit from computing power.

But as the above history account indicates, Northrop and Raytheon – though likely not Lockheed – were the centers of computer development activity in that earlier period.

However, Lockheed did purchase 2 IBM 701 computers – IBM’s first commercial computer, in whose development Cuthbert Hurd played a key role, as in Parts 5 (i) & (ii) – for its Southern Californian aircraft business in 1953 and 1954: No. 3 and No. 18 on a full list of 19 IBM 701 computers produced per IBM’s archival records, a list that includes No. 1 for IBM itself.

(“701 Customers”, International Business Machines Corporation)

And fourthly, despite Aiken’s consultancy for Lockheed in Los Angeles on “starting electronics machinery”, in 1956, when Lockheed Missiles and Space Division was founded in the fledgling Silicon Valley region, the Lockheed engineers there relied solely on slide rules and mechanical calculators, as previously quoted in Part 5 (ii):

“The Bayshore Freeway was still a two-lane road, and 275 acres of bean fields adjacent to Moffett Field were purchased in 1956 to become the home of Lockheed Missile & Space Division (now Lockheed Martin Space Systems Company). …

… the first reconnaissance satellite, called Corona, and the Polaris submarine-launched ballistic missiles (SLBMs) were designed and built in just a few short years by the company’s engineers and scientists — armed only with slide rules, mechanical calculators, the basic laws of physics and an abundance of imagination.”

(“Lockheed grew up with Sunnyvale”, Myles D. Crandall, February 25, 2007, Silicon Valley Business Journal)

Given the four aspects of evidence listed above – Aiken’s own dismissive attitudes toward electronic components, his avoidance of the transistor, Lockheed’s lack of computer development activity, and the absence of computer use in the early years at Lockheed Missiles and Space Company in Sunnyvale, where Aiken became a consultant at some point – Lockheed most likely did not develop electronic computers during the 1950s, and Aiken did not help in that regard.

Then the next logical question is: what did Howard Aiken actually do in his consulting for Lockheed, which he later referred to in his 1973 interview, quoted earlier, as the venture of “starting electronics machinery”?

An answer may be that Aiken helped develop special circuits for the application of missile guidance, i.e., for the Polaris nuclear missiles developed and produced by Lockheed Missiles and Space Company.

This is because, as quoted earlier, in the early 1960s the Polaris missiles made extensive use of microchips following the integrated circuit’s invention by Texas Instruments’ Jack Kilby and Fairchild Semiconductor’s Robert Noyce.

In fact, the title of a 1962 conference co-organized by Aiken and William Main – a person with whom Aiken and Hurd wanted to start a computer company in 1970, as mentioned in Part 5 (ii) – and hosted by the Lockheed Missiles and Space Company in Sunnyvale, previously cited in Part 5 (ii), referred to “Switching theory in space technology”.

(Howard Aiken and William F. Main, eds., Switching theory in space technology: [Symposium on the Application of Switching Theory in Space Technology, held at Sunnyvale, California, February 27-28 and March 1, 1962], 1963, Stanford University Press)

What is switching theory? It is the theory of switching circuits that route data for rapid decision making – akin to the computer’s logical functioning, but not necessarily requiring a general-purpose computer to accomplish:

“Switching theory, Theory of circuits made up of ideal digital devices, including their structure, behaviour, and design. It incorporates Boolean logic (see Boolean algebra), a basic component of modern digital switching systems. Switching is essential to telephone, telegraph, data processing, and other technologies in which it is necessary to make rapid decisions about routing information.”

(“Switching theory”, by the Editors of Encyclopædia Britannica, Encyclopædia Britannica)

One can understand that “rapid decisions” were critically needed on flying missiles as they were on a telephone network.
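The technology-independence at the heart of switching theory can be illustrated with a small sketch: the Boolean description of a circuit is the same whether its switches are relays, vacuum tubes, or transistors. The example below is a hypothetical illustration of mine, not any actual Lockheed circuit; it describes a two-out-of-three majority voter, a classic switching circuit for deriving a “rapid decision” from redundant inputs:

```python
# Illustrative sketch: a switching circuit described purely by its Boolean
# logic, independent of whether the switches are relays, vacuum tubes, or
# transistors. (Hypothetical example, not an actual Lockheed circuit.)

def majority(a: bool, b: bool, c: bool) -> bool:
    """Two-out-of-three majority vote: output is true when at least
    two of the three inputs are true -- a standard redundancy circuit."""
    return (a and b) or (a and c) or (b and c)

# The same truth table holds for any physical realization of the gates.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            print(int(a), int(b), int(c), "->", int(majority(a, b, c)))
```

The point of the sketch is the one Aiken makes in the interview quoted below: the logic diagram, not the physical switch, is the essence of the design.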

In his 1973 interview, Aiken emphasized that switching theory did not have to rely on transistors, because the switching circuits could also be made of mechanical relays or vacuum tubes:

“AIKEN:

This was the beauty of switching theory, you see. You had switches and you could take a relay or a vacuum tube or a transistor or anything else, a magnetic core, and make diagrams and draw pictures for a machine completely on the logic of it. …”

(February 26-27, 1973, Smithsonian National Museum of American History)

So the various pieces of evidence suggest that in the 1950s before the integrated circuit’s invention, Howard Aiken did not get to develop electronic computers as a Lockheed consultant but helped the company develop switching circuits for missile guidance, probably using electromechanical relays at first – his conservative Harvard Mark I specialty – and later using integrated circuits during the 1960s.

In an earlier quote from Walter Isaacson’s book on the innovators of the digital revolution, there was the statement that “computers had to be made small enough to fit into a rocket’s nose cone”.

In that regard there is an important point of distinction related to Aiken’s recognized contributions to the computer field: those “computers” were not necessarily the general-purpose computers one commonly knows, i.e., not of the “von Neumann architecture” discussed in Part 5 (i), but special switching circuits built with electronic components; such circuits would certainly be “electronics machinery” as Aiken called them in his 1973 interview, and have sometimes been referred to as computers of “Harvard architecture”, also known as “Aiken architecture”, as previously quoted in Part 5 (ii):

“… Aiken is sometimes held to be reactionary because he was always wary of the concept of the “stored program” and did not incorporate it into any of his later machines. This stance did put him out of step with the main lines of computer architecture in what we may call the post-Aiken era, but it must be kept in mind that there are vast fields of computer application today in which separate identity of program must be maintained, for example, in telephone technology and what is known as ROM (“read-only memory”). In fact, computers without the stored-program feature are often designated today (for instance, by Texas Instruments Corporation) as embodying “Harvard architecture,” by which is meant “Aiken architecture.””

(“Howard Hathaway Aiken”, by J. A. N. Lee, 1995, Institute of Electrical and Electronics Engineers Inc.)
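To make the distinction concrete, the following toy sketch (my own illustration, not a model of any historical machine) runs a trivial instruction set from a read-only, ROM-like program store kept separate from the writable data store – the “separate identity of program” that the quote above associates with Harvard, or Aiken, architecture; in a von Neumann-style machine, by contrast, instructions and data share one memory, so a program can modify itself:

```python
# Toy sketch of Harvard-style organization (illustrative only): the
# instruction store is separate and read-only, so the program can never
# be overwritten as data -- unlike a von Neumann machine, where
# instructions and data live in one shared memory.

def run_harvard(program, data):
    """Run a trivial instruction set on separate program/data stores.
    Instructions: ("ADD", i, j) does data[i] += data[j]; ("HALT",) stops."""
    program = tuple(program)   # read-only, ROM-like instruction store
    data = list(data)          # writable data store
    pc = 0                     # program counter
    while program[pc][0] != "HALT":
        op, i, j = program[pc]
        if op == "ADD":
            data[i] += data[j]
        pc += 1
    return data

result = run_harvard([("ADD", 0, 1), ("ADD", 0, 1), ("HALT",)], [1, 2])
# data[0] accumulates data[1] twice: result is [5, 2]
```

The design choice the sketch highlights is exactly the one in read-only-memory applications mentioned in the quote: keeping the program in a separate store guarantees it cannot be corrupted by the computation it controls.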

Unfortunately for Howard Aiken, in 1959, just as the integrated circuit was invented and a giant leap of technological advancement would soon take place in the applications of switching circuits and computers in the missile and space fields, his close friend and Lockheed sponsor, Lockheed vice president Louis Ridenhauser, who had gotten the company’s board to start “electronics machinery” as cited earlier, unexpectedly died, as Aiken later recalled in his 1973 interview:

“AIKEN:

Have you ever heard Louis Ridenhauer’s definition of an airplane company? He says it’s a place where well-informed brash old men sit around all day in conference discussing irrational things. I was very, very fond of Louis. I was with him the evening he died, in Washington. It was a very unusual thing.

There was a young man who was interested in very low temperature devices in computers— what was his name?

TROPP:

I don’t know.

COHEN:

I do.

AIKEN:

Well at any rate, he and Ridenhauer and I talked ____________________________________________, and we went to Washington. I met Ridenhauer and we bummed around Washington all evening with him. And the last thing he said to me before he left was, “You’re going to the Cosmos Club. I wish I was going to the Cosmos Club but I can’t go because I’m now Vice-President of Lockheed and I’ve got that two-room suite,” and he shoved off and ________________________________________. And the next morning, I got up at a quarter to eight and Ridenhauer didn’t show up and this other man didn’t show up. Somebody wanted to know where they were and I said, “Well don’t worry about Louis, he probably has a hangover this morning.” And a couple hours later, the Manager of the Statler called up looking for me and telling me that Louis had died in bed. And almost immediately after that, we got a telegram that ________ was dead, so we just folded the meeting and everybody went home. It didn’t seem very worthwhile to go on.

COHEN:

You know who he was, don’t you?

AIKEN:

He was Dean of the Graduate School at the University of Illinois when he was 28 or some thing like that, after having been Chief Scientist in the United States Air Force.”

(February 26-27, 1973, Smithsonian National Museum of American History)

Recall from Part 5 (ii) that both Howard Aiken, who passed away at 73 in a St. Louis hotel during a March 1973 consulting trip for Monsanto, and his former Harvard underling, U.S. Navy Rear Admiral Grace Murray Hopper, who passed away at 86 on New Year’s Day 1992, died in their sleep.

Thus in his February 1973 interview, two weeks before his own unexpected death, Aiken unwittingly told of an earlier precedent as quoted above: his Lockheed vice-president friend Louis Ridenhauser had died in bed in a two-room hotel suite in Washington, D.C., after a night of drinking with Aiken and another man.

By Aiken’s description, his friend Ridenhauser, a former “Dean of the Graduate School at the University of Illinois” and former “Chief Scientist in the United States Air Force”, was probably quite young when he died.

His friend’s age and year of death can be found in a different source, where his name is identified as Louis Ridenour, and the biography includes positions at the University of Illinois and in the U.S. Air Force similar to what Aiken said:

“1911, Nov. 11

Born, Montclair, N. J.

1932

B.S. in physics, University of Chicago, Chicago, Ill.

1936–1938

Instructor in physics, Princeton University, Princeton, N.J.

1938

Ph.D. in physics, California Institute of Technology, Pasadena, Calif.

1938–1941

Professor of Physics, University of Pennsylvania, Philadelphia, Pa.

1941–1944

Assistant director, Radiation Laboratory, Massachusetts Institute of Technology, Boston, Mass.; headed team that developed SCR 584 gun laying radar

1942–1946

Consultant to the secretary of war; field duty as chief radar advisor in North African, European, and Pacific theaters of World War II

1947–1951

Professor of physics and dean of graduate college, University of Illinois, Urbana, Ill.

Editor-in-chief, Radiation Laboratory series of technical books

1948–1952

Chairman, scientific advisory board to the chief of staff, United States Air Force

1951–1954

Vice president, International Telemeter Corp., Los Angeles, Calif.

1952–1953

Visiting professor of engineering, University of California, Los Angeles, Calif.

1955

Director of program development, Missile System Division, Lockheed Aircraft Corp., Burbank, Calif.

1955–1957

Director of research, Missile System Division, Lockheed Aircraft Corp., Burbank, Calif.

1957–1959

Assistant general manager and chief scientist, Lockheed Aircraft Corp., Burbank, Calif.

1959

Vice president and general manager, Electronics and Avionics Division, Lockheed Aircraft Corp., Burbank Calif.

1959, May 21

Died, Washington, D.C.”

(“Ridenour, Louis N. (Louis Nicot), 1911-1959”, Social Networks and Archival Context)

Louis Ridenour was only 47 when he died in May 1959 – several years younger than John von Neumann, another top scientific adviser to the U.S. Air Force, who was 53 at his death in February 1957.

The timing of Ridenour’s death, i.e., in the year he was promoted to a Lockheed vice presidency, is reminiscent of the 1997 death of Diana Forsythe as in Part 5 (ii), whose late father George had been the Stanford computer science department’s founding chairman and the most influential person in the emergence of computer science as an academic discipline: Diana drowned during an Alaska backpacking trip in the year she was given an associate professorship at the University of California, San Francisco, following a string of short-term academic jobs in the Stanford area.

Aiken’s 1973 interview quoted earlier mentioned that another person died around that time, whose name has not been disclosed in the online copy of the interview cited.

As a matter of fact, 3 days after Ridenour’s May 21 death in Washington, D.C., a prominent American died in his sleep in the U.S. capital area, at the same hospital where John von Neumann had died; but like von Neumann and unlike Ridenour, U.S. Secretary of State John Foster Dulles had been stricken with cancer:

“Washington, May 24—John Foster Dulles died in his sleep Sunday morning, 39 days after his resignation as secretary of state.

The 71 year old statesman succumbed to cancer complicated by pneumonia, at 7:49 a. m. The end came quietly. He had been comatose under pain-relieving drugs for more than a week.

At the bedside in the Presidential suite of Walter Reed army hospital were his wife of 47 years, Janet; his sons, John and Avery, the latter a Jesuit priest; his brother, Allen, and his sister, Eleanor.

Ike Leaves Farm

The news was flashed first to President Eisenhower, spending a week-end at his Gettysburg, Pa., farm. He cut short his stay and returned to Washington in mid-afternoon. …”

(“Succumbs in Sleep during Cancer Fight: Military Service Scheduled at Arlington”, by Willard Edwards, May 25, 1959, Chicago Daily Tribune)

As discussed in Part 2, the mathematician John Nash, like Dulles a Princeton University alumnus, was in his first psychiatric committal at the time of Dulles’s death, having failed in his attempts to start a world peace movement at MIT and been diagnosed as a “paranoid schizophrenic” instead.

But at least Dulles and Nash were trying hard on world politics, rather than drinking heavily.

Ridenour, with a Ph.D. in physics from the California Institute of Technology in Southern California, had a distinguished academic career that included faculty positions at Princeton and the University of Pennsylvania, wartime military research management at MIT, and the graduate deanship at the University of Illinois, according to the biography quoted earlier; his chairmanship of the scientific advisory board to the Air Force chief of staff was what Aiken referred to as his having been U.S. Air Force “chief scientist”.

However, there is a time discrepancy between Ridenour’s period of service at Lockheed and Aiken’s recollection of it in his 1973 interview: the biography quoted indicates that Ridenour started working at Lockheed in 1955; but Aiken said, as quoted earlier, that he had visited Lockheed in Los Angeles regularly as a consultant and then also frequented the “Bureau of Standards operation” at UCLA, i.e., the Institute for Numerical Analysis, which as mentioned earlier was terminated in 1954 – before Ridenour had even started working at Lockheed.

Perhaps Aiken’s recollection of the history with Ridenour was simply unclear: he also gave the wrong chronological order for Ridenour’s tenures as graduate dean of Illinois and as chief scientist of the Air Force.

Or Aiken could have already been a consultant at Lockheed even before Ridenour’s arrival.

After his Air Force chief scientist stint Ridenour was a visiting professor of engineering at UCLA during 1952-1953, and the two could have connected there at the INA.

Regardless, consistent with my earlier analysis’s conclusion that Aiken’s consultancy work at Lockheed was helping develop switching circuits for missile guidance, his close friend Ridenour’s Lockheed management responsibilities were in the missile field from 1955 to 1957, as a director within the Missile System Division – I note that Lockheed Missiles and Space Company was founded in Sunnyvale during this period.

But Ridenour’s persuading the company board to start “electronics machinery”, a venture Aiken was “associated with” as described in Aiken’s 1973 interview, may have come later in 1957 when Ridenour became Lockheed chief scientist, and surely by 1959 when he became Lockheed vice president in charge of the Electronics and Avionics Division, not long before his sudden death which Aiken was also ‘associated with’.

Before his death, Ridenour had also played a key role in Lockheed’s winning an Air Force satellite contract that would help put Lockheed in a “commanding position in the aerospace market”:

“… Before his untimely death, Ridenour won for Lockheed the Air Force satellite contract that would contribute greatly in the years afterward to that firm’s commanding position in the aerospace market.”

(R. Cargill Hall and Jacob Neufeld, eds., The U.S. Air Force in Space, 1945 to the 21st Century, September 21-22, 1995, Proceedings, Air Force Historical Foundation Symposium)

I would think that this Air Force satellite contract was a business basis for Lockheed Missiles and Space Company’s sponsorship of a 1962 symposium co-organized by Aiken, cited earlier, on “switching theory in space technology”.

In what could be another, if partial, credit due Ridenour, in 1959, namely the year he was promoted to be Lockheed vice president and general manager of its Electronics and Avionics Division, Lockheed started an electronics business, taking over an engineering company in Plainfield, New Jersey, and renaming it Lockheed Electronics Company:

“In 1953, Stavid Engineering built an 80-acre industrial site that sits in the boroughs of Watchung and North Plainfield, N.J. Lockheed Corporation, a predecessor to Lockheed Martin Corporation, acquired the engineering company six years later.

From 1959 to 1989, Lockheed Electronics Company manufactured, tested and assembled electronic components at the site. Lockheed closed the operation in 1989, …”

(“NORTH PLAINFIELD, NEW JERSEY”, Lockheed Martin Corporation)

Rather ironically in light of the computer-pioneer rivalry between Aiken and von Neumann mentioned earlier and detailed in Part 5 (ii), in 1959 the new Lockheed Electronics Company was set up not in California where Aiken was a Lockheed consultant in Sunnyvale, but in New Jersey where von Neumann had led his IAS computer project in Princeton.

This history, as reviewed so far, suggests that in 1961, when Howard Aiken took early retirement from Harvard to become a businessman in the industry, he had specific reasons for wanting to partner with Cuthbert Hurd to start a new computer company – reasons beyond his general business ambition or his interest in making computers smaller as discussed in Part 5 (ii).

Firstly, despite Aiken’s years of consulting for Lockheed on “electronics machinery”, the company was still not doing computer development, and its new electronics branch had been set up in New Jersey, separate and away from California where Aiken did his missiles-and-space consulting. The recent invention of the integrated circuit had opened greater potential for computer development, which was Aiken’s specialty and likely more attractive to him than consulting on switching theory for missile and space applications; Aiken’s predicament thus resembled William Shockley’s focus on difficult telephone switching applications in 1957, when eight of Shockley’s disciples decided to split with him – except that in Aiken’s case it was Lockheed’s decision, not his own choice.

Secondly, Aiken’s close friend, vice president Louis Ridenour at Lockheed headquarters, had died in 1959, and at Lockheed Missiles and Space Company Aiken now socialized with persons lower in stature and lesser in influence – such as, as discussed in Part 5 (ii), George Garrett, Hurd’s former Oak Ridge National Laboratory colleague and Lockheed Missiles “Director of Computer Activities” according to Hurd, or “director of information processing” per Garrett’s obituary. Hurd, the director of the Electronic Data Processing Machines Division at the computer company IBM, would by this time be an important connection for Aiken.

Thirdly, as discussed earlier, Hurd’s IBM management role meant the potential of support for their new microcomputer company venture by the leading computer company which had made Aiken’s Harvard Mark I project possible as in Part 5 (ii) – I note that IBM’s largest shareholder and influential board member Sherman Fairchild had helped Shockley’s “Traitorous Eight” found Fairchild Semiconductor in the fledgling Silicon Valley where Aiken was doing his Lockheed consulting.

And lastly, given the prominence of his computer-pioneer status and his now-established industry consulting expertise, if Aiken wanted to undertake computer development in the industry he could and should launch a new company, since Lockheed had set up its electronics company at a faraway location, whereas “an Assistant Director of Engineering at Lockheed” in Sunnyvale was already doing computer design work, as mentioned earlier, toward starting a new computer company with him and Hurd.

So Howard Aiken must have felt that it was a good time to do it, except that the new computer venture did not then materialize.

The scenario put forth in Part 5 (ii) is that in the early 1960s Hurd was unhappy with how little Aiken would offer him: Aiken wanted a new company of his own and wanted Hurd only to “help”, i.e., likely without giving Hurd a good share of the ownership – especially in light of the historical precedent of Harvard and Aiken the inventor attempting to claim sole credit for Mark I even though it was built by IBM.

Indeed, even by the time of his 1973 interview shortly before his death, Aiken continued to express scorn for IBM over its lack of mathematical ability at the time of the Mark I project, reminiscing that IBM personnel did not know how to implement basic arithmetic division and that he had to invent a technique for it overnight in his hotel room:

“AIKEN:

I went up to Endicott after this began to be formalized at IBM over the years. During the conversations with Lake and Durfee about what kind of machine this was that we were talking about. It was, oh I guess I made eight or ten such trips before I learned that IBM didn’t know how to divide. And that was a terrible blow. They could add, they could multiply, but by god, they didn’t know how to divide. It was almost like the bottom had dropped out. Maybe I ought to get back to Monroe—they knew how to divide.

So I stayed at a hotel in Binghamton and I can’t remember the name of it. That night when I found out that they didn’t know how to divide, I was up nearly all night, and it was that night that I invented the technique of dividing by computing by reciprocals. This is a scheme by which you can compute reciprocals, knowing how to add and multiply and you know how to do it. You could add and multiply by computing reciprocals and all you need is a first guess, and a first guess, when you are dealing with an independent variable that proceeds by fixed intervals ________, the first guess is always the reciprocal of the last name, and the convergence is beautiful. You double the number of digits of precision at each iteration.

So it was around three o’clock in the morning when this all came clear. The next day I walked into IBM and said “Well, you don’t need to worry about not being able to divide because I know how to divide with an adder and a multiplier.” And I started to derive this expression using the ___________________. And it became very clear that this was a waste of time because these men couldn’t understand this. They knew no mathematics. Even high school algebra was too much for them.”

(February 26-27, 1973, Smithsonian National Museum of American History)
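The division-by-reciprocals technique Aiken describes is what is now known as Newton–Raphson reciprocal iteration: it needs only addition and multiplication, and, exactly as he recalls, the convergence is quadratic, doubling the number of correct digits at each step. A minimal sketch of the idea (my own illustration, not Aiken’s original derivation):

```python
def reciprocal(a, guess, iterations=6):
    """Approximate 1/a using only add and multiply:
    x_{n+1} = x_n * (2 - a * x_n).
    Convergence is quadratic, doubling the number of correct
    digits per iteration, as Aiken recalled; the initial guess
    must satisfy 0 < guess < 2/a."""
    x = guess
    for _ in range(iterations):
        x = x * (2.0 - a * x)
    return x

# Division a/b then reduces to a * (1/b) on an adder-multiplier machine.
print(17.0 * reciprocal(7.0, 0.1))  # close to 17/7 ≈ 2.428571...
```

With a relative error of 0.3 in the first guess above, six iterations drive the error below double-precision rounding, since the relative error is squared at every step.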

Aiken also asserted that had Thomas J. Watson, Sr. – IBM president in the era of the Mark I project as in Part 5 (ii) – not relinquished authority to his son soon enough, IBM would not have become a top computer company:

“TROPP:

No, but the vision didn’t start until much later than the SSEC, it was in the 1950’s.

COHEN:

Oh, I see. That’s their statement.

TROPP:

It started after the defense calculator came into being.

COHEN:

Oh, yes.

AIKEN:

IBM got going in the computer business when young Tom was made president.

TROPP:

That’s right.

AIKEN:

And as he told me one time, the first thing that he did when he became President of the Company, replacing his father, was to sense the embarrassment the Corporation faced because Remington Rand was getting all the credit for going ahead. So that if the senior Watson had remained active for say, another two years, chances are pretty good that Sperry Rand would be the big computer company today. It was changed by young Tom.”

(February 26-27, 1973, Smithsonian National Museum of American History)

In any case, as cited in Part 5 (i), Watson, Sr. died 2 months after handing over IBM’s reins to Watson, Jr. in 1956.

The “SSEC” mentioned in the above quote was IBM’s own version of an electromechanical relay machine, made in collaboration with Columbia University after the company’s dispute with Harvard and Aiken over the Mark I credit:

“… the SSEC contained over 21,000 electromechanical relay switches with physically moving parts.

… In these features of its physical layout, quite apart from its hardware and system architecture, the SSEC was the ancestor of IBM mainframe systems to come—after the ASCC (Automatic Sequence Controlled Calculator) or Mark I, that is. IBM had built that machine with researchers at Harvard using a team led by Howard Aiken and including Grace Hopper. T.J. Watson, Sr. believed that IBM had subsequently been given insufficient credit for the machine. Competition with its own Harvard Mark I was the impetus for IBM’s building the SSEC—and the new computer was pointedly created through a new arrangement with another Ivy League institution, Columbia, in a collaboration between IBM engineering in Endicott, New York, and the newly formed (1945) Watson Scientific Computing Laboratory at Columbia.

The Lab at Columbia was directed by the first Director of Pure Science at IBM, Wallace Eckert (no relation to Presper Eckert, the well-known developer of the ENIAC), an astronomy professor at the university. In 1944 T. J. Watson, Sr. recruited Eckert as the first IBM employee with a PhD and Eckert helped to hire a team that included the second PhD at IBM, another astronomer, Herb Grosch, as well as Robert R. (“Rex”) Seeber, who had worked on the Harvard Mark I. …”

(Steven E. Jones, Roberto Busa, S. J., and the Emergence of Humanities Computing: The Priest and the Punched Cards, 2016, Routledge)

The above-quoted history account does corroborate that at the time of the Mark I project IBM personnel likely knew very little advanced mathematics or advanced science: only after the slight by Harvard and Aiken did IBM hire its first two Ph.D.s – recruited through Columbia.

As in the last quote from Aiken’s 1973 interview, after the SSEC machine IBM developed the “defense calculator”. As in Part 5 (i), the IBM 701 Defense Calculator was IBM’s first commercial line of general-purpose computers, made during Watson, Sr.’s time; as cited earlier, Lockheed bought 2 of them in 1953 and 1954; and most importantly, as in Part 5 (ii), Cuthbert Hurd – who had a Ph.D. degree, had been head of technical research at the Oak Ridge National Lab, then founded IBM’s Applied Science Department, and later became director of IBM’s Electronic Data Processing Machines Division – was a driving force behind the development of the IBM Defense Calculator.

Therefore, in co-launching a new computer company with Hurd, Howard Aiken would be collaborating with the best of IBM’s managerial talent in applied science and electronic computer development, and also with the best of Lockheed Missiles’ engineering talent in the person of the “assistant director of engineering” already doing computer design work for this potential new venture.

That might have been the case, but Aiken likely did not regard it as much: in the last quote from his 1973 interview, when the interviewer Henry Tropp mentioned the Defense Calculator, Aiken did not even respond to it, let alone mention Hurd, crediting only Thomas Watson, Jr. for turning around IBM’s fortunes.

Howard Aiken’s sense of self-importance went well beyond naming his new company after himself. Later in his 1973 interview, he cited as an example how busy he had been as president of Howard Aiken Industries, Inc.: he did not have the time to personally accept a prestigious scientific award at the Franklin Institute after attending a dinner – presumably the award dinner – and later never took the time to view that award, or to wear any of his awards:

“TROPP:

Have you ever had occasion to wear all those awards?

AIKEN:

I’ve never had occasion to wear one of them.

TROPP:

This looks like the Franklin Medal.

AIKEN:

That is the Edison Medal of the Institute of Electrical Engineers. I got an award from the Franklin Institute.

TROPP:

Yes, it’s listed here.

AIKEN:

That’s in the Harvard Archives.

TROPP:

It’s the John Price Award.

AIKEN:

Yes. The night I was to go and get that award, I also had to fly away to Madrid, and so I went to St. Louis and stayed for a short time at the Franklin Institute, and then out to the airport, and my plane that I had as President of Aiken Industries at the time, flew to Boston to meet TWA to go on to Madrid. I changed out of my dinner clothes on the plane. We left Philadelphia in the aero commander after TWA left New York, and we got there and so I made the flight. So Tony Ottinger picked up the Franklin Institute Award in my place and took it to Harvard and I’ve never seen it myself.”

(February 26-27, 1973, Smithsonian National Museum of American History)

As told in the quote above, Aiken visited the Franklin Institute in Philadelphia briefly for an award event, and most likely attended a formal dinner in “dinner clothes”, but did not have the time to accept the award, instead hurrying in his Aiken Industries’ ‘presidential plane’ to catch a TWA flight to Madrid, Spain; his former Harvard Ph.D. student Tony Oettinger, previously mentioned in quotes in Part 5 (ii), accepted the award on his behalf and took it to Harvard, and Aiken never bothered to view it afterwards.

The John Price Award given to Aiken was not the Franklin Institute’s best-known award; that would be the Franklin Medal mentioned in the above quote. In 1964, the year Aiken was one of four co-recipients of the John Price Award, also known as the John Price Wetherill Medal, the Franklin Medal was awarded to Gregory Breit.

(“John Price Wetherill Medal”, and, “Franklin Medal”, Wikipedia)

The physicist Gregory Breit led the early stage of atomic bomb development during World War II, resigning in 1942; the role was later taken over by the physicist Robert Oppenheimer, under whose leadership the Manhattan Project succeeded, and two of the bombs were then used against Japan in war.

(“Gregory Breit”, and, “J. Robert Oppenheimer”, Atomic Heritage Foundation)

In my review in Part 5 (i) it was noted that John von Neumann and the physicist Enrico Fermi, both important members of the Manhattan Project, died of cancer at the same age of 53, whereas Oppenheimer later died “10 years older at the age of 63”. But a more careful inspection shows that Oppenheimer was about 2 months short of his 63rd birthday of April 22 when he died on February 18, 1967.

Though it was not “10 years older”, I note that Oppenheimer’s death was precisely 10 years and 10 days past von Neumann’s death on February 8, 1957.

(“John von Neumann”, Atomic Heritage Foundation)
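The “10 years and 10 days” observation is easy to verify with date arithmetic on the two death dates given above; a quick sketch:

```python
from datetime import date, timedelta

von_neumann_death = date(1957, 2, 8)
oppenheimer_death = date(1967, 2, 18)

# Ten calendar years after von Neumann's death...
ten_years_later = von_neumann_death.replace(year=von_neumann_death.year + 10)

# ...plus ten more days lands exactly on Oppenheimer's death date.
print(ten_years_later + timedelta(days=10) == oppenheimer_death)  # True
```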

And there were other interesting numerical coincidences: the early computer pioneer Howard Aiken’s age of 73 at death was 20 years more than “father of computers” John von Neumann’s age of 53 at death; and the early atomic bomb development leader Gregory Breit’s age of 82 at his death on September 13, 1981 (he was born on July 14, 1899) was also 20 years more than “father of the atomic bomb” Robert Oppenheimer’s age of 62 at death.

Considering that the other co-recipients of the 1964 John Price Award, John Eugene Gunzler, John Kenneth Hulm and Bernd T. Matthias, were all physicists awarded for achievements in the research frontier of “superconductive materials”, Howard Aiken was in good company indeed for an honor in memory of “America’s first scientist, Benjamin Franklin”.

(“Dr. Bernd T. Matthias to be honored by Franklin Institute”, October 7, 1964, UC San Diego, and, “JOHN EUGENE GUNZLER”, “JOHN KENNETH HULM”, and, “MISSION & HISTORY”, The Franklin Institute)

But the ambitious Howard Aiken probably did not view it that way.

In Part 5 (ii) I have offered the opinion that Aiken should have been given the Association for Computing Machinery’s A. M. Turing Award – considered the highest honor in computer science – had he not changed his career from academia to business.

However, in his 1973 interview shortly before his unexpected death, Aiken stated that from the beginning he had opposed the establishment of an organization like the ACM, that von Neumann had agreed with him, and that he now still opposed it:

“COHEN:

The other person that I was interested in, I know that you had at least two contacts with him, but for reasons that are fairly obvious, I think it would be fascinating to know what your relations with him were, when you first heard of him, was von Neumann. Now I know that Warren Weaver sent you some computations for von Neumann for the Mark I on inclusion. Secondly, you were with him on that National Academy of Sciences Commission. I have no idea what kind of relations you might have had or where you first heard of him.

AIKEN:

Well, let’s take the Commission first. Who all was in that Committee, let’s see.

TROPP:

There was you and von Neumann and Stibitz. Was Sam Alexander on that?

AIKEN:

Maybe. But there was a mathematician and what was his name? He’s presently at Miami University.

TROPP:

Oh, John Curtiss.

AIKEN:

That’s right— John Curtiss. John Curtiss proposed that we should all get together and start an association for people interested in computing machines, a new scientific society. And I said, “No, we shouldn’t do that because computation was a universal thing.” I said that what we should do was to help the mathematical economists to publish papers in their journals, using computational techniques and astronomers with theirs and the physicists with theirs and so on. That our best interests and the best interests of the scientific community as a whole would be better served to assist everybody to use machinery and to publish their work. And after all, we were tool makers, and it didn’t make very good sense for all of us specialists to get together and talk about the tools they used.

Von Neumann agreed with that completely, and so this proposal of Curtiss’s was voted down. Curtiss then went out and formed the Association for Computing Machinery.

TROPP:

That was the Eastern Association, the original one.

AIKEN:

Yes. And neither John von Neumann during his lifetime, nor I have ever joined the Association for Computing Machinery. We opposed it and I still oppose it. Now, the basis for the opposition today is, of course, far less, because now you get all the machine builders and there are thousands and thousands of specialists, and they do have something going for them. But as of the time that it was proposed, it didn’t make very good sense, and as I say, von Neumann refused to have anything to do with it and I’m just as stubborn as von Neumann, and I never joined.

COHEN:

Did he ever come up to your Laboratory?

AIKEN:

Yes, many, many times. We did several problems for him. Then, we had a lot going back and forth. As long as he was using computing machines and helping to lead us in the discovery of numerical methods, which in some of the problems that we did, Dick Block did the programming by the way…”

(February 26-27, 1973, Smithsonian National Museum of American History)

Oh well, in that case, there was no reason for the ACM to award Aiken the Turing Award – when Aiken himself did not even see merits for ACM’s existence.

In this context, I begin to wonder why the Turing Award to Frederick Brooks, founding chairman of the University of North Carolina at Chapel Hill’s computer science department and “one of Aiken’s most devoted disciples” as in Part 5 (ii), came rather late, in 1999, when the most important of his achievements had been made at IBM in the early 1960s as the leader of its development team for the important and successful System/360 computer.

I notice that 1999 was 2 years after 1997, the year John Weber Carr, III died of cancer as in Part 5 (ii), who had in 1956 become the first current academic – a University of Michigan mathematics associate professor – to serve as ACM president, and in 1959 become UNC Chapel Hill’s computation center director, but then left in 1962 before UNC Chapel Hill invited Brooks from IBM in 1964 to found what was the second academic computer science department in the U.S.

I cannot confidently assert that Carr opposed awarding ACM’s Turing Award to Brooks, or that Brooks could not get it until after Carr’s death because of Carr’s stature and influence as a former ACM president; but given how opinionated and stubborn some of these personalities were – Aiken in particular, as extensively illustrated – I would not be surprised if this scenario was indeed the case.

A consequence of such egotism on Howard Aiken’s part was that, in the scenario when he and Cuthbert Hurd considered launching a computer company together in the early 1960s, if he viewed himself like William Shockley and treated Hurd like a disciple, or if he viewed himself like Sherman Fairchild and treated Hurd like only a manager under him, Hurd likely would not agree and would not accept less than a real share of the ownership.

The lack of ownership equity had in fact been an important reason why in 1956-1957 the 8 disciples of William Shockley mutinied, left Shockley Semiconductor and founded Fairchild Semiconductor with financing from Sherman Fairchild. Besides their opposition to Shockley’s difficult management style and orthodox scientific focus, as previously discussed, they aspired to higher compensation for their work, including not only salaries but also stock options; and in making Fairchild Semiconductor a success, the 8 started the trend of venture capital financing for technology start-ups founded by Silicon Valley scientists and engineers:

“When William Shockley founded his company in Stanford University’s Research Park in 1955, no one knew then that he was starting an industry that was to give a whole region its name: Silicon Valley. Shockley chose Palo Alto as the site for his company partly because it was where he grew up and his mother still lived there, partly because he was aware that entrepreneurial electronics companies were hatching there, and partly because Arnold Beckman, his financial backer and founder of Beckman Instruments, had located one of his divisions in the Stanford Research Park.

One of Shockley’s motivations for starting his transistor company in 1955 was his conclusion that “the most creative people were not adequately rewarded as employees in industry.” Shockley attracted to his transistor company the brightest and best young men, who formed the nucleus of entrepreneurial scientists and engineers that built the semiconductor industry in Silicon Valley. But they didn’t make their fortunes at Shockley’s company. Wealth came later when they started and built their own companies.

In 1957, eight of them left to found Fairchild Semiconductor. Robert Noyce was one of them. According to Noyce, one of their principal reasons for leaving was that they could get equity in a startup company rather than simply working for a salary for the rest of their lives. They weren’t disappointed. Seven years later, each of the eight received about $250,000 when Fairchild Semiconductor was bought out by its parent, Fairchild Instrument and Camera—not a shabby return on their original investments of $500 each. …

When the Shockley Eight launched the first company to focus exclusively on silicon devices (rather than those of germanium), they were financed by Sherman Fairchild. At the time he was the largest individual stockholder of IBM through stock inherited from his father who was one of IBM’s founders. Sherman owned Fairchild Instrument and Camera, which set up the Shockley Eight as Fairchild Semiconductor. Venture capitalist Arthur Rock helped arrange the financing.

Thus, we see a pattern emerging at the start of the semiconductor industry in Silicon Valley: a scientific breakthrough followed by commercial exploitation by entrepreneurial scientists and engineers financed with venture capital from technologically savvy, wealthy investors. …”

(William D. Bygrave and Jeffry A. Timmons, Venture Capital at the Crossroads, 1992, Harvard Business School Press)

As quoted, each of the Shockley Eight’s $500 initial investments in Fairchild Semiconductor became $250,000 when Sherman Fairchild bought out the company 7 years later.

It is not known how much equity Howard Aiken would have been willing to give Cuthbert Hurd in a new computer company had it gotten off the ground in the early 1960s. But I can see Hurd wanting a larger share than any one of the 8 scientists starting Fairchild Semiconductor had, given that Hurd already had a distinguished record as an IBM executive, and that Aiken himself could not provide the level of financing Sherman Fairchild had provided.

But as reviewed earlier, when Fairchild Semiconductor was founded in 1957, Sherman Fairchild had reserved the option of buying its full ownership. 7 years later he realized that objective by paying a pre-agreed $3 million – a figure cited earlier – with each of the 8 young founders receiving $250,000; from that point on, neither the 8 nor anyone else other than Fairchild’s parent company owned shares.

In 1968 two of the 8, Robert Noyce and Gordon Moore, decided to leave Fairchild Semiconductor to form their own company. The money they had made from the ownership buyout and the experiences they had gained at Fairchild Semiconductor positioned them to launch the next big, more independent venture, the Intel Corporation; and their move spurred the next big wave of venture capital-financed start-ups in Silicon Valley, many of which, Intel the most famous, became known as “fairchildren” – companies founded by persons who had worked at Fairchild Semiconductor:

“Intel: The Fairest of the “Fairchildren”

There can be little doubt that Fairchild was the breeding ground for the technology entrepreneurs—sometimes dubbed “Fairchildren”—who built the semiconductor industry in Silicon Valley. About half the firms can trace their roots in Fairchild. It is a distinguished list of movers and shapers, among them, Intel, National Semiconductor, and Advanced Micro Devices (AMD). But the fairest of them all is Intel.

In the summer of 1959, Noyce, then director of R&D at Fairchild, invented the integrated circuit. (Jack Kilby at Texas Instruments independently discovered the same concept a few months earlier. Today, Kilby and Noyce are recognized as the co-inventors of the integrated circuit.) The importance of the integrated circuit to the development of the semiconductor industry was second only to the invention of the transistor itself. So Noyce was already a legendary figure when he and Gordon Moore resigned from Fairchild in 1968.

Noyce and Moore, with Rock as their venture capitalist, launched their next semiconductor company, Intel. Rock as lead investor raised $2.5 million and Noyce and Moore each invested about $250,000. Through years of tireless endeavor, they multiplied their original investments of $500 in Fairchild a hundred-thousand-fold. In 1982, Moore owned 9.6% of Intel, with a market value of more than $100 million, and Noyce owned 3.6%. No one deserved it more. They had been at the forefront of building a new industry. Their companies were responsible for breakthroughs that transformed not only the semiconductor industry but society itself.

Before Rock launched Intel, there had been only a handful of venture-capital-backed startups. But that was about to change. Other budding entrepreneurs were making proposals to other venture capitalists. Between 1967 and 1972, about thirty companies were started with venture capital … including such luminaries as National Semiconductor—which was started the year before Intel—and Advanced Micro Devices.”

(William D. Bygrave and Jeffry A. Timmons, 1992, Harvard Business School Press)

In addition to National Semiconductor, Intel and AMD mentioned above, one of the “Fairchildren” of note was Signetics, which as mentioned earlier designed the computer chips for the Apollo moon landing program’s emergency onboard Abort Guidance System.

(“NXP Semiconductors”, 2008, Silicon Valley Historical Association)

I note that when starting Intel, Noyce and Moore each reinvested the same amount of money they had made from their Fairchild Semiconductor equity – $250,000, earned over 7 years from a $500 initial investment – into the new start-up, and 14 years later in 1982 Noyce owned 3.6% of Intel, and Moore 9.6%. With Moore’s stake valued at more than $100 million, Noyce’s would be worth more than $37.5 million.
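The $37.5 million figure follows from simple proportion: if Moore’s 9.6% stake was worth just over $100 million, the same per-percentage-point valuation prices Noyce’s 3.6%. A back-of-envelope check, using the figures from the quote above:

```python
moore_value = 100e6           # Moore's 9.6% stake: "more than $100 million"
moore_pct, noyce_pct = 0.096, 0.036

# Implied total market value of Intel in 1982, about $1.04 billion.
implied_intel_value = moore_value / moore_pct

# Noyce's 3.6% stake at the same valuation.
noyce_value = implied_intel_value * noyce_pct

print(round(noyce_value))  # 37500000, i.e., $37.5 million
```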

In the corporate culture of that earlier time, back in 1957 when the Shockley Eight sought investment for their new venture, not only did their rebellion against their Nobel laureate mentor make them unpalatable to the investment community, but their request for equity and management power was also considered inappropriate:

“… Some firms may have found the pith of the letter–please give a million dollars to a group of men between the ages of 28 and 32 who think they are great and cannot abide working for a Nobel Prize winner–unpalatable. Even if a firm thought the proposal was interesting in theory, no standard operating procedure existed for the company-within-a-company undertaking Rock and Coyle recommended. What accounting procedures would be used? How could the funding firm allow this group of unknown young men to run their own operation, according to criteria of their own devising, and not permit other employees the same autonomy? In the 1950s, with its ethos of conformity, this smacked of unseemly preferential treatment.”

(Leslie Berlin, 2005, Oxford University Press)

“Company-within-a-company undertaking”? In my understanding, had it been a group of former corporate managers requesting the same kind of power in a new company, it would have been all right in the eyes of an investment firm: they knew the accounting procedures, they were not unknown to the investment firm, and allowing them but not other employees to run the company was simply a matter of management’s right. The problem, therefore, was that scientists and engineers with special technological expertise were still viewed only as employees.

Ironically, such a prevailing corporate view of the 1950s was compatible with Cuthbert Hurd’s interest in co-launching a company with Howard Aiken: Hurd had been an established IBM executive, and if he also helped “raise the money” then he probably would not have regarded Aiken’s computer-pioneer status and expertise as worthy of a Sherman Fairchild level of ownership power.

In both the investment firms’ view in the Shockley Eight case and Hurd’s view in the Aiken case, there was also politically correct ground: for the investment firms, the “ethos of conformity” did not permit “unseemly preferential treatment” for some employees but not for others; and for Hurd, Aiken’s “thinking about starting the company” in order “to be rich” was not a deserving motivation.

But the issue of fairness aside – fairness to bright scientists like the Shockley Eight, and to the prominent computer pioneer Howard Aiken aspiring to succeed in business – there is still a question of practicality regarding a company run by scientists and engineers with special technological expertise: would such management be more beneficial to the other employees of the company?

The history of Fairchild Semiconductor suggests that the answer should be affirmative with regard to the company’s commercial success. The Shockley Eight started Fairchild Semiconductor not only as part-owners but with some of them at the helm of management, and it turned out well enough that 7 years later Sherman Fairchild exercised his $3 million buyout option to acquire the company.

In fact, Fairchild Semiconductor was very successful in the civilian commercial market sectors, such as the television market, and in the 1960s became the U.S. market leader in integrated circuits – something the company had invented independently along with Texas Instruments as discussed earlier – with a market share of 55%:

“Responding to a decline in the military demand for electronic components in the early 1960s, Fairchild Semiconductor created new markets for its transistors and integrated circuits in the commercial sector. To meet the price and volume requirements of commercial users, Fairchild’s engineers introduced mass production techniques adapted from the electrical and automotive industries and set up plants in low labor cost areas such as Hong Kong and South Korea. The firm’s application laboratory also developed novel systems such as an all-solid state television set and gave these designs at no cost to its customers, thereby seeding a market for its products. … By 1966, Fairchild had established itself as a mass producer of integrated circuits and controlled 55% of the market for such devices in the United States.”

(“Technology and Entrepreneurship in Silicon Valley”, by Christophe Lécuyer, December 3, 2001, Nobelprize.org)

But to really answer the question posed earlier, one needs to determine how much Fairchild Semiconductor’s commercial success under the leadership of the founding scientists group translated into improved incomes for the regular employees.

It is unclear how the rest of the $3 million that Sherman Fairchild’s parent company paid for Fairchild Semiconductor was distributed, i.e., the portion other than the $2 million paid to the 8 founders. After that, the company was in the hands of Sherman Fairchild. From this perspective, if one considers better employee compensation as encompassing not only wages but also ownership equity, i.e., stock options, then after the first 7 years no employee owned any equity at Fairchild Semiconductor.

And that was an important rationale behind the decision by Robert Noyce and Gordon Moore to leave Fairchild Semiconductor in 1968 and found Intel, a new company where they would institute a much more egalitarian corporate structure and culture, in which not only would they not end up working for a wealthy businessman, but every employee could own equity:

“In preparation for the IPO, Noyce consulted with attorneys and bankers, reviewed drafts of the prospectus, met with auditors, signed the certificates necessary for the offering, wrote explanatory letters to employees and current investors, invited employees to buy stock in the offering as “friends of the company,” …

SEC rules for public offerings required Intel to cancel the stock-purchase plan set up at the company’s establishment. Noyce dreamed of replacing this plan with options packages that would be distributed to every employee, “including janitors.” He worried, though, if people with limited educations could understand what a stock option was and how volatile the market could be. He and Moore finally decided that once a plan could be developed that met SEC guidelines for publicly held companies, Intel should reinstitute a stock-purchase plan, rather than options, for nonprofessional employees. Under this plan, which was implemented in 1972, every employee would be allowed to take up to 10 percent of base pay in Intel stock, which could be bought at 15 percent below market rates. The stock purchase plan met Noyce and Moore’s goals of giving employees a stake in the company without requiring the sophisticated financial knowledge associated with stock options.”

(Leslie Berlin, 2005, Oxford University Press)

The nonprofessional employees of Intel, “including janitors” as quoted above, were allowed to use up to 10% of their base pay to buy company stock at 15% below market prices – as quoted earlier, Intel earned the reputation of being “the fairest of the ‘Fairchildren’”.
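A small sketch can make the plan’s economics concrete; the salary figure here is hypothetical, since the source gives only the 10% and 15% terms:

```python
# Hypothetical illustration of Intel's 1972 stock-purchase plan:
# up to 10% of base pay in stock, bought at 15% below the market price.
base_pay = 10_000               # hypothetical annual base pay ($), not from the source
contribution = 0.10 * base_pay  # maximum allowed purchase: $1,000
discount = 0.15                 # stock priced 15% below market

market_value = contribution / (1 - discount)  # market value of stock received
paper_gain = market_value - contribution

print(f"stock at market: ${market_value:,.2f}; paper gain: ${paper_gain:,.2f}")
# → stock at market: $1,176.47; paper gain: $176.47
```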

Intel’s egalitarian culture influenced, and spread to, the entire Silicon Valley, according to Robert Noyce’s widow Ann Bowers, speaking at a February 2013 event remembering the pioneers of Silicon Valley; Bowers herself also brought some of that culture to Apple Computer:

“Ann Bowers knew Bob Noyce, the “mayor of Silicon Valley,” better than anyone. She was married to the co-founder of Fairchild Semiconductor and Intel, and witnessed the magnetic effect he had on the people who followed him to the region that will be chronicled in an American Experience history documentary on PBS on Tuesday (Feb. 5) at 8 pm. The film is called Silicon Valley: Where the Future Was Born, and it captures the people like Noyce, who died in 1990, and how they made such a mark that their impact is still reverberating today.

Last week, Bowers … spoke about Noyce on stage at the Computer History Museum in Mountain View, Calif., where the surviving pioneers of Silicon Valley gathered to celebrate the film and the man at the center of it. …

Noyce isn’t depicted as some superhero, since he had his flaws. But, like David Packard and Bill Hewlett before him, he was a very big part of the fuel that set the valley on fire. …

At the start of the event, host Hari Sreenivasan asked Bowers about the many talents of Noyce. He played the oboe. He was the state diving champion. He lettered on the swim team. He was in the drama club. And he knew more about transistors than just about anyone on the planet. Bowers smiled and answered a quick “yes” to each one of the facts.

Noyce had migrated to the Santa Clara Valley to work for William Shockley, the Nobel-prize winning physicist who co-invented the transistor and moved West to Palo Alto, Calif., to set up Shockley Semiconductor Laboratory — and be near his mother.

But Noyce, Gordon Moore, and six other colleagues broke away, since they didn’t like his erratic behavior. They were deemed “the traitorous eight” … by Shockley. They founded Fairchild Semiconductor, a division of Fairchild Camera and Instrument.

In 1957, the Russians scared America with the launch of the first satellite, Sputnik. President Eisenhower created NASA within the next year and launched the American space program.

Noyce’s team at Fairchild was positioned to make the chips that would go into the rockets and space ships. The federal government had an “insatiable need” for what Fairchild would produce, said Leslie Berlin, author of The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley.

In contrast to Shockley’s dictatorship and the East Coast profit-milking of Fairchild’s parent company, Noyce and co-founder Gordon Moore founded Intel in 1968 and gave out stock options. They managed in a way that was more egalitarian, building a company based on meritocracy. Bowers gave credit for that culture to Moore as well as Noyce. They infused this egalitarianism in the rest of the Valley, which came to be a collection of many companies that spun out of Fairchild. Today’s valley is littered with descendants, or Fairchildren, such as Advanced Micro Devices.

Noyce once said that the stock option money didn’t seem real.

“It’s just a way of keeping score,” he said.

The Pied Piper of his generation, Noyce counseled his employees to “go off and do something wonderful.”

Bowers held jobs such as director of personnel for Intel, and she was the first vice president of human resources for Apple. She currently serves as the chair of the Noyce Foundation.”

(“Widow speaks about Bob Noyce, telling the human side of the mayor of Silicon Valley (video)”, by Dean Takahashi, February 4, 2013, Venture Beat)

The view of Silicon Valley history presented by Ann Bowers, cited above, is that William Shockley’s management, which the “Traitorous Eight” rebelled against, was a “dictatorship”, and that the Sherman Fairchild parent company’s subsequent oversight of Fairchild Semiconductor, founded by the 8, was “East Coast profit-milking”.

Bowers’s view is supported by Arthur Rock, a venture capitalist instrumental in helping to start both Fairchild Semiconductor and Intel as mentioned in earlier quotes, who in a Harvard Business School interview expressed a similar opinion about East Coast business style:

“… Fairchild Camera and Instrument was really an eastern-type company; they wanted to run things their way. That’s why people then left Fairchild Semiconductor and formed Intel and other companies. …”

(“ARTHUR ROCK”, Entrepreneurs, Harvard Business School)

Rock explained what “eastern-type” business was like, that it was “old establishment and old money”:

“The problem at Fairchild Semiconductor had to do with incentives. The whole idea of giving people incentives was something foreign to most companies. That’s one of the reasons, of course, that I came out to California; I saw that people were a lot more adventuresome in California than they were in the East. In the East, it’s the old establishment and old money. People have been doing things one way for a long time and it’s very hard to change. It actually took many, many years for them to change, whereas people who came out West had some of the “Go West, young man,” Wild West aura about them, and they were willing to do things to test new ideas. I found that really the brighter, more imaginative, adventuresome people were out here rather than in the East.”

(Entrepreneurs, Harvard Business School)

As quoted, Rock contrasted the East Coast’s “old establishment and old money” with Californians’ “adventuresome” spirit.

In this Harvard Business School interview, Arthur Rock recalled how he met the Shockley Eight and decided to help, stating that starting a new company was actually his idea; he and his firm then approached some 35 investment firms, none of which would help, before he was introduced to Sherman Fairchild:

“I had done financings for a number of small companies based around New York and Boston, and I liked doing that. I liked the people who ran those companies. I liked the scientific-type person. So Eugene Kleiner wrote a letter to his father’s broker who was at Hayden Stone. And that broker, knowing of my interest in these kinds of companies, showed me the letter.

… seven of whom got together and asked Eugene Kleiner to write this letter. Actually, the letter was typed by Kleiner’s wife. The letter said that they were unhappy with Shockley and they were going to quit, but did we know of anyone who might hire them together as a group? They weren’t looking especially to form a company. That was my idea. I came out to see them with one of the partners of Hayden Stone and we talked to them. They had all been chosen by Shockley, so I knew they were probably pretty good people, and then when we met them I was very impressed and thought we could help them. I suggested to them that they might want to set up a company, and we told them we would see if we could get financing for the company. We had a couple more meetings with them. Then they brought along an eighth fellow who was Bob Noyce, so that was the “traitorous eight.”

We, Hayden Stone together with the eight of them, put together a list of companies that might finance this group. We had about thirty-five companies, all of whom had expressed an interest in going into new fields. We talked to each of these thirty-five companies. I personally visited most of them. And their reply was, well, this is a great idea, but if you set up a separate company, then what will our other employees think; it will just upset our organization, so we don’t want to do it. So we crossed out all thirty-five and were at our wit’s end when somebody introduced me to Sherman Fairchild.”

(Entrepreneurs, Harvard Business School)

Rock pointed out that Fairchild had wealth, an inventor background, and a liking for young people – factors favorable to his decision to arrange financing for the 8 scientists to start the new company:

“Sherman Fairchild was one of the largest shareholders of IBM stock. His father had gone into partnership with Tom Watson in setting up IBM, and since Tom Watson’s family consisted of three or four children and Sherman Fairchild was the only heir in his family, he owned a lot of stock. So he had a lot of cash and he was an inventor. He had invented the aerial camera, and as a result had formed Fairchild Camera and Instrument Company. And then he had to invent the airplane that could hold the camera, so that was the basis for Fairchild Aviation. He also liked young people and saw the merit in our idea, so he agreed that Fairchild Camera and Instrument Company would finance this group. They advanced what we then named Fairchild Semiconductor a million and a half dollars in return for an option to buy all of our stock for $3 million. We then split up Fairchild Semiconductor: 10 percent to each of the eight and 20 percent to Hayden Stone.”

(“ARTHUR ROCK”, Entrepreneurs, Harvard Business School)

As recalled by Arthur Rock, Sherman Fairchild’s company provided the $1.5 million financing on the condition that the parent company had a future option to buy the entire Fairchild Semiconductor for $3 million, but otherwise held no initial ownership at all: 80% of the initial ownership belonged to the Shockley Eight at 10% each, and the other 20% belonged to Rock’s investment firm Hayden Stone.
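The split Rock recalls can be tallied in a short sketch. The per-founder figure below is the nominal gross 10% of the $3 million option price; the roughly $250,000 each founder was said earlier to have made is consistent in magnitude with it:

```python
# Initial Fairchild Semiconductor ownership as recalled by Arthur Rock:
# 10% to each of the eight founders, 20% to Hayden Stone; the parent
# company held only an option to buy all of the stock for $3 million.
founder_share = 0.10
hayden_stone_share = 0.20
total = 8 * founder_share + hayden_stone_share
assert abs(total - 1.0) < 1e-9  # the shares account for all the stock

buyout_price = 3_000_000
per_founder_gross = founder_share * buyout_price  # nominal value of one 10% stake
print(f"${per_founder_gross:,.0f}")  # → $300,000
```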

So it was a bold start-up arrangement, in that the 8 actually owned 80% of Fairchild Semiconductor from the start – except that the company was obliged to be, and later indeed was, bought out by the East Coast parent company.

Rock offered the reflection that had he and Sherman Fairchild not helped the Shockley Eight, “there would not have been any silicon in Silicon Valley” and “a lot of them would have ended up with Texas Instruments”:

“… But there would not have been any silicon in Silicon Valley if it hadn’t been for the formation of Fairchild Semiconductor because the “treacherous eight” would probably have just gone off and gotten jobs individually. My guess is a lot of them would have ended up with Texas Instruments, which had a similar but less successful enterprise going in Texas.”

(Entrepreneurs, Harvard Business School)

As discussed earlier, Texas Instruments was Fairchild Semiconductor’s main competitor in integrated circuit invention, and did much better in the direct supply of U.S. military aerospace demands.

While the U.S. civilian space program and NASA, started in 1957-1958 by President Dwight Eisenhower as quoted earlier, fuelled the demand for semiconductors, the rapid growth of consumer electronics would expand the market much further. As cited earlier, Fairchild Semiconductor played a significant role in the 1960s in developing the U.S. civilian and commercial market sectors for integrated circuits and consumer electronics.

During this period, the rapid growth of consumer electronics far outpaced the growth in military demand, and permanently ended the U.S. military’s dominance of the semiconductor market, reducing the military share from, for instance, 72% of annual semiconductor sales in 1965 to just 21% in 1970:

“Prior to the outpouring of new firms and new products, semiconductors were part of the defense industry. By the end of the 1960s, consumer and industrial applications of microelectronics were expanding much more rapidly than DoD purchases. Defense sales grew relatively slowly, and quickly dropped below 20 percent of the total (Table 8-1). …

…”

(John A. Alic, Lewis M. Branscomb, Harvey Brooks, Ashton B. Carter and Gerald J. Epstein, Beyond Spinoff: Military and Commercial Technologies in a Changing World, 1992, Harvard Business School Press)

Howard Aiken’s retirement from academia and adventure in business, mentioned earlier and discussed in some detail in Part 5 (ii), took place during this same period of the 1960s and early 1970s.

Upon his early retirement from Harvard in 1961, Aiken started Howard Aiken Industries, specialising in buying, fixing and selling companies. He and Cuthbert Hurd also discussed a plan to start a new computer company that would have been the first in the world to develop computers of smaller sizes referred to as “microcomputers”, with a Lockheed assistant director of engineering doing the design work; but they did not follow through with the plan, and Hurd later commented somewhat scornfully on Aiken’s motivation, as quoted earlier: “I thought that maybe he wanted to be rich,” and “was thinking about starting the company for that reason”.

Then in 1967 Aiken retired from heading his company and became its board vice chairman – his Aiken Industries was at some point renamed Norlin Technologies, as cited in Part 5 (ii) – and re-engaged in active consulting, now for the Monsanto Company in addition to Lockheed Missiles and Space Company, as previously quoted in Part 5 (ii):

“During the 1973 interview, Hank Tropp questioned Aiken about aspects of his life and career after leaving Harvard. Aiken referred, first of all, to his “forming Aiken Industries, beginning in 1961” and his becoming “vice-chairman of the board” in 1967. “So now,” Aiken said, “I go to board meetings, but I’m not going at it the way I used to. . . . When they kicked me out of Harvard, I had to find a new job and that was Aiken Industries. And when they kicked me out, I had to find a new job and went into the consulting business. So now I spend a good deal of time at Monsanto.” … Aiken said that he had been a consultant at Lockheed “for many years,” but that he had “quit that this year.””

(I. Bernard Cohen, 2000, The MIT Press)

What is especially interesting but not yet reviewed here is Aiken’s work during the 1960s and early 1970s in the context of the field of computer development of that period.

As mentioned earlier, in 1959, while Aiken was a Harvard professor and a consultant for Lockheed, the company had started the Lockheed Electronics Company in New Jersey – away from California, where Aiken did his consulting on switching circuits for missiles and space applications.

Just 2 years after Aiken’s return to active computer consulting, in 1969 the Lockheed Electronics Company produced its first commercial computer, a “minicomputer”:

“Lockheed Aircraft can trace its history back to 1912. It entered the computer field in 1969 when it produced a small process control computer, the MAC-16 (also referred to as the LEC-16). The MAC was used for instrument control in large laboratory settings and for applications such as air traffic control.

The Lockheed Data Products Division supported the MAC (and its successors, MAC Jr., Sue, and System III) until the mid 1970s when Lockheed decided that they should concentrate on their core capabilities and left the commercial computer business.”

(“Company: Lockheed Electronics Company, Inc.”, Computer History Museum)

So, without Howard Aiken’s participation as a consultant, Lockheed got into the computer development field and produced a “minicomputer” for the commercial market, one that was useful in laboratory settings.

Minicomputers were smaller than the typical mainframe computers up to that point in time, such as the various IBM systems mentioned before, but not as small as what a “microcomputer” is today, which is typically a PC, i.e., personal computer.

(“Supercomputers, Mainframes, Minicomputers and Microcomputers: Oh My!”, December 14, 2011, Turbosoft)

It was at this stage, in 1970, that Howard Aiken again discussed with Cuthbert Hurd launching a computer company to make smaller computers, this time with “the idea of what was to become a microprocessor and personal computer”, as quoted previously in Part 5 (ii):

“Hurd… recalled that he, Aiken and William Main had met several times in 1970 “to discuss the formation of a corporation to be called PANDATA.” The three had “discussed the idea of what was to become a microprocessor and personal computer.” Aiken, according to Hurd, “had an early vision of the usefulness of such devices,” “believed that they could be mass produced at a low cost,” and “wished to form an integrated company to manufacture and sell them.” Aiken wanted Hurd “to help form the company, be chairman of the board, and raise the money.” Aiken himself “wished to make a considerable investment in the new company.” …”

(I. Bernard Cohen, 2000, The MIT Press)

I note that the first time, probably around 1961, when Aiken and Hurd discussed starting the “first microcomputer computer company in the world” as quoted earlier, there was no mention of a “microprocessor”; then in 1970, when they again discussed starting a computer company, “microprocessor and personal computer” were in Aiken’s vision.

That is interesting, because the modern definition of a microcomputer, such as a personal computer, includes a microprocessor as its central processing unit:

“Microcomputer, an electronic device with a microprocessor as its central processing unit (CPU). Microcomputer was formerly a commonly used term for personal computers, particularly any of a class of small digital computers whose CPU is contained on a single integrated semiconductor chip. … Smaller microcomputers first marketed in the 1970s contain a single chip on which all CPU, memory, and interface circuits are integrated.”

(“Microcomputer”, Encyclopædia Britannica)

Aiken’s “early vision”, as Hurd called it, was quite impressive. They had planned to start a “microcomputer computer company” probably a decade before small microcomputers were “first marketed in the 1970s”, as quoted above.

But to start the world’s first “microcomputer computer company” so early, that company would have had to produce a small computer without relying on a microprocessor – despite the modern technical definition quoted above requiring a microprocessor for a “microcomputer” – because even by 1970, when they talked about starting a company to mass-produce a “microprocessor and personal computer”, the microprocessor had not yet been invented.

Around that time in 1970, something like the microprocessor was only in the process of being invented, by none other than Intel, the Silicon Valley semiconductor company founded 2 years earlier by two of the Shockley Eight, Robert Noyce and Gordon Moore.

More specifically, in 1969-1970 Intel was doing a project for an advanced Japanese calculator design, and in that process Intel chip designer Marcian “Ted” Hoff proposed a new design to consolidate several central components onto a single chip, as Hoff later recalled:

“Intel was founded with the idea of doing semiconductor memory. Up until that time, most computer memory used small magnetic cores strung onto wire arrays. In most cases, they were wired by hand and some of these cores weren’t much bigger than the tip of a mechanical pencil. … While developing memory products, there was a feeling within Intel’s management that it might take a while before the computer industry would accept semiconductor memory as an alternative to cores. So it was felt that we should undertake some custom work, that is, to build chips to the specifications of a particular customer. We were contacted by a Japanese calculator company whose calculators came out under the name Busicom. They said that they would like to have us build a family of chips for a whole series of different calculator models, models that would vary in type of display, whether they had a printer or not, the amount of memory that they had and so on. A contract to make their chips was signed in April of 1969. … I was curious about the calculator design. I knew little about it, although I was fairly familiar with computer architectures and I had been at the sessions where the project had been discussed. The more I studied the design, the more concerned I became, based on what I had learned about Intel’s design and package capability and costs. It looked like it might be tough to meet the cost targets that had been set in April. The Japanese design was programmable, using read-only memory, but it seemed to me that the level of sophistication of their instruction set was too high, because there was a lot of random logic and many interconnections between different chips. There was a special chip to interface a keyboard, another chip for a multiplexed display and yet another chip for one of those little drum printers. 
It seemed to me that improvements could be made by simplifying the instruction set and then moving more of the capability into the read-only memory, perhaps by improving the subroutine capability of the instruction set. I mentioned some of my concerns and ideas to Bob Noyce. He was really encouraging, saying that if I had any ideas to pursue them because it was always nice to have a backup design. I did so throughout the months of July and August. … In October, the management of the calculator company came over to the U.S. for a meeting in which both approaches were presented. At that point they said they liked the Intel approach because it represented a more general purpose instruction set, a simpler processor, fewer chips to be developed and had a broader range of applications. Our proposal reduced the number of chips needed from around a dozen to only four.”

(“Ted Hoff: the birth of the microprocessor and beyond: Alumni Profile”, Alumni Profile, Stanford Engineering)

As Ted Hoff explained above, with his new design the Japanese calculator needed only 4 semiconductor chips instead of the “around a dozen” chips in the Japanese design, and – as he implied – the number of “interconnections”, i.e., wirings between the chips, would also be reduced as a result.

Subsequently in 1970-1971 Intel turned this new idea, of using a single chip to integrate different functionalities previously spread over several chips, into a general-purpose microprocessor for computers:

“Our initial goal was never to make a microprocessor, only to solve this particular customer’s problem, this calculator design problem. But there were several aspects of the design that became more evident as it was pursued. One was, being more general purpose and faster than the original design, we figured it might be useful for a broader range of applications than just the calculator family. Dr. Federico Faggin was hired around in April of 1970 and given the responsibility for chip circuit design and layout, to turn this architecture into a physical transistor layout. … He had working parts by around January of 1971.”

(Alumni Profile, Stanford Engineering)

As Intel’s Japanese calculator example illustrates, back in the 1960s when Howard Aiken and Cuthbert Hurd discussed starting the world’s first “microcomputer computer company”, making a small microcomputer without a microprocessor – contrary to the modern definition of a microcomputer – was possible, but the technical work would have been more complicated, the computer’s size likely not as small, and its performance likely not as good.

Then by the next time, in 1970, when the two discussed launching a computer company to develop “a microprocessor and personal computer”, Intel, a company that had started only in 1968 – one year after Aiken returned to active computer consulting – was already in the process of developing the first microprocessor.

Thus, in 1970 it was too late for a yet-to-be-founded company to invent the microprocessor.

The personal computer, on the other hand, did not become a reality even by the time of Aiken’s death in 1973. Hence a new company of Aiken’s and Hurd’s could have become the first to develop and mass-produce it.

As reviewed in Part 5 (ii), Aiken died in his sleep in a St. Louis hotel during a March 1973 consulting trip to Monsanto on memory technology – also a main interest of Intel’s – with the goal of making computers smaller; Aiken had just quit consulting for Lockheed Missiles and Space Company in Silicon Valley – presumably also abandoning the hope of starting a new company with Hurd and with help from Lockheed engineers.

Commenting on that missed prospect for Aiken to develop smaller computers, Hurd later said that Aiken died before it could happen, as previously quoted in Part 5 (ii):

“… Hurd reported, however, that he “was busy at the time with other activities” and that Aiken “died before the venture could be launched.””

(I. Bernard Cohen, 2000, The MIT Press)

But as I have remarked in Part 5 (ii), there were 3 years from 1970 to 1973 for Hurd to act on their collaboration plan, and I would think that Hurd had the sense to know well that at an age of over 70 – Howard Aiken was born in 1900, as in Part 5 (ii) – there simply wasn’t much working time left to waste if Aiken was to realize his ambition.

In the end, after Howard Aiken’s lifetime, the work of developing microcomputers and personal computers, and the glory of making them a success, went to a younger generation of scientists and engineers, most notable among them computer designer and Apple Computer co-founder Steve Wozniak:

“Stephen Gary Wozniak, (born Aug. 11, 1950, San Jose, Calif., U.S.), American electronics engineer, cofounder, with Steven P. Jobs, of Apple Computer, and designer of the first commercially successful personal computer.”

(“Stephen Gary Wozniak”, by William L. Hosch, Encyclopædia Britannica)

Stephen Gary Wozniak also happened to be the son of a Lockheed Missiles and Space Company engineer:

“Wozniak—or “Woz,” as he was commonly known—was the son of an electrical engineer for the Lockheed Missiles and Space Company in Sunnyvale, Calif., in what would become known as Silicon Valley. …”

(William L. Hosch, Encyclopædia Britannica)

I note that a similar case of a generation-long time frame before a goal was finally realized appeared in Part 5 (ii): Alain Fournier, a University of British Columbia computer science professor specializing in computer graphics when I taught at UBC in the late 1980s and early 1990s, had in the 1970s been a Ph.D. student at the University of Texas at Dallas interested in studying computer graphics with faculty member Henry Fuchs, without realizing that wish as Fuchs soon left for the University of North Carolina at Chapel Hill; then at UBC in 1990, Fournier brought in new Stanford Ph.D. Jack Snoeyink to fill a tenure-track faculty position, which practically ended my hope of getting a tenure-track position there; finally in 2000, as Fournier was dying of cancer, Snoeyink landed a UNC Chapel Hill professorship and became a colleague of Fuchs, the Federico Gil Distinguished Professor of Computer Science at UNC Chapel Hill, in the computer graphics field.

It turned out that even though Intel invented the microprocessor by 1971 – i.e., while Aiken was still alive and had in 1970 talked with Hurd about developing it – it was not commercially available until after Aiken’s death; when the Intel 8080 microprocessor became available in 1975, Wozniak, a dropout from several universities including UC Berkeley in the San Francisco Bay Area, and by then a Hewlett-Packard computer designer, began designing a microcomputer he hoped HP would produce; but HP was not interested:

“… A precocious but undisciplined student with a gift for mathematics and an interest in electronics, he attended the University of Colorado at Boulder for one year (1968–69) before dropping out. Following his return to California, he attended a local community college and then the University of California, Berkeley. In 1971 Wozniak designed the “Blue Box,” a device for phreaking (hacking into the telephone network without paying for long-distance calls) that he and Jobs, a student at his old high school whom he met about this time, began selling to other students. Also during the early 1970s Wozniak worked at several small electronics firms in the San Francisco Bay area before obtaining a position with the Hewlett-Packard Company in 1975, by which time he had formally dropped out of Berkeley.

Wozniak also became involved with the Homebrew Computer Club, a San Francisco Bay area group centred around the Altair 8800 microcomputer do-it-yourself kit, which was based on one of the world’s first microprocessors, the Intel Corporation 8080, released in 1975. While working as an engineering intern at Hewlett-Packard, Wozniak designed his own microcomputer in 1976 using the new microprocessor, but the company was not interested in developing his design. …”

(William L. Hosch, Encyclopædia Britannica)

This was not the first example of Hewlett-Packard’s slow response to the prospect of developing new products. As discussed in Part 5 (i), 20 years earlier in 1956-1957 Silicon Valley computer pioneer Douglas Engelbart was looking for a job in computer development and HP let him know that it had no such plan; only after another 10 years, in 1966, did HP start its own computer product line. Now another 10 years later, in 1976, HP was again uninterested in something new, namely Wozniak’s personal computer design.

Wozniak later recalled that as an HP employee he had pitched his personal computer design to the company no fewer than 5 times and been rejected every time, before accepting his friend Steve Jobs’s suggestion to start their own personal computer company:

“His loyalty to Hewlett Packard made him reluctant to leave the company to start Apple with Jobs. Wozniak reminded reporters last week at the Computer History Museum that he had proposed his idea for the Apple I computer to Hewlett Packard, but they “turned him down 5 times.”

According to a 2008 interview with The Telegraph, Wozniak originally thought he owed it to HP to stay, until Jobs persuaded Wozniak’s family to convince him to do it.”

(“Apple co-founder offered first computer design to HP 5 times”, by Josh Ong, December 6, 2010, Apple Insider)

As quoted, Jobs needed to get Wozniak’s family to convince Wozniak to start Apple Computer with him.

Though a university dropout, Wozniak was a proven designer of video games, having worked with Jobs in the early stage of that field, including building his own version of the Pong game, one of the earliest video games, and designing the original Breakout game for the company Atari, which had hired Jobs on the basis of the Pong game credit:

“… Wozniak related the story of one of his first collaborative projects with Apple CEO Steve Jobs. After Wozniak built his own version of Pong, one of the earliest video games, Jobs then took the game to Atari and got a job. “They had Steve working at night so he wouldn’t be around other people,” joked Wozniak.

“Then he got us a job,” Wozniak continued, “I designed the first Breakout game for Atari. So I didn’t really work there. They tried to hire me, but I said ‘Never leave Hewlett Packard, I love my company, I’m loyal.’” Wozniak added that it took them 4 straight days and nights to design the game. “We both got the sleeping sickness, mononucleosis,” said Wozniak.”

(Josh Ong, December 6, 2010, Apple Insider)

Now together the two started Apple Computer, and the popular Apple II computer was produced in 1977:

“… Jobs, who was also a Homebrew member, showed so much enthusiasm for Wozniak’s design that they decided to work together, forming their own company, Apple Computer. Their initial capital came from selling Jobs’s automobile and Wozniak’s programmable calculator, and they set up production in the Jobs family garage to build microcomputer circuit boards. Sales of the kit were promising, so they decided to produce a finished product, the Apple II; completed in 1977, it included a built-in keyboard and support for a colour monitor. The Apple II, which combined Wozniak’s brilliant engineering with Jobs’s aesthetic sense, was the first personal computer to appeal beyond hobbyist circles. …”

(William L. Hosch, Encyclopædia Britannica)

The critical importance of collaboration among a pair of technology pioneers is highlighted in the contrasting examples of Howard Aiken and Cuthbert Hurd versus Steve Wozniak and Steve Jobs: despite Aiken’s prominent computer pioneer status and early retirement from Harvard with an intense interest in starting a computer company, his close associate and former IBM executive Hurd only talked about collaboration plans with him but never put them into action; on the other hand, Wozniak the talented young computer designer did not want to leave his employer Hewlett-Packard at all, and yet the entrepreneurial adventurer Jobs persuaded Wozniak’s family to convince Wozniak to do it.

Without financing, Jobs and Wozniak started their company in the garage of Jobs’s family home, just as William Hewlett and David Packard had started theirs in a house garage in 1939 as noted in Part 5 (i).

Hewlett later commented on HP’s decision in 1976 not to pursue Wozniak’s personal computer idea by saying, “You win some, you lose some”:

“Regarding the missed opportunity, HP co-founder Bill Hewlett reportedly said, “You win some, you lose some.””

(Josh Ong, December 6, 2010, Apple Insider)

Glimpses into Hewlett-Packard’s own history may help shed light on William Hewlett’s circumspect and somewhat philosophical comment regarding HP’s missed opportunity with Steve Wozniak and the personal computer.

Back in 1939, it was in the garage of David Packard’s rented house in Palo Alto that Packard and Hewlett started their company, a garage since recognized as “the birthplace of Silicon Valley” by the State of California in 1989:

“In 1938 David and Lucile Packard got married and rented the first floor of the house at 367 Addison Avenue in Palo Alto. The simple one car garage became the HP workshop and the little shack out back became Bill Hewlett’s home. In 1989 California named the garage “the birthplace of Silicon Valley” and made it a California Historical Landmark.

Dave Packard had gone to Schenectady to work at General Electric. He was told that there was no future in electronics at General Electric and that he should instead concentrate on generators, motors and other heavier equipment. Bill Hewlett was finishing up his graduate work at Stanford and the two decided to pursue their earlier plan of starting their own business. The name HP (vs. PH) was chosen by a coin toss. For $45 per month, the Packards rented the first floor of the house, which was chosen specifically because it had a garage that they could work in. Bill Hewlett moved into the little shack next to the garage.”

(“The HP Garage – The Birthplace of Silicon Valley”, The Museum of HP Calculators)

As reviewed in Part 5 (ii), the founding of Hewlett-Packard was greatly encouraged by Stanford engineering professor Frederick Terman, a protégé of leading U.S. government science adviser Vannevar Bush of the World War II era, and an influential academic administrator who grew Stanford’s scientific research by utilizing Cold War-oriented research funding.

By the early 1950s, as quoted earlier from a book by James W. Cortada on U.S. computer development history, Hewlett-Packard was an “instrument supplier” for some of the Southern California aerospace companies actively engaged in computer development activity – even though at the time HP itself had no such ambition as recalled by Douglas Engelbart.

HP’s technological expertise in instrumentation was in serious demand in the arena of military weapons research:

“Although the company never developed weapons systems, it depended heavily throughout its history on military spending, because its instrumentation has been used to develop and test military products, particularly as weapons systems have become more dependent on electronic and semiconductor technologies. The military expertise of Hewlett-Packard was underscored in 1969 when U.S. Pres. Richard M. Nixon appointed Packard deputy secretary of defense, in which position he oversaw the initial plans for the development of two of the country’s most successful jet fighter programs, the F-16 and the A-10.”

(“Hewlett-Packard Company”, by Mark Hall, Encyclopædia Britannica)

The Encyclopædia Britannica article on Hewlett-Packard quoted above relates the history of HP’s benefiting from military spending on weapons development to David Packard’s rise in the national political scene – from a co-founder of a leading electronics and computer company in the growing Silicon Valley to serving as the U.S. Deputy Secretary of Defense under President Richard Nixon.

Packard was noted not only for overseeing the initial planning of the development of the U.S. Air Force’s important fighter jets the F-16 and the A-10, but also for overseeing the revision of military acquisition policy in 1969-1971 following a public controversy, just prior to the Nixon presidency, over cost overruns in the Air Force’s C-5A cargo aircraft project contracted to the Lockheed Corporation:

“… As it happened, when Nixon took office in 1969, the acquisition community was already in turmoil, the result of a high-profile controversy that began in mid-1968, when A. Ernest Fitzgerald, deputy for Management Systems in the Office of the Assistant Secretary of the Air Force for Financial Management, first testified before Congress about cost overruns on the C-5A cargo aircraft program. His appearances before congressional panels resulted in a series of investigations that proved to be very embarrassing for the Air Force and the Lockheed Corporation, prime contractor for the C-5A. Subsequent allegations were made that, after testifying, Fitzgerald was the subject of career reprisals by the Air Force’s senior leadership. These accusations only drew more public attention to the controversy. …

In this connection, President Nixon appointed David Packard, one of the founders of the Hewlett-Packard Corporation and a veritable legend in American business circles, to the post of deputy secretary of Defense in January 1969. With an extensive business background and a hands-on management style that stood in stark contrast with that of former Defense Secretary Robert McNamara, Packard seemed like a logical choice to tackle the problems of defense acquisition by revising policy and working closely with subordinates to repair the cultural rift that had developed between the services and OSD during McNamara’s tenure. One observer, writing in 1972, suggested that Packard was the embodiment of a “cult of personality in reverse,” a hero called upon to “put things right for the future.” …

… Shortly after taking office, Packard had formed the Defense Systems Acquisition Review Council (DSARC) as an advisory body reporting to the secretary of Defense. The council, formed in May 1969, established three progress milestones for acquisition programs, an important enhancement to the acquisition process. The milestones were defined as “program initiation decision,” “full-scale development decision,” and “production decision.” … The DSARC was part of a long-term scheme to promote a kind of “decentralized centralization” over defense acquisition activities. OSD retained oversight authority over new acquisition programs, but Packard wanted the services to assume a larger role in the management of the acquisition process, with many functions devolving to the services. …

The spirit of “decentralized centralization” also could be found in the language of the landmark May 1970 memo, in which Packard articulated new principles for managing acquisition in the coming years. “The prime objective of the new policy guidance is to enable the services to improve their management of programs . . . . [T]he services have the responsibility to get the job done,” wrote Packard. “[I]t is the responsibility of OSD to approve the policies which the services are to follow, to evaluate the performance of the services in implementing the approved policies, and to make decisions on proceeding into the next phase in each major acquisition program.” …”

(Shannon A. Brown with Walton S. Moody, “Defense Acquisition in the 1970s: Retrenchment and Reform”, in Shannon A. Brown, ed., Providing the Means of War: Historical Perspectives on Defense Acquisition, 1945-2000, 2005, United States Army Center of Military History and Industrial College of the Armed Forces)

I note that David Packard’s reform vision of “decentralized centralization” cited above gave the U.S. military “services”, i.e., branches, more powers in defense acquisitions.

These powers had rested with the Office of Secretary of Defense Robert McNamara during the John Kennedy and Lyndon Johnson presidencies; the above quote cited an observer describing Packard as the embodiment of a “cult of personality in reverse” from Robert McNamara.

Under McNamara, the Pentagon had centralized its decision-making powers:

“One of the most important elements in the McNamara approach to management during the 1960s was in the commitment to centralized decision making in OSD. The new Planning, Programming, Budgeting System correlated resource inputs with categories of performance … The newly created office of assistant secretary of defense for systems analysis employed more than one hundred professional personnel preparing and using parametric cost estimates in cost-benefit analyses for use by the secretary of defense and other decision makers in the Pentagon. …”

(J. Ronald Fox, Defense Acquisition Reform, 1960-2009: An Elusive Goal, 2011, Center of Military History, United States Army)

Interestingly, the former defense secretary who had taken a centralization approach to modernizing the Pentagon’s decision making under Democratic presidents Kennedy and Johnson was actually a Republican, and was also a former president of a top U.S. automaker, just like earlier President Eisenhower’s secretary of defense Charles Wilson reviewed in Part 5 (i):

“As a registered Republican, McNamara became the first Republican appointed to Kennedy’s Cabinet. He was a Presbyterian, married, and father of three. His rapid rise at Ford had been through the financial and accounting side of the business, where he was brought in by Henry Ford II as one of a team known as “whiz kids” right after World War II.

… He had been president of Ford just over a month; he’d been selected the day after Kennedy was elected President. And McNamara was not the first Secretary of Defense to be plucked from Detroit. Charles Wilson, President Eisenhower’s first Secretary of Defense, was a former president of General Motors.”

(“Kennedy Selects Robert McNamara as Secretary of Defense”, by David Coleman, HistoryinPieces.com)

Cases such as the U.S. Army’s initiating the ENIAC electronic computer project at the University of Pennsylvania during wartime – over the serious objection of the U.S. scientific establishment led by Vannevar Bush, as discussed earlier – and establishing the Army Mathematics Research Center at the University of Wisconsin-Madison in peacetime as reviewed in Part 5 (i), suggest that flexibility of acquisition decision making at the level of the military services could benefit companies like Hewlett-Packard that held military contracts, directly or indirectly, in the weapons technology arena.

But such focuses on the part of Hewlett-Packard and its co-founder David Packard, namely on military research and development and their benefits to the company, were by the 1970s far removed from the aspirations of a younger generation of computer designers like Steve Jobs and Steve Wozniak.

At their first-ever published press interview, Jobs proclaimed that Apple Computer would like to donate computers to schools for the education of kids, so that “there would be an Apple in every classroom and on every desk” – an ambitious statement later recalled by tech writer Sheila (Clarke) Craven, who had conducted the interview in Apple’s start-up garage for her February 1977 article published in Kilobaud, The Small Computer Magazine:

““My interview with the two Steves took place while they were still in the folks’ garage,” Craven tells Business Insider. She remembers it this way:

One of the things Jobs told me was that they would make certain there would be an Apple in every classroom and on every desk, because if kids grew up using and knowing the Apple, they would continue to buy Apples and so would their kids. The computers would be donated by Apple Computer. I understand that when that article came out, orders starting pouring in, and Apple Computer was seriously launched.

At the time, Apple consisted of just the two Steves in Jobs’ parents’ garage. There was no office, Craven says. Craven spent four hours with the pair, including lunch. After Wozniak booted up the machine, Jobs loaded a game of Blackjack onto it to demonstrate its powers.”

(“This is the first news article ever written about Apple”, by Jim Edwards, May 5, 2015, Business Insider)

Quite differently from making computers for kids to learn at school and to play video games, the deputy defense secretary David Packard had held the stability of civil society as a steadfast priority; in 1971, he authored a Pentagon document justifying, on the ground of “constitutional exceptions”, the use of military rule in the United States to handle civil disturbances:

“… The United States has contingency plans for establishing martial law in this country, not only in times of war, but also if there is what the Defense Department calls “a complete breakdown in the exercise of government functions by local civilian authorities.” What’s more, there’s a little-known 1971 memorandum prepared by the deputy secretary of defense which also provides justification for military control similar to martial law.

… Martial law is expected to be proclaimed by the president, although “senior military commanders” also enjoy the power to invoke it in the absence of a presidential order, according to a Department of Defense Directive signed in 1981 by Deputy Secretary of Defense Frank Carlucci—who has since become the secretary of defense. Despite the existence of these martial law contingency plans, Justice Department spokesperson John Russell says martial law could never be invoked in the United States, pointing to “the Posse Comitatus Act, [which] bars the military from engaging in law enforcement.”

The Posse Comitatus Act provides small comfort, however, because another Pentagon document, authored by Deputy Secretary of Defense David Packard in 1971, cites two “Constitutional exceptions” to the act’s restrictions. …

The Packard directive claims that Congress intended for there to be an exception to Posse Comitatus “when unlawful obstructions or rebellion against the authority of the United States renders ordinary enforcement unworkable. . . .” Like the Carlucci document, Packard’s directive says turning over law enforcement to the army will “normally” require a Presidential Executive Order, but that this requirement can be waived in “cases of sudden and unexpected emergencies . . . which require that immediate military action be taken.”

During last year’s congressional hearings into the Iran-contra scandal, a brief reference was made to planning efforts by FEMA and by Lt. Col. Oliver North at the National Security Council to outline a martial law scenario premised upon, among other things, a domestic crisis involving national opposition to a U.S. military invasion abroad. …”

(“Could It Happen Here?”, by Dave Lindorff, April 1988, Mother Jones)

Judging from what the Reagan White House National Security Council official Lieutenant Colonel Oliver North planned for during the 1980s, namely the use of martial law to handle a crisis arising from domestic opposition to a U.S. military invasion abroad, and from what the 1971 Packard document had outlined, I can imagine that domestic military rule would have occurred in the Nixon era, or the Reagan era, had there been nationwide escalating anti-war protests like those of the Johnson era – as in Part 2, in 1965 Stephen Smale, later my Ph.D. adviser, was a leader in starting the UC Berkeley anti-war movement that inspired the growth of the national protests, but before long returned to concentrating on mathematical research.

Given Hewlett-Packard’s history and focuses as reviewed here, Bill Hewlett made sense when he later mused about not taking up Steve Wozniak’s personal computer idea in 1976: “You win some, you lose some”.

And I would ponder and wonder: what could the win be for Mr. Hewlett and for Mr. Packard to make it easier for “kids” to play the “Pong” game and the “Breakout” game?

Besides, for HP there was the irony that Wozniak was the son of a weapons development engineer at Lockheed – the company whose overspending of Pentagon money had prompted Nixon to bring in Packard to correct it.

As for Steve Wozniak, being a Hewlett-Packard loyalist at the time, he might not have taken adequate notice – but for Steve Jobs’s worldly enthusiasm and persistence – of the cultural evolution of Silicon Valley over the years, from the ethos of HP as its founding company, to Fairchild Semiconductor’s championing of civil and commercial industrial developments, and further to the plurality of start-up companies springing up since the late 1960s, as reviewed earlier.

So Hewlett-Packard didn’t help Wozniak in his quest to develop and market a personal computer. But paradoxically, neither did the Silicon Valley-trendsetting Intel – “the fairest of the ‘Fairchildren’” with its entrepreneurial spirit and egalitarian culture, and the inventor of the microprocessor – help Apple Computer with it, in the sense that although Wozniak based his initial personal computer design on the Intel 8080 microprocessor, it was too expensive to be used in the early Apple computers:

“Steve Wozniak made a decision very early on at Apple that would prove one of the company’s most fateful ever. When “Woz,” a prank-loving 26-year-old who loved to tinker with machines, designed the very first Apple computer, he decided to use a microprocessor called the MOS Technology 6502, based on the design of Motorola Inc.’s 6800, essentially because it was cheaper than anything else he could find. Intel’s 8080 chip was selling for $179 at the time, and Motorola’s 6800 fetched $175. The MOS Technology chip, made by a Costa Mesa, California, company, cost only $25. …

The decision to go with the Motorola technology was fateful, because Intel would gain the license from IBM to make the microchips that went into almost every IBM-compatible computer. Motorola was a big company in its own right, a giant in cellular phones and pagers. But Apple, which soon after that first design by Woz began using Motorola chips exclusively, became Motorola’s only sizable customer for personal computer microprocessors. Intel’s whole life, on the other hand, revolved around microchips. In fact, it had been a young Intel engineer named Marcian E. Hoff Jr. who had invented the microchip in 1971, making the PC revolution possible.”

(“They Coulda Been a Contender”, by Jim Carlton, November 1997, Issue 5.11, WIRED)

As quoted, Wozniak chose a $25 MOS Technology 6502 microprocessor, based on the design of Motorola’s $175 6800 microprocessor, rather than Intel’s $179 8080 microprocessor – a fateful decision with the consequence that Intel later became the microprocessor supplier for IBM but not Apple.

Wozniak had in mind making Apple computers affordable for students and teachers, with his first Apple computer debuting in a math class at Windsor Junior High School in Windsor, California:

“Before the iPod, the Macintosh, or even the formation of Apple Computer Company on April Fool’s Day 1976, there was the Apple I. Designed by Steve “Woz” Wozniak, then an engineer at Hewlett-Packard, it was less a personal computer than the bare essentials of one: the circuit board you see in the image at left is the Apple I (buyers had to hook up their own keyboards, displays, and power supplies). This computer, the very first Apple I made, was first used in a math class at Windsor Junior High School in Windsor, CA, in 1976 and donated to the LO*OP Center, a nonprofit educational organization run by Liza Loop. …

“… Wozniak pored over integrated-circuit specifications and engineered the Apple I so that different processes could share the same chips, reducing the overall part count. This, plus the use of cheaper items such as a $20 MOS Technology 6502 microprocessor rather than the more common $175 Motorola 6800, enabled him and Steve Jobs to offer the Apple I for the somewhat affordable price of $666.66… Loop, who is also the director of the History of Computing in Learning and Education Project, agrees: “Woz wanted this simple, low-cost design so that the Apple would be affordable for students and teachers.””

(“Hack: The Apple I”, by Daniel Turner, May 1, 2007, MIT Technology Review)

It appears that launching a great adventure on April Fool’s Day was not an unpopular move: it was the choice of Steve Jobs and Steve Wozniak in starting Apple Computer, and not just of Stephen Hawking who, as in Part 4, had his first popular book, A Brief History of Time, published on that day in 1988.

The earlier quote from the Wired magazine article cited MOS Technology, maker of the cheap microprocessor chip Wozniak selected, as based in Costa Mesa, which is in Southern California.

So would it not appear that, with all that had been achieved by major Silicon Valley semiconductor companies like Fairchild Semiconductor and Intel, in the 1970s Southern California was still more productive than Northern California in semiconductor chip production?

Not exactly, because MOS Technology was really based in, and its 6502 microprocessor designed and manufactured in, Valley Forge just outside of Philadelphia, Pennsylvania, even though a lower-power version, the CMOS 6502, was developed in Costa Mesa for use in “hand-held” applications:

“VALLEY FORGE, PA—Back in the fall of 1976, before there was any microcomputer market to speak of, Commodore International acquired MOS Technology to help its struggling calculator business. MOS had designed the 6502 chip, which is found today in innumerable microcomputers, including models by Apple, Atari and Hewlett-Packard.

What finally leaves the MOS plant is integrated circuits, which then go to Japan or California for final assembly into computers.

There is also a systems/assembly facility in Santa Clara, California (see related article on page 18). A support facility in Costa Mesa, California, is working on a CMOS version of the 6502 and 6500 series of microprocessors that will provide lower-power, “equal- or better-speed microprocessor” for “use in hand-held applications”…”

(“MOS Technology is Commodore’s ‘edge’”, by David Needle, April 26, 1982, Volume 4, Number 16, InfoWorld)

So it was near Philadelphia, the birthplace of the world’s first general-purpose electronic computer ENIAC, that the MOS 6502 was designed and manufactured so cheaply that it operated in so many microcomputers, including Apple computers, computers by Atari which Jobs and Wozniak had done video-game design for, and even Hewlett-Packard microcomputers!

Apparently Steve Wozniak started the personal computer trend so well that Hewlett-Packard itself subsequently entered the market, in 1980, with the HP-85 personal computer.

(“HP-85 personal computer, 1980”, Hewlett-Packard)

As mentioned earlier, microchips designed in the 1960s by Fairchild Semiconductor for the American moon-landing program’s Apollo Guidance Computers were also manufactured by a Philadelphia-based company, Philco.

In the year 1980 when HP entered the personal computer market, Apple’s annual sales were already in the tens of millions of dollars and growing fast; IBM was also jolted into immediate action; in that year, Apple Computer shares began trading on the stock market, instantly making Apple a billion-dollar company and making Jobs and Wozniak each worth some $100 million:

“The Apple II was first sold in 1978 and made $700,000 worth of sales that year. The following year, sales were $7 million, and year after $48 million. In 1980, sales doubled again and the Apple company went public, giving Jobs and Wozniak $100 million each.

IBM couldn’t ignore that. In July 1980, it set up a project to get into the personal computer business within a year. Because of the time constraint, the project team scrapped the usual IBM practice of making every major component in the computer themselves and decided to build their computer from standard components that anyone could buy.”

(David Manners and Tsugio Makimoto, Living with the Chip, 1995, Chapman & Hall)

Besides Jobs and Wozniak, Apple Computer’s stock market debut produced around 300 instant millionaires, about 40 of them Apple investors and employees; that was more millionaires than any other company had produced in history up to that point:

“On December 12, 1980, Apple launched its IPO (initial public offering) of its stock, selling 4.6 million shares at $22 per share with the stock symbol “AAPL” on the NASDAQ market.

The shares sold out almost immediately and the IPO generated more capital than any IPO since Ford Motor Company in 1956. Instantly, about 300 millionaires, some 40 of which are Apple employees and investors, are created. That is more millionaires than any company in history had produced at that time. Steve Jobs, the largest shareholder, made $217 million dollars alone.

By the end of the day, the stock had increased in value by almost 32% to close at $29, leaving the company with a market value of $1.778 billion.”

(“Apple IPO makes instant millionaires, December 12, 1980”, by Suzanne Deffree, December 12, 2015, EDN Network)

As quoted, Apple Computer raised more capital in the IPO, i.e., initial public offering, of its stock than any other company after Ford Motor Company in 1956.
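As a rough back-of-the-envelope check – purely illustrative, using only the figures given in the quote above – the IPO numbers are mutually consistent:

```python
# Sanity check of the quoted Apple IPO figures (December 12, 1980).
# All inputs come from the EDN Network quote above; this is a rough
# illustrative calculation, not an authoritative reconstruction.
ipo_price = 22.00        # dollars per share at the offering
close_price = 29.00      # closing price the same day
shares_offered = 4.6e6   # shares sold in the offering
market_value = 1.778e9   # quoted end-of-day market value

# First-day gain: (29 - 22) / 22, the quote's "almost 32%"
gain = (close_price - ipo_price) / ipo_price
print(f"First-day gain: {gain:.1%}")  # 31.8%

# Capital raised in the offering itself
proceeds = shares_offered * ipo_price
print(f"Capital raised: ${proceeds / 1e6:.1f} million")  # $101.2 million

# The quoted market value implies total shares outstanding far beyond
# the 4.6 million shares actually offered
implied_shares = market_value / close_price
print(f"Implied shares outstanding: {implied_shares / 1e6:.1f} million")  # 61.3 million
```

The roughly $101 million raised squares with the quote’s claim that the offering generated more capital than any IPO since Ford Motor Company in 1956.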

So, within a few short years after HP’s rejection of Wozniak’s personal computer idea, he and Jobs had surpassed Hewlett and Packard as successful industrial entrepreneurs – reaching a level of fame closer to that of Henry Ford and Robert McNamara.

After Fairchild Semiconductor and Intel, Apple started something in entrepreneurship and in technological progress with a revolutionary impact not only on Silicon Valley but on American consumers: according to U.S. Census Bureau data, by 1984, 8.2% of American households had a computer; 5 years later in 1989, the figure had grown to 15%; by 1993, it was 22.8%; it became 36.6% in 1997 and 42.1% in 1998; and by 2000, over half of American households, 51%, had a computer.

(“Home Computers and Internet Use in the United States: August 2000”, September 2001, U.S. Census Bureau)

Still, credit for Apple’s innovative success is due not only to Jobs, Wozniak and Apple’s employees, but also to others outside of Apple, such as Silicon Valley pioneer Douglas Engelbart who in the mid-1950s, as in Part 5 (i), had encountered the lack of interest in computer development on the part of UC Berkeley, Stanford and Hewlett-Packard, but persevered and in the early 1960s invented the computer mouse; after Engelbart, it was a group of graduates of Stanford’s interdisciplinary product design program who in the early 1980s turned the expensive and unreliable mouse into the inexpensive and very helpful user device Steve Jobs wanted:

“Dean Hovey was hungry. His young industrial design firm, Hovey-Kelley Design, had been working on projects for Apple Computer for a couple of years but wanted to develop entire products, not just casings and keyboards. Hovey had come to pitch Apple co-founder Steven Jobs some ideas. But before he could get started, the legendary high-tech pioneer interrupted him. “Stop, Dean,” Hovey recalls Jobs saying. “What you guys need to do, what we need to do together, is build a mouse.”

Hovey was dumbfounded. A what?

Jobs told him about an amazing computer, code-named Alto, he had just seen at Xerox’s Palo Alto Research Center (PARC). In early 1980, most computers (including Apple’s) required users to memorize text commands to perform tasks. The Alto had a graphical user interface—a symbolic world with little pictures of folders, documents and other icons—that users navigated with a handheld input device called a mouse. Jobs explained that Apple was working on two computers, named Lisa and Macintosh, that would bring that technology to market. The mouse would help revolutionize computers, making them more accessible to ordinary people. …

Just one problem: a commercial mouse based on the Xerox technology cost $400, malfunctioned regularly and was nearly impossible to clean. That device—a descendant of the original computer mouse invented by Douglas Englebart at the Stanford Research Institute in the early 1960s—was a masterpiece of high-concept technology, but a hopeless product. Jobs wanted a mouse that could be manufactured for $10 to $35, survive everyday use and work on his jeans.

“We thought maybe Steve wasn’t getting enough meat in his diet,” says Jim Sachs, a founding member of Hovey-Kelley, “but for $25 an hour, we’d design a solar-powered toaster if that’s what he wanted.” …

They did. The mouse’s evolution “from the laboratory to the living room,” as one of its designers puts it, is not well known—even some Apple fanatics aren’t familiar with it—but it reveals something of the personalities of its designers, the Stanford program that trained them and even the history of Silicon Valley. Everyone knows that the University has helped shape the region, but the influence is often described as a function of great individuals like Frederick Terman…

When Hovey-Kelley was asked to design the Apple mouse, the firm was a two-year-old start-up. Hovey and David Kelley, as well as most of the firm’s other early members, had met as graduate students in Stanford’s product design program. An interdisciplinary program that combines mechanical engineering, art and, often, math, physics and psychology, it was founded in 1958 by Robert McKim. …

… The Apple mouse transformed personal computing. Although the expensive Lisa flopped, the Macintosh, released in 1984, made the graphical user interface the industry standard. Microsoft responded with Windows, and its own mouse—also engineered by Jim Yurchenco. “We made a mouse mass-producible, reliable and inexpensive,” says Sachs, “and hundreds of millions of them have been made.””

(“Mighty Mouse”, by Alex Soojung-Kim Pang, March/April 2002, Stanford Magazine)

For his part, Steve Wozniak credited his success in building the first Apple computer to his having been “extremely lucky”:

“I built the first Apple prototype myself, before there was suggestion to start a company. I gave out schematics and code listings of it at the Homebrew Computer Club. …

In 1975 I decided to build a full computer, that would be able to run a programming language. In 1970 I’d told my dad that someday I’d own a 4K computer capable of running Fortran programs, which was my favorite high school pastime. We didn’t have computers in our high school, but my electronics teacher arranged for me to visit a company in Sunnyvale and program a computer there once a week. Due to not having money, I couldn’t consider a $370 Intel 8080 microprocessor. But MOS Technology came out with the 6502 for $20. More important, in a day when there was no store where you could actually buy a microprocessor, the new 6502 was to be introduced and sold over the counter at a show, Wescon, in San Francisco. All of this was extremely lucky for me.

So to build my first Apple computer (I’d actually built a smaller computer of my own design with no microprocessor, the “Cream Soda Computer,” in 1970) by joining my terminal (input and output) with the 6502 microprocessor and some RAM. I chose dynamic RAM whereas all the other cheap hobbiests chose static RAM. My goal in any design was to minimize the board space and chip count. Well, the new 4K-bit dynamic RAMs were the first RAMs to be cheaper, per bit, than magnetic core memories. It was a change in technology as significant as the scientific calculators of Hewlett Packard (which I helped design), which totally replaced slide rules.”

(“Letters-General Questions Answered”, last updated: July 18, 2001, Woz.org)

From what he said above, Wozniak’s extreme luck came from the MOS Technology microprocessor, which not only came onto the market just when he wanted to build a full computer but was also being shown at the Wescon show in San Francisco, so he could go buy it for $20 – at a time when other microprocessors were not only expensive but unavailable at any retail store.

Besides these, Wozniak mentioned several favorable factors: he had programmed computers since high school, with his electronics teacher’s help in accessing a computer at a Sunnyvale company; he had had chats with his father about computers; he had designed and built a small computer, the “Cream Soda Computer”, without a microprocessor in 1970; he had experience at Hewlett-Packard helping design scientific calculators; and, when he chose to use dynamic RAMs as the computer memory, there were the new 4K-bit dynamic RAMs, “the first RAMs to be cheaper, per bit, than magnetic core memories” – not unlike the MOS Technology microprocessor, a new yet affordable product.

Of these factors, I would say that the emergence of the MOS Technology microprocessor and of the new 4K-bit dynamic RAMs was “lucky” for Wozniak, because the timing of market availability and affordability of these new products coincided with his needs for them. The other factors were primarily consequences of his talent, skills and studious work: he got to do calculator design at HP because he had the technical skills and prior experience designing and building a small computer without a microprocessor in 1970, and that prior experience no doubt had benefited from his familiarity with computers, having programmed regularly in his high school years.

Of particular interest to my comment earlier about Howard Aiken’s intent of starting a microcomputer company long before the microprocessor was invented, Wozniak showed in 1970 that one could build a small computer without having a microprocessor.

In addition, though Wozniak did not delve into details in the above quote from his response to an email question, his “favorite high school pastime” of computer programming and later building a computer on his own probably had a family influence, given that his father, with whom he had discussions as quoted, was an electrical engineer at Lockheed Missiles and Space Company.

In fact, on other occasions Wozniak has described in great detail how he was passionately nurtured by his father during childhood and youth; his father tutored him on electronics from when he was 3 years old, and guided him to start his first electronics building project at the age of 6, something that instilled in him an exciting sense of superiority over other kids; from then on it was project after project, through elementary school and eighth grade, guided by his “single greatest influence” – his father:

“The other thing my dad taught me was a lot about electronics. Boy, do I owe a lot to him for this. He first started telling me things and explaining things about electronics when I was really, really young—before I was even four years old. This is before he had that top secret job at Lockheed, when he worked at Electronic Data Systems in the Los Angeles area. One of my first memories is his taking me to his workplace on a weekend and showing me a few electronic parts, putting them on a table with me so I got to play with them and look at them. …

… In fact, my very first project—the crystal radio I built when I was six—was really all because of my dad. It took me a very long time in my life to appreciate the influence he had on me. He started when I was really young, helping me with these kinds of projects.

Dad was always helping me put science projects together, as far back as I can remember. When I was six, he gave me that crystal radio kit I mentioned. It was just a little project where you take a penny, scrape it off a little, put a wire on the penny, and touch it with some earphones. Sure enough, we did that and heard a radio station. Which one, I couldn’t tell you, but we heard voices, real voices, and it was darned exciting. I distinctly remember feeling something big had happened, that suddenly I was way ahead—accelerated—above any of the other little kids my age. And you know what? that was the same way I felt years later when I figured out how resistors and lightbulbs worked.

All through elementary school and through eighth grade, I was building project after electronic project. There were lots of things I worked on with Dad; he was my single greatest influence.”

(Steve Wozniak with Gina Smith, iWoz: Computer Geek to Cult Icon, 2006, W. W. Norton & Company)

In the above quoted from his autobiography, “Woz” mentioned that his father’s work at Lockheed was a “top secret job”.

His dad was in the missile program and never told him any details about it, in an era when even the space program was “top secret”, as Woz explained:

“I did know Dad was an engineer, and I knew he worked in the missile program at Lockheed. That much he said, but that was pretty much it. Looking back, I figure that because this was in the late 1950s and early 1960s at the height of the Cold War, when the space program was so hot and top secret and all, probably that’s why he couldn’t tell me anything more about it. What he worked on, what he did every day at work, he’d say absolutely nothing about. Even up to the day he died, he didn’t give so much as a hint.”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

The space program alone could not have made Woz’s dad so tight-lipped all his life, could it? His father’s work was in Lockheed’s submarine-based Polaris missile program, mentioned earlier, as Woz later figured out:

“Now, on my own, I managed to put together little bits and pieces. I remember seeing NASA-type pictures of rockets, and stuff related to the Polaris missile being shot from submarines or something, but he was just so closemouthed about it, the door slammed there.

I tell you this because I’m trying to point out that my dad believed in honesty. Extreme honesty. Extreme ethics, really. That’s the biggest thing he taught me. …”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

As discussed earlier and in Part 5 (ii), Lockheed Missiles and Space Company was founded in 1956 in the nascent Silicon Valley region around the time of the semiconductor industry’s arrival there.

Woz’s father came to this company in 1958 from Southern California, when Woz was 7 years old, the family moving to a house on Edmonton Avenue in Sunnyvale, “in the heart of” Silicon Valley and in “the best climate in America”:

“We spent most of my early years in Southern California, where my dad worked as an engineer at various companies before the secret job at Lockheed.

But where I really grew up was Sunnyvale, right in the heart of what everyone now calls Silicon Valley. Back then, it was called Santa Clara Valley. I moved there when I was seven. … Our street, Edmonton Avenue, was just a short one-block street bordered by fruit orchards on three of four sides. …

When I think of that street, looking back, I think it was the most beautiful place you could imagine growing up. It wasn’t as crowded back then, and boy, was it easy to get around. It was as moderate of temperature as anywhere else you could find. In fact, right around the time I moved there—this was 1958—I remember my mother showing me national articles declaring it to be the best climate in America. …”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

Woz told how his father, “an extremely good teacher and communicator”, gave him “classical electronics training” from the beginning starting when he was 7:

“Because my dad was an engineer, there were all kinds of interesting things lying around my house. And when you’re in a house and there are resistors lying around everywhere, you ask, “What’s that? What’s a resistor?” And my dad would always give me an answer, a really good answer even a seven-year-old could understand. He was just an extremely good teacher and communicator.

He never started out by trying to explain from the top down what a resistor is. He started from the beginning, going all the way back to atoms and electrons, neutrons, and protons. He explained what they were and how everything was made from those. I remember we actually spent weeks and weeks talking about different types of atoms and then I learned how electrons can actually flow through things—like wires. Then, finally, he explained to me how the resistors work—not by calculations, because who can do calculations when you’re a second grader, but by real commonsense pictures and explanations. You see, he gave me classical electronics training from the beginning. …”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

With his father’s training and guidance, Woz excelled in science projects at elementary-school age. In a 2012 article, he described his accomplishments in those years:

“I enjoyed early electronic kits with buttons and buzzers but that was a mild start which could have gone in other directions. Science fair projects in elementary school, really solidified my direction. A couple of simple projects, a flashlight apparatus with rubber bands instead of solder, and an electrolysis project were also not determining of my real interest. But I found a journal in a hall closet with descriptions of binary numbering and logic gates and storage devices.

When I discovered that a 9-year old could understand this stuff, I knew it would be my passion forever. I didn’t think there were jobs in computers but I would love them as a pastime. This interest was solidified by large construction projects (ham radio transmitter and receiver from kits after learning and getting my license), atomic electron orbital display (92 lights, 92 switches, tons of relays, some diodes for logic), a tic-tac-toe computer (about 100 transistor circuits for rules that I made from playing games, although later in life I minimized it to about 10-20 rules for simplicity), a 10-bit binary adder-subtractor. In no case did I copy existing logic or circuits and that forced me to learn it all well.”

(“How Steve Wozniak Became the Genius Who Invented the Personal Computer”, by Steve Wozniak, July 17, 2012, Gizmodo)

Very impressively, Woz not only played the tic-tac-toe game but built a “tic-tac-toe computer” with about 100 transistor circuits for playing the game – well over a decade before building his own version of the early commercial video game “Pong” and designing the first version of the “Breakout” video game, as mentioned earlier. Most impressively, as he recalled, Woz did the projects by learning the material well and working on his own, not copying existing designs.

In building his “tic-tac-toe computer”, Woz benefited not only from his father’s teaching on the logic of transistors and circuits and on how to build them, but also from the assistance of none other than the Silicon Valley-trendsetting Fairchild Semiconductor, which gave Woz a large quantity of “cosmetic defects” transistors for free, courtesy of his father:

“In sixth grade my father taught me how transistors work, leading into logic circuits. I learned how to fashion OR gates from resistors or diodes, AND gates from diodes, and invertors from transistors.

Although they were still expensive, my dad got local transistor companies (Fairchild) to donate “cosmetic defects” of hundreds of diodes and transistors to me. My dad taught me how gates could make decisions based on inputs. He said how you could combine all the inputs of a tic-tac-toe game (which squares had “X”, which had “O” and which were empty) and gates could decide the best response. Unfortunately I didn’t come up with a great simplification (along the lines I used for my 6th graders last year) and it took hundreds of gates laid out on a 3’ by 4’ piece of plywood with components soldered to nails. I tried hard but couldn’t get the “tic-tac-toe computer” into the 6th grade science fair.”

(“THE WOZ INTERVIEW!”, by Auri Rahimzadeh, 1995, Woz.org)
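The gate logic Woz describes – each square’s contents as boolean inputs, with AND/OR combinations of them deciding the best response – can be sketched in modern terms. This is a minimal illustration of that idea, with a rule set and board encoding of my own invention, not a reconstruction of his actual hundreds-of-gates circuit:

```python
# A minimal sketch (with my own rule set and encoding, not Wozniak's
# actual circuit) of the idea he describes: each square's contents are
# boolean inputs, and AND/OR combinations of them select a response.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),    # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),    # columns
         (0, 4, 8), (2, 4, 6)]               # diagonals

def respond(board):
    """board: 9-char string of 'X', 'O', '.'; returns the index 'O' plays.

    Rule priority mirrors gate priority: complete our own line first,
    then block the opponent's line, otherwise take the first empty square."""
    def two_of(line, mark):
        cells = [board[i] for i in line]
        return cells.count(mark) == 2 and cells.count('.') == 1

    for mark in ('O', 'X'):                  # win first, then block
        for line in LINES:
            if two_of(line, mark):
                return next(i for i in line if board[i] == '.')
    return board.index('.')
```

In hardware, each `two_of` test corresponds to AND gates over the square inputs, and the rule priority to an OR/priority chain – which is why, without a great simplification, the gate count grew into the hundreds.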

Steve Wozniak was born on August 11, 1950, as in his Encyclopædia Britannica biography cited earlier. Since a 6th grader is typically 11-12 years old, Woz’s “tic-tac-toe computer” was probably made in 1961-1962. That would be the time when Howard Aiken retired from Harvard and most likely discussed with Cuthbert Hurd starting a “microcomputer computer company”, with a Lockheed assistant director of engineering doing the design work – in Sunnyvale, where Aiken was a consultant at Lockheed Missiles and Space Company.

Aiken’s goal unfulfilled in his lifetime would eventually be achieved by a Lockheed Missiles and Space Company engineer’s son, a 6th grader at that earlier time but already making a special “tic-tac-toe computer”.

In his high school years, Wozniak learned not only programming but designing computers on his own, as he recalled in the 2012 article cited earlier:

“In high school I got to program a computer and came across a manual for an existing minicomputer. I took my elementary school logic experience and tried to teach myself how to design a computer, given its architecture. I had no books on how to do this. I shut my door and worked alone. After a few tries over months, I had a pretty decent design with the chips of the day.

Then I started designing every minicomputer made. I’d design them over and over, making a game to save parts. I had no books but came up with good techniques because it was for myself. …”

(Steve Wozniak, July 17, 2012, Gizmodo)

The first machine Woz designed and built that was close to a real computer was an Adder/Subtractor in his 8th-grade year. To Woz’s disappointment, it won him only an honorable mention at the Bay Area Science Fair, where his experience made him feel that the competition’s judging outcomes were unfairly predetermined:

“The Adder/Subtractor wasn’t more complicated in terms of size or construction time than the tic-tac-toe machine, but this project actually had a goal that was closer to real computing. A more important purpose than tic-tac-toe. …

My project had a function, a real function that was useful. You could input numbers, add or subtract one, and see your answer.

But here’s the thing. I took it down to the Bay Area Science Fair one night, to set it up before the day of judging. Some people showed me where to put it and asked me if I’d like to tell them about it. I told them no, figuring that I’d just tell them the story on judging day. By then I’d gotten kind of shy. Looking back, I think I may have turned down the judges without knowing it.

When I showed up on judging day, all the projects already had their awards. The judging had already happened somehow! I had an honorable mention, and there were three exhibits that had higher awards than mine. I saw them and remember thinking they were trivial compared to mine, so what happened? I then looked in the fair brochure and those three were all from the school district that was putting on the fair.

I thought, Hey, I’ve been cheated. But that night, I showed the machine and talked to lots of people—including, I’m sure, the real judges—and it seemed like they really understood how big my project was, I mean, it was great and I knew it and everyone knew it. I was able to explain how I’d used logic equations and gates and how I’d combined gates and transistors with binary number (1s and 0s) arithmetic to get the whole thing working.”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

Fortunately for Woz, the U.S. Air Force gave him its top award for the Bay Area Science Fair. The ‘VIP’ treatment that came with the award gave this 13-to-14-year-old a real boost of confidence as well as his “love for flying”; as Woz recalled in his autobiography, “that Adder/Subtractor was such a key project in my getting to be the engineer who ended up building the first personal computer”:

“After that, the Air Force gave me its top award for an electronics project for the Bay Area Science Fair, even though I was only in eighth grade and the fair went up to twelfth grade. As part of the award, they gave me a tour of the U.S. Strategic Air Command Facility at Travis Air Force Base. And they gave me a flight in a noncommercial jet, my first-ever flight in any plane. I think I might have caught my love for flying then.

When I look back, that Adder/Subtractor was such a key project in my getting to be the engineer who ended up building the first personal computer. This project was a first step to that. …”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

As noted earlier, in his autobiography Wozniak referred to his Lockheed missiles engineer father as his “single greatest influence”; in awarding Woz for the 8th-grade invention that would become his first step towards being “the engineer who ended up building the first personal computer”, the U.S. Air Force likely saluted not only Woz’s accomplishment but also his father’s engineering career associated with the Air Force.

In fact, his father had helped him come up with the initial idea of building the Adder/Subtractor, according to Woz in an interview over a decade before his autobiography:

“In eighth grade my dad showed me a book of computer reports which were all interesting to me. I learned Boolean Algebra basics there. The book had logic diagrams of a binary adder (1 bit with carry in and out) and of a binary subtractor. We came up with the idea of building one. …”

(Auri Rahimzadeh, 1995, Woz.org)
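The logic diagrams Woz describes – a 1-bit binary adder with carry in and out, plus a subtractor – correspond to what is now called a ripple-carry adder/subtractor. Here is a hedged sketch using software booleans in place of his transistor gates; the function names and little-endian bit ordering are my own choices:

```python
# A 1-bit full adder built only from AND/OR/XOR gate operations,
# chained into the kind of multi-bit adder/subtractor Woz's 8th-grade
# project realized in hardware. (Illustrative sketch, not his schematic.)

def full_adder(a, b, cin):
    """One bit of addition: returns (sum, carry-out), gates only."""
    s = a ^ b ^ cin                      # XOR gives the sum bit
    cout = (a & b) | (cin & (a ^ b))     # AND/OR give the carry-out
    return s, cout

def add_sub(x_bits, y_bits, subtract=False):
    """Ripple-carry add, or subtract via two's complement,
    on equal-length little-endian bit lists (bit 0 first)."""
    carry = 1 if subtract else 0         # +1 completes two's complement
    out = []
    for a, b in zip(x_bits, y_bits):
        if subtract:
            b ^= 1                       # invert y's bits when subtracting
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out                           # final carry-out is discarded
```

For example, 3 + 2 is `add_sub([1, 1, 0], [0, 1, 0])`, giving the bits of 5; setting `subtract=True` reuses the exact same adder gates for subtraction, which is the economy that made one “Adder/Subtractor” machine possible.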

Courtesy of his father, during his high school years Fairchild Semiconductor, which had given him “cosmetic defects” transistors for his 6th-grade tic-tac-toe computer, continued to provide Woz with help in designing computers:

“… One time in high school, I was trying to get chips for a computer I’d designed. My dad drove me down to meet an engineer he knew at Fairchild Semiconductor, the company that invented the semiconductor. I told him I’d designed an existing minicomputer two ways. I found out that if I used chips by Sygnetics (a Fairchild competitor), the computer had fewer chips than if I used Fairchild chips.

The engineer asked me which Sygnetics chips I’d used.

I told him the make and model number.

He pointed out that the Sygnetics chips I’d used in the design were much larger in physical size, with many more pins and many more wires to connect, than the equivalent Fairchild chips.

I was stunned. Because he made me realise in an instant that the simpler computer design would really have fewer connections, not simply fewer chips. So my goal changed, from designing for fewer chips to trying to have the smallest board, in square inches, possible.

Usually fewer chips means fewer connections, but not always.”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

As Wozniak recalled, even in high school years he already had the goal of designing computers that were small in size, and a conversation with a Fairchild Semiconductor engineer gave him the right technical perspective in the subject matter.
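The Fairchild engineer’s point – that fewer chips does not always mean fewer connections or a smaller board – can be illustrated with a quick tally. The chip and pin counts below are invented for illustration and are not from the source:

```python
# Hypothetical numbers (not from the source) illustrating the Fairchild
# engineer's point: a design with fewer chips can still need more
# connections if each chip is physically larger with more pins.

def total_pins(design):
    """Sum of pins to wire up, for a design given as (chip count, pins per chip)."""
    return sum(count * pins for count, pins in design)

design_a = [(30, 14)]   # 30 small 14-pin chips
design_b = [(20, 24)]   # 20 larger 24-pin chips: fewer chips...

fewer_chips = 20 < 30                                   # True
more_connections = total_pins(design_b) > total_pins(design_a)  # also True
```

Here design B wins on chip count (20 vs 30) but loses on connections (480 pins vs 420) – hence Woz’s shift from minimizing chips to minimizing board area in square inches.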

Thus, naturally, designing and building a microcomputer or personal computer was the next step for Wozniak when the technological components, most importantly the microprocessor, became available, as he recalled:

“… The Data General NOVA came out and had a very different architecture which wound up taking half as many chips, due to being designed around available parts very well. it was a very structured architecture. I told my dad I’d someday have a 4K NOVA (enough for a programming language). He said it cost as much as a house down payment. Throwing down the gauntlet, I countered that I’d live in an apartment.

The day I discovered that microprocessors were much like these minicomputers I’d designed back in high school, the formula for an affordable 4KB computer popped into my head instantly, remembering my old goal. Thankfully I’d been through a lot of stages leading up to this, building games around my TV (the only free output device) and terminals for the Arpanet. You do a lot when you love electronics and have little chance of ever having a girlfriend or wife.”

(Steve Wozniak, July 17, 2012, Gizmodo)

As Wozniak indicated, even after he became an experienced amateur computer designer, he continued to consult his father on important projects he would like to do.

And of course, as discussed earlier, in order to persuade Steve Wozniak to start Apple Computer with him, Steve Jobs went to persuade Wozniak’s family first – no doubt, as I have reviewed, Woz’s father was the key!

One reason why Woz’s electrical engineer father had so much influence on his son’s development in the computer field may be that the father was in fact a pioneer in the field of integrated circuits, as Woz recalled:

“Because my father was involved with the earliest IC’s in regard to his lockheed work, I went to trade shows when only 10 years old and saw the first chips with 2 transistors on one chip of silicon (germanium), and the promise of 6 to 10 transistors on a chip in the near future. Over the years, my father had manuals around the house that caught my attention, with the early IC’s. During high school I discovered minicomputer manuals and started putting chips together to make computer designs. I was totally self taught in this regard, designing alone in my room.”

(Auri Rahimzadeh, 1995, Woz.org)

Recall that the integrated circuit was invented independently by Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments in 1958-1959. In Woz’s case, when he was 10 years old – presumably in 1960 or 1961, not long before starting his tic-tac-toe computer project – his father was involved in IC development at Lockheed and he got to attend trade shows with his father; then over the years, his father brought home various IC manuals and minicomputer manuals, and Woz got to read them and learn to design computers alone in his room.

As reviewed earlier, the Harvard early computer pioneer Howard Aiken’s later consultancy for Lockheed Missiles and Space Company focused on switching circuit design. I would not be surprised if, in fact, Aiken was a consultant for some of the projects Wozniak’s father was a member of – as quoted earlier from Woz’s autobiography, due to secrecy his father never told him much about the work.

A very relevant and intriguing question arising from my review of this history is: had Aiken and Cuthbert Hurd – with the Lockheed assistant director of engineering already doing the computer design work – gone forward in the 1960s with starting a new company, would Woz’s father have been a prospective top engineer to be recruited to this “first microcomputer computer company in the world”?

I understand that Woz’s father may have been only one of many engineers Lockheed had in this field, and not as distinguished as Aiken, the consultant with a prominent pioneer status. So what would be the chance that this one engineer, with a talented young son learning to do electronics projects at home, would be chosen for such a brave new venture in those early years?

One may reason that Steve Wozniak’s father was already famous, in some way, and that this could be a factor favoring him. In his autobiography, Woz told of his father’s former fame at the California Institute of Technology as the best football quarterback that university ever had:

“According to my birth certificate, my full name is Stephan Gary Wozniak, born in 1950 to my dad, Francis Jacob Wozniak (everyone called him Jerry), and to my mom, Margaret Louise Wozniak. My mother said she meant to name me Stephen with an e, but the birth certificate was wrong. So Stephen with an e is what I go by now.

I forgot to mention before that my dad was kind of famous, in his own way. He was a really successful football player at Caltech. People used to tell me all the time that they used to go to the games just to see Jerry Wozniak play. …

Once, at De Anza, my quantum physics teacher said, “Wozniak. That’s an unusual name. I knew a Wozniak once. There was a Wozniak who went to Caltech.”

“My father,” I said, “he went to Caltech.”

“Well, this one was a great football player.”

That was my father, I told him. He was the team’s quarterback.

“Yes,” the teacher said. “We would never go to football games. but at Caltech, you had to go just to watch Jerry Wozniak. He was famous.”

You know, I think my dad was the one good quarterback Caltech ever had. He even got scouted by the Los Angeles Rams, though I don’t think he was good enough to play pro. Still, it was neat to hear from a physics teacher that he remembered my dad for his football. It made me feel like I shared a history with him. The teacher once brought me a Caltech paper from back in those days with a picture of my dad in his uniform. …”

(Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

As told, when Woz was a De Anza College student his quantum physics teacher had known his father Jerry Wozniak from the father’s star football quarterback days at Caltech.

Okay, the fame wasn’t in engineering; but it was rare for a former university star football quarterback to become an engineer in the IC/computer fields at a military aerospace company, and it likely made Jerry Wozniak more noticed than his fellow engineers in one respect.

And as Woz’s De Anza quantum physics teacher said, “Wozniak” was “an unusual name” – as a matter of fact it was unusual in the context of the history of early electronic computers.

The first general-purpose electronic computer, the development of which involved Howard Aiken’s competitor John von Neumann, prominent mathematician who has been regarded as “the father of computers” as in Part 5 (i), was named ENIAC for the full name: Electronic Numerical Integrator And Computer.

As ENIAC was being completed in the mid-1940s, the project group at the University of Pennsylvania’s Moore School of Electrical Engineering planned for the next computer to be named EDVAC; soon von Neumann, returning to the Institute for Advanced Study at Princeton, started the IAS computer project; and of particular interest, by 1950 the National Bureau of Standards developed the computers SEAC and SWAC as reviewed in Parts 5 (i) & (ii).

Most of these early computer names end with “AC”; but ENIAC was the only one ending with “NIAC”, and the name Wozniak is similar in the sense that it also has two syllables, the second of which, “niak”, differs from “NIAC” by only the last letter and is pronounced the same.

While it appeared a random coincidence that a person with a name similar to that of the famous first electronic computer became an engineer in the computer field, and that his son bearing that name later became famous for developing the first personal computer, there were other interesting coincidences.

As in Part 5 (i), the RAND Corporation in Santa Monica, the Cold War think-tank where von Neumann was a leading strategist, in the early 1950s developed a computer following von Neumann’s design, and named it in honor of von Neumann as JOHNNIAC, i.e., also ending with “NIAC”. JOHNNIAC’s full name is: John v. Neumann Numerical Integrator and Automatic Computer.

(“JOHNNIAC”, Wikipedia)

It looks interesting that, of the various early electronic computers with different names, the one named after “the father of computers” John von Neumann also had a name ending in “NIAC”, similar to “niak”, just like the first one von Neumann was involved in developing.

Was there some sort of pattern?

There were only 3 other early electronic computers that had names ending in “NIAC”: MANIAC I, II & III. Their full names are: Mathematical Analyzer Numerical Integrator and Computer I, II & III.

(“List of vacuum tube computers”)

By inspecting the name acronyms, one can see that many end with “AC” because their full names end with “and Computer” or “Automatic Computer”, whereas the few ending with “NIAC” also have “Numerical Integrator” immediately before it.

Like JOHNNIAC, MANIAC I was based on von Neumann’s computer design, and the other two were subsequent improvements: MANIAC I & II were built at the Los Alamos National Laboratory in the 1950s, and MANIAC III in the 1960s at the University of Chicago.

(“MANIAC I”, “MANIAC II”, and, “MANIAC III”, Wikipedia)

So these “NIAC”s all had influence from von Neumann in one way or another.

But the influence was more concrete. Besides von Neumann’s taking part in ENIAC’s development, and JOHNNIAC’s naming in his honor, the influence was also a result of von Neumann’s leading roles in both computer development and numerical computing for nuclear bomb development; when ENIAC was completed, von Neumann brought together his ENIAC colleagues and his colleagues at the Manhattan Project, and one of the latter, the physicist Nicholas Metropolis, took up computing on ENIAC for hydrogen bomb development, co-invented the Monte Carlo method of statistical computing, and then started the MANIAC computer project:

“In 1942 and 1943, Metropolis accepted an appointment as a research instructor at the University of Chicago, where he worked with James Franck. Franck was a Nobel Laureate in physics, having received the award with Gustav Hertz in 1925 for discovering the laws that governed the impact of an electron upon an atom.

In early 1943, Robert Oppenheimer convinced Metropolis to come to Los Alamos. His first assignment was to develop equations of state for materials at high temperatures, pressures, and densities.

During World War II, scientists at Los Alamos used slow, clanking, electromechanical calculators when designing the first atomic weapons. These calculators proved fragile, and soon Metropolis and Richard Feynman were spending some of their time repairing these calculators.

At the end of World War II, mathematician John von Neumann brought together the developers of the first electronic computer, known as ENIAC, and several Los Alamos scientists, Metropolis among them. It then fell upon Stanley Frankel and Metropolis to develop a problem for ENIAC to solve: in 1945, the two men had the computer run complex calculations involving the design of the first hydrogen bomb.

Metropolis returned to Chicago, where he continued to work with ENIAC. Using the germ of an idea conceived by Enrico Fermi some 15 years earlier, Metropolis in 1948 led a team that carried out a series of statistical calculations on ENIAC. These statistical calculations would become collectively known as the Monte Carlo method of calculation, which since then has helped address issues such as traffic flow, economic problems, and the development of nuclear weapons.

… The Mathematical Numerical Integrator and Computer—MANIAC for short—became operational on March 15, 1952. …”

(“The Metropolis Fellowship: Who Was Nick Metropolis?”, Issue 2, 2011, National Security Science, Los Alamos National Laboratory)
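For modern readers, the statistical principle behind the Monte Carlo method named in the quote above can be sketched in a few lines: random samples are drawn, and the fraction of them satisfying some condition approximates a quantity of interest. The following minimal Python sketch estimates π this way; it is only an illustration of the principle, not a reconstruction of any Los Alamos computation.

```python
import random

def estimate_pi(samples: int, seed: int = 0) -> float:
    """Estimate pi by drawing random points in the unit square and
    counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The fraction inside approximates the quarter-circle area pi/4.
    return 4.0 * inside / samples

print(estimate_pi(1_000_000))  # approaches 3.14159... as samples grow
```

The same sampling idea, applied to far harder physics problems than this toy one, is what made the method valuable for nuclear weapons design and, later, for traffic flow and economic problems as the quote notes.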

And so the “NIAC”s received such names because of these early electronic computers’ goal of enabling large-scale numerical computing, as in ENIAC and the MANIACs from “Numerical Integrator and Computer”, and JOHNNIAC from “Numerical Integrator and Automatic Computer”.

John von Neumann’s interest and expertise in numerical computing is what had brought him into the World War II Manhattan Project in the first place, and the need for greater computing power then brought him into the ENIAC project, here quoted again from the science historian Liesbeth De Mol but with more details than previously in Part 5 (i):

“Von Neumann got particularly interested in computers for doing numerical calculations in the context of theoretical physics and thus understood, quite early, that fast computing machines could be very useful in the context of applied mathematics.

In 1943, during World War II, von Neumann was invited to join the Manhattan project – the project to develop the atomic bomb – because of his work on fluid dynamics. He soon realized that the problems he was working on involved a lot of computational work which might take years to complete. He submitted a request for help, and in 1944 he was presented a list of people he could visit. He visited Howard Aiken and saw his Harvard Mark I (ASCC) calculator. He knew about the electromechanical relay computers of George Stibitz, and about the work by Jan Schilt at the Watson Scientific Computing Laboratory at Columbia University. These machines however were still relatively slow to solve the problems von Neumann was working on. But then he accidentally met Herman Goldstine at Aberdeen railwaystation. While waiting for their train on the platform, Goldstine told him about the top-secret ENIAC project at the Moore school [13]. Von Neumann got very excited, and Goldstine made arrangements (providing the necessary clearance document) so that von Neumann could visit the ENIAC. …”

(“Doing Mathematics on the ENIAC. Von Neumann’s and Lehmer’s different visions”, by Liesbeth De Mol, in E. Wilhelmus, I. Witzke (eds.), Mathematical practice and development throughout History, 2008, Logos Verlag)

Jerry Wozniak probably had nothing to do with numerical computation. But as reviewed earlier, he was an engineer working at various companies in Southern California, where the Cold War think tank RAND Corporation and its JOHNNIAC were based, when he was recruited in 1958 to the Lockheed Missiles and Space Company, founded in Northern California in 1956, which would become a top developer of nuclear missiles. Moreover, von Neumann, who had died in February 1957, had been the U.S. Air Force’s “principal adviser” on nuclear weapons, including on intercontinental ballistic missiles (ICBM), as previously quoted in Part 5 (i):

“… The principal adviser to the U.S. Air Force on nuclear weapons, Von Neumann was the most influential scientific force behind the U.S. decision to embark on accelerated production of intercontinental ballistic missiles. …”

(“Passing of a Great Mind: John von Neumann, a Brilliant, Jovial Mathematician, Was a Prodigious Servant of Science and His Country”, by Clay Blair, Jr., February 25, 1957, Volume 42, Number 8, Life)

To be specific, what Lockheed Missiles and Space Company produced, as noted in Woz’s autobiography and in my earlier review of Howard Aiken’s Lockheed consultancy, were the more specialized Polaris nuclear missiles, submarine-launched ballistic missiles known as SLBM rather than ICBM:

“Polaris missile, first U.S. submarine-launched ballistic missile (SLBM) and the mainstay of the British nuclear deterrent force during the 1970s and ’80s.”

(“Polaris missile”, Encyclopædia Britannica)

In any case, when the Los Angeles area-based Lockheed Corporation recruited the local Caltech’s former star football quarterback Jerry Wozniak to its new missiles company in Northern California in 1958, it could have been a pure coincidence that his name had “niak” like “NIAC”, but conceivably it could have occurred to the Lockheed management that bringing an engineer with a name resembling those of computers closely associated with von Neumann into the computer field in the missiles program could be a modest ‘memorial rite’ in honor of the recently deceased “great man” – as dubbed by the early computer science pioneer Louis Fein in Part 5 (ii) – and a scientific leader of the computer and nuclear missile fields.

In this ‘memorial rite’ scenario, the fact that the 3 MANIAC computers had been developed by von Neumann’s former colleague named “Metropolis” could be relevant because, in a similar wordplay, the name Wozniak is ‘Polish’:

“August 11, 1950 – Steve Wozniak (Born)

Steve Wozniak, also known as “Woz,” is a Polish American computer engineer who invented the Apple I and Apple II computers. His invention of the Apple personal computer led to the largest computer revolution in history.”

(“Museum’s Historic Reflections Project Part 2”, August/September 2014, The Polish American News, Polish American Cultural Center, Philadelphia)

I also note that Jerry Wozniak was hired by Lockheed in 1958 when Howard Aiken was already a Lockheed consultant and Aiken’s close friend Louis Ridenour was a top Lockheed executive, having been a director at its Missile System Division, as discussed earlier.

In other words, the former U.S. Air Force chief scientist Ridenour, a Caltech Ph.D. in physics, may have had a hand in the hiring of a famous fellow Caltech alumnus, star football quarterback Jerry “Wozniak”, to become an engineer in computer technology in the new nuclear missiles program in Northern California, for which Aiken was a prominent consultant.

But then when Ridenour unexpectedly died in his sleep, in May 1959 in Washington, D.C. after an evening of drinking with Aiken as reviewed earlier, Aiken likely found that his clout and options suddenly diminished; that is, starting “the first microcomputer computer company in the world” became harder without the help of his powerful and influential friend Louis Ridenour.

Subsequently, when Jerry Wozniak stayed in the secretive missiles program, i.e., did not get to join any company to develop the microcomputers as Aiken wanted to do, and then later his son became famous for inventing the personal computer, the little ‘memorial rite’, if true, may have turned into a monstrous ‘secret ritual’.

It would have been a pretty good ‘memorial rite’ in the 1960s had Aiken started the “first microcomputer computer company in the world” with the help of Cuthbert Hurd and some Lockheed engineers, and with Jerry Wozniak in a significant engineering role: besides giving some publicity to a “Wozniak”, it would have satisfied Aiken’s desire for leading the development of a new generation of electronic computers – he had been beaten by von Neumann at developing the first generation – and doing so with a “niak”, like “NIAC” as in ENIAC, working under him.

But as extensively reviewed earlier, despite Aiken’s desire such a new computer company was never launched, probably because Hurd was either intellectually sympathetic to von Neumann and dismissive of Aiken’s interest in getting rich, or could not agree on ownership terms with Aiken, or because IBM, where Hurd had been a key executive in computer development, did not favor a new Aiken computer venture as bad feelings from IBM’s 1940s collaboration with Harvard and Aiken on the Mark I persisted.

As I have commented earlier, that both the playboy businessman and influential IBM board director Sherman Fairchild and the Nobel Prize laureate William Shockley got to start new Silicon Valley companies but the Harvard-retired, business goal-driven Howard Aiken could not, did not seem fair to the legendary, albeit conservative, early computer pioneer.

Fortunately, while still in Southern California Jerry Wozniak had begun tutoring his 6-year-old son “Woz” on electronics, and by the 1960s when Aiken and Hurd did not go through with their plan to start a new computer company, the teenage Woz was well on his way to growing into a prolific amateur computer designer.

Along the way, Fairchild Semiconductor gave Woz significant help, the U.S. Air Force gave him an award that critically boosted the young boy’s confidence, and the future would look bright for the next generation of “Wozniak”.

Then in 1970, when Aiken and Hurd again talked about starting a new computer company, this time to develop “a microprocessor and personal computer”, in that same year the 19-to-20-year-old Steve Wozniak had already built a small computer of his own, the “Cream Soda computer” – without a microprocessor, as the latter was only then being invented at Intel, as discussed earlier.

From that point on, technologically speaking, what Woz needed to develop the world’s first popular and commercially successful personal computer was the arrival of the microprocessor plus good dynamic RAM, as earlier cited from him.

But before that became reality, in early 1973 Steve Wozniak moved an important step forward when he got his “dream job” designing calculators at Hewlett-Packard, as he later recalled:

“Now, finally, there was a time in my life—a time right after that third year at Berkeley—that I finally got my dream job. But it wasn’t a job building computers. It was a job designing calculators at Hewlett-Packard. And I really thought I would spend the rest of my life there. That place was just the most perfect company.

This was January of 1973, and for an engineer like me, there was no better place to work in the world. Unlike a lot of technology companies, Hewlett-Packard wasn’t totally run by marketing people. It really respected its engineers. And that made sense, because this was a company that had made engineering tools for years—meters, oscilloscopes, power supplies, testers of all types, even medical equipment. It made all the things engineers actually used, and it was a company driven by engineers on the inside so far as what engineers on the outside needed. Man, I loved that.

… the HP 35 was the first scientific calculator, and it was the first in history that you could actually hold in your hand. …”

(iWoz, by Steve Wozniak with Gina Smith, 2006, W. W. Norton & Company)

After receiving, in his earlier years through his father’s connections, help from Silicon Valley’s technologically and culturally trendsetting company Fairchild Semiconductor, in January 1973 Woz officially became a designer at the Silicon Valley-founding Hewlett-Packard, and so was now ready for Silicon Valley – except that the company he wanted to stay at for life would repeatedly turn down his requests to develop the personal computer there, and it was his friend Steve Jobs on the outside who persuaded Woz’s family to convince him to do it with Jobs, on their own.

Two months after Wozniak’s hiring by HP, on March 14, 1973, Howard Aiken died in his sleep at a hotel in St. Louis, Missouri, as reviewed, during a consulting trip to Monsanto, having just quit consulting for Lockheed while persisting in his quest to make computers smaller.

Recall that in Part 5 (ii), a critical interview with the early computer science pioneer Louis Fein, conducted by Pamela McCorduck on May 9, 1979, has been extensively quoted and reviewed, giving significant glimpses into academic politics in the early years of the emergence of computer science as an academic discipline, from the mid-1950s to the early 1960s.

I met Pamela in the mid-1980s after becoming acquainted with her husband Joseph Traub, who had in that earlier year, namely 1979, founded the computer science department at Columbia University.

In around 1990, Joe told me that Caltech gave him “one of their Fairchild Scholars”, i.e., a Sherman Fairchild Distinguished Scholarship for an academic visit and a short period of stay at Caltech.

Now reading a Caltech brochure I notice that, surprise, a Sherman Fairchild Foundation grant started its Fairchild Scholars program in 1973 – the year Steve Wozniak got his dream job at Hewlett-Packard and Howard Aiken died – and the scholarship was the original idea of Caltech professor Francis Clauser – someone with the same first name as Woz’s father, Francis Jacob Wozniak:

“This program was established back in 1973 by the gift of $7.5 million from the Sherman Fairchild Foundation. It was named in honor of the founder of the Fairchild Camera and Instrument Corporation and of Fairchild Industries, a man who would himself have been an ideal Fairchild Scholar. He was a pioneer – and an indefatigable inventor – in the fields of photography, aviation, and sound engineering.

Under the terms of the grant, the money was to be used over a period of ten years to underwrite the costs of visits to the Caltech campus of distinguished scholars or of young persons of outstanding promise from the worlds of academia, industry, and government. The appointments were to be made for periods ranging from a term to a year. Francis Clauser, Clark Blanchard Millikan Professor of Engineering, Emeritus, who originally suggested the idea, pointed out how much the members of the Caltech community would benefit from the opportunity to interact with the world’s intellectual leaders. And, of course, the sharing of wisdom and ideas would go both ways.”

(“The Fairchild Scholars Program”, 1981, Engineering & Science, Caltech Office of Public Relations)

In the timeline of this history, the establishment of the Sherman Fairchild Distinguished Scholars program at Caltech brought together Jerry Wozniak’s alma mater and the inventor status of the businessman who had financed the start of Fairchild Semiconductor – where the integrated circuit was subsequently co-invented – in the same year, 1973, when Jerry’s son Woz became officially employed as a calculator designer at Hewlett-Packard.

This timeline was significant to the Wozniak family because Fairchild Semiconductor had given Woz important help in his personal growth into an amateur computer designer, and Jerry, a former star football player at Caltech, had been involved in early integrated circuit development – not at Fairchild Semiconductor, the place of its original invention, but in the secretive Lockheed missiles program.

I hope in 1973 the newly interred Aiken did not take it as a slight by Sherman Fairchild – largest shareholder and powerful board member of IBM, a company excelling and leading in building computers even though its original collaborator, namely Howard Aiken on Mark I at Harvard, did not get to go further.

No, it couldn’t be a put-down by Sherman Fairchild, because he had passed away on March 28, 1971 – just 10 days short of his 75th birthday, in comparison to Aiken’s death 5 or 6 days past his 73rd birthday of March 8 or 9, as cited in Part 5 (ii).

(Frank and Suanne Woodring, 2007, Arcadia Publishing)

Still I would remark, in relation to an observation in Part 5 (ii) regarding my first arrival in America and the death of Diana Forsythe, that Sherman Fairchild had lived twice as many final March days as Howard Aiken did.

(Continuing to Next Part)
