History offers a guide to winning our growing ‘chip war’ with China


In the midst of a semiconductor shortage that has caused hundreds of billions of dollars in economic damage, and an accelerating “chip war” with China over the future of semiconductor technology, the United States government is trying to strengthen support for the semiconductor industry through legislation such as the recently adopted CHIPS Act.

But for this effort to succeed, it must heed the lessons of history. Today’s push to help chipmakers isn’t the first time governments have poured money into semiconductor development. And the track record of past efforts indicates that it will be far more effective to focus on funding scientific research and development, providing a market for speculative technology, and ensuring that academics and start-ups have access to funds and manufacturing equipment to test new products than to try to support specific companies or technologies. These are the strategies that have propelled the US chip industry in the past, when more forceful interventions have often produced disappointing results.

The chip industry emerged from the Cold War arms race of the 1950s, when the Department of Defense sought miniaturized computing power for missile guidance computers. Even when it was the industry’s biggest customer, however, the US government had trouble predicting where business and technology trends were heading. Many agencies were more optimistic about an alternative approach called “molecular electronics,” which quickly fizzled, while failing to see the promise in the integration of tiny circuits that led to today’s chips. In 1959, when small companies like Fairchild Semiconductor and Texas Instruments (TI) were making the first integrated circuits, a US military study visited 15 companies and research labs, including TI and Fairchild, but found no evidence that these two firms were about to pioneer a new industry.

Although the industry’s most crucial early innovations all met defense demand, many occurred outside of government-funded programs. Neither of the engineers credited with simultaneously inventing the chip in the late 1950s at Fairchild and TI, for example, was conducting research on a government contract.

Public procurement helped not by dictating the development of specific technologies, but by setting priorities – miniaturizing computing power – and making it clear that the government was ready to buy almost anything that met that need. The first two major chip orders in the early 1960s were for guidance computers in the Apollo spacecraft and the Minuteman II missile. Unlike civilian customers, NASA and the Pentagon were willing to pay high prices for small-volume production runs, which accelerated the development of the chip industry.

Crucially, NASA and the Pentagon also held open competitions to source integrated circuits – competitions that included both established electronics giants and start-ups such as Fairchild. The established companies consistently underperformed, delivering chips late or not at all. The decision to guide the Apollo spacecraft to the moon using Fairchild’s integrated circuits – an untested product made by a little-known company – reflected how the government relied not on heavy-handed mandates but on clear performance goals, market competition, and a willingness to spend heavily to build more accurate rockets.

The chip industry grew beyond its beginnings as a niche defense business primarily due to market forces. Start-ups like Fairchild were determined to bring their chips to consumer markets because they had no other way to grow. The military’s demand for guidance computers was limited, but consumer demand for computing power was already beginning to grow exponentially in the 1960s.

Fairchild founder Robert Noyce had started his career working on a defense contract at Philco – a major radio producer – during which he concluded that military research contracts were stifling the kind of innovation needed to develop consumer products. So, although its contract with the Apollo program helped get Fairchild off the ground, Noyce quickly turned to consumer markets. By 1968, 75% of the chips sold were used in civilian goods, from corporate computers to hearing aids.

Although chips were invented in the United States, by the end of the 1970s Silicon Valley faced new competition from Japanese rivals, prompting calls for government help. Japanese companies like Toshiba and NEC had learned to produce memory chips as advanced as Silicon Valley’s, but at lower prices and with far lower defect rates. One study found that Japanese chipmakers averaged one-tenth as many defects as a major US chipmaker.

As US companies lost market share, many analysts attributed Japan’s success to its industrial policy. The US debate focused on Japanese government support for corporate research and development (R&D), such as the VLSI program, which pooled R&D funds from the government and several large Japanese companies. Total spending on the VLSI program was modest, roughly equivalent to the R&D budget of a single major US chipmaker like TI. Nonetheless, the program loomed large in American thinking, and it eventually prompted the US government to create a comparable government-backed research consortium, Sematech, in 1987.

The government recruited Noyce, who had co-founded both Fairchild and Intel, to run Sematech. He focused the organization on supporting American semiconductor manufacturing equipment companies against their Japanese rivals. About half of Sematech’s budget in the late 1980s went toward producing advanced lithography machines, a crucial type of chip-making tool that had been pioneered in the United States but was, by the late 1980s, mainly produced by three companies in Japan and the Netherlands. Noyce considered rescuing American lithography companies to be Sematech’s primary measure of success.

Yet his efforts did not prevent major US companies in the industry from going bankrupt or being taken over by foreign rivals; without effective business models and sales capabilities, no amount of government support could save them.

Sematech’s other efforts to boost production of chip-making tools had mixed results. Former executives of Applied Materials, the largest maker of semiconductor tools, say Sematech had virtually no impact on their business, for example.

Sematech’s greatest success came from coordinating “roadmaps,” through which major chipmakers, toolmakers, chip design software companies, and the other firms that made the products chipmakers needed could align their plans, ensuring that each new generation of chip technology had the tools and software required for mass production. This reflects the kind of government program that has had the greatest positive impact on the semiconductor industry: not heavy-handed industrial policy, but public-private partnerships that identify technological challenges and then let private companies find commercially viable ways to meet them.

The Pentagon’s Defense Advanced Research Projects Agency (DARPA) offers another example of this approach. Rather than trying to prop up commercial industry, DARPA projects provided an opportunity to turn new ideas into prototype products, tackling technical challenges that all chip companies faced. For example, in the late 1970s, DARPA recognized that the increasing complexity of chips would soon make them impossible to design by hand. Having identified this bottleneck, it funded university research programs on automated design tools. The start-ups that emerged from those programs eventually became the three companies that dominate chip design software today.

Similarly, DARPA realized that the rising cost of manufacturing chips made it harder for academics to test new ideas, as the cost of each test chip increased. So it backed a program that allowed researchers to use commercial chip-making facilities to fabricate their designs, expanding research and prototyping. These efforts helped ensure that while cost pressures and subsidies from foreign governments have drawn new semiconductor manufacturing facilities overseas, the designs, software, and machine tools needed to produce chips are still largely made in the United States.

As federal and state governments once again pump funds into the chip industry, this history of industrial policy can serve as a guide to what would be most – and least – effective. Funding workforce development, basic science, and pathways that turn ideas into prototypes are all policies that helped build the US chip industry in the past. Heavier-handed efforts to rescue specific companies or to bet on particular commercial technologies, by contrast, have not worked before and will not help America win the chip war today.
