Red Cell: The Chimera of Technological Superiority

Why a New Offset Strategy Will Not Succeed

Red Cell #5
The pursuit of military dominance through technological superiority, amid rapidly diffusing dual-use technologies, is based on flawed and unproven assumptions

Media headlines and U.S. government reports both warn that China could soon surpass the United States as the world’s leader in critical emerging technologies — including artificial intelligence, quantum computing, robotics, and nanotechnology. To preserve American primacy, defense analysts argue that Washington needs to win the global technological race. But in an era of dual-use technological innovation, the pursuit of technological superiority is a chimera.

This approach is flawed for at least three reasons: First, technological superiority has seldom delivered a decisive military advantage in wars. Second, the global diffusion of technologies suggests that any advantage the United States might achieve will be fleeting, and any first-mover advantage will be small and expensive. Finally, the character of war is changing in ways that favor quantity over quality on the battlefield. To be sure, the United States needs to remain competitive with its rivals, but technology alone will not win future wars. Instead, military planners should focus on the hard work of developing a strategy and a set of operational concepts appropriate to the 21st-century environment.

The Red Cell Project

The Red Cell was a small unit created by the CIA after 9/11 to ensure that the analytic failure of missing the attacks would never be repeated. It produced short briefs intended to spur out-of-the-box thinking on flawed assumptions and misperceptions about the world, encouraging alternative policy thinking. At another pivotal time of increasing uncertainty, this project is intended as an open-source version, using a similar format to question outmoded mental maps and to apply “strategic empathy” in discerning the motives and constraints of other global actors, enhancing the possibility of more effective strategies.

Can the United States maintain military-technological superiority—the underpinning of enduring primacy—in a multipolar world where wealth, power, and technology are increasingly diffused? Or is this approach a chimera, based on the flawed logic that technological superiority is both decisive and attainable?

In recent years, a steady drumbeat of government and think-tank reports has sounded the alarm that U.S. dominance is fading, warning that China is harnessing emerging technologies such as robotics, artificial intelligence (AI), big data, nanotechnology, and advanced computing—the so-called Fourth Industrial Revolution—in a bid to “leapfrog” ahead of the U.S. military. As General Mark Milley, the chairman of the Joint Chiefs of Staff, wrote to Congress last year, China is “working every day to close the technological gap with the United States and our allies,” adding, “they intend to be a military peer of the U.S. by 2035.” Pitting the United States against China in a high-stakes race for technological supremacy, Michael Brown, director of the Pentagon’s Defense Innovation Unit, warned, “We need technological advantage to prevail in this strategic competition with China.”

From this perspective, growing threats to American power projection and all-domain military dominance are the result of Beijing’s closing the technological gap; those threats can therefore be reversed if the United States undertakes sustained and massive investments in emerging technologies while simultaneously denying China access to these same technologies. President Joe Biden’s ban on top-end AI and supercomputing chips and chipmaking equipment, cutting off China’s access, is a dramatic example of this logic. But this approach treats technological superiority as a strategy in itself rather than as an instrument of one.

Rethinking the Offset Strategy

Many analysts want to return to a Silicon Valley version of what they conclude has worked in the past. Eric Schmidt, an AI tech investor and former head of Google, and Robert O. Work, a distinguished fellow at the Center for a New American Security and former deputy secretary of defense, have made the most forceful case to date. They propose an ambitious new strategy—what they term the Offset-X strategy—which draws on Work’s vision of the “third” offset strategy during the Obama administration. To counter China, the Offset-X strategy envisions leveraging “emerging and disruptive technologies” and employing them “in ways that China will struggle to match or quickly duplicate.” This approach is based on the flawed logic that technological superiority is both decisive and attainable.

The Pentagon should reconsider the value of embarking on a quixotic quest for technological superiority, which is unlikely to preserve U.S. military dominance for at least three reasons:

First, the pursuit of technological superiority and an offset strategy is based on a set of untested assumptions about the relationship between technology and military effectiveness. When Work originally introduced the offset idea in 2014, proposing the “third” offset strategy, he emphasized the historical continuity between the need for a new offset and past successful efforts in the 1950s and 1970s, what he coined the “first” and “second” offsets, to substitute advanced technology for Soviet numerical superiority. The United States first used its technological advantage in strategic and tactical nuclear weapons and later leveraged cutting-edge technologies, specifically stealth, satellite-based communications, and precision-guided munitions, to enable its outnumbered forces to repel a Soviet conventional offensive without trying to match the Soviets tank-for-tank or person-for-person.

The underlying assumption of Offset-X is that the first two offsets succeeded and therefore ought to serve as a template for addressing growing Chinese and Russian military threats today. But those offsets (fortunately) never faced the ultimate test of a Soviet invasion of Western Europe, so it is difficult to assess how far a qualitative technological advantage would have compensated for inferior numbers of U.S. and NATO combat forces. Whether the West’s technological edge would have been sufficient to defeat a Soviet conventional offensive remains an open question.

In the absence of a definitive test of the theory in the 1970s and 1980s, advocates claimed that America’s lopsided victory in the 1991 Gulf War validated the offset model. Saddam Hussein’s Iraq was hardly the Soviet Union, but the swift success of U.S. precision-guided weapons got Beijing’s attention and incentivized its tech modernization efforts. In any case, the 100-hour rout of Iraqi forces in 1991 was so overdetermined—U.S. forces were substantially better trained and led than the Iraqi military—that drawing any definitive conclusions from the battlefield evidence would be specious.

More important, there are good reasons to doubt that technological superiority confers a significant military advantage. Contrary to popular belief, there is remarkably little historical evidence to support the proposition that better technology confers a decisive military advantage in wars. In Iraq and Afghanistan, the United States possessed overwhelming technological advantages, but translating that technological prowess into political success proved elusive. Of course, both wars are examples of counterinsurgencies rather than high-end conventional warfare, which currently dominates the Pentagon’s attention. Nevertheless, the evidence from great-power wars offers equal reasons for skepticism. In World War II, Nazi Germany led the field in technological developments, including tank design, jet aircraft, and cruise and ballistic missiles, but other factors—particularly superior numbers of Allied troops and weapons—mattered more. Simply put, technology matters, but far less than the Pentagon believes.

Second, today’s technological context is different from that of the first two offsets. As Audrey Kurth Cronin, distinguished professor of international security at American University, argues, these earlier offsets resulted from military-technological innovation that took place during periods of closed development, when government-funded weapons programs were walled off from the public. These programs mainly drove technological developments such as nuclear weapons, stealth fighter jets, or precision-guided munitions. These systems were expensive and difficult to build, and their use was mainly or exclusively limited to the defense sector, allowing governments to effectively control access through secret programs, security classifications, and restrictive copyrights.

Closed innovation made an offset strategy both feasible and highly desirable. According to the logic of the two earlier offsets, militaries with the most technologically advanced weapons should have a competitive advantage in battle and even wars, and they could attempt to prevent the transfer and proliferation of these military technologies. Indeed, U.S. military-technological advantages endured for decades before countries caught up to the United States in first- and second-offset technologies.

Today, however, in an era of open innovation, when dual-use emerging technologies are being asymmetrically redistributed, the United States cannot rely on an offset strategy and expect to come out ahead. In periods of open innovation, the commercial sector—rather than state-funded defense programs—drives technological progress. These technologies are cheap, easy to use, and designed for individuals and groups to adapt and employ. By combining “clusters of technologies together,” Kurth Cronin explains, users can “create new forms and uses, both good and bad—well beyond whatever their original inventors had in mind.” For example, Ukraine has used 3D printers to create fins for grenades, which it has then launched from commercial drones to destroy Russian positions. Open innovation thus levels the playing field and shifts development to commercial markets.

Today, private U.S. companies, such as Google, SpaceX, Amazon, and OpenAI, spend more on the research and development of new technologies than does the federal government. Commercial firms are also working hard to make these technologies both cheaper and more capable. The $52.7-billion CHIPS and Science Act and the $369-billion Inflation Reduction Act (IRA) will accelerate high-end semiconductor manufacturing, as well as renewable and electric vehicle technology and R&D more broadly. Instead of proliferating relatively slowly like nuclear weapons, stealth, and the precision capabilities of the first and second offsets, these technologies will diffuse quickly and widely through commercial processes and be repurposed for military applications. The widespread use of commercial drones and quadcopters on battlefields across the globe, from Syria to Ukraine, is a harbinger of things to come. In this age of open innovation, any technological advantage the United States might achieve will be fleeting, and any “first-mover advantage” will grow smaller and more expensive.

Finally, the character of war is indeed changing—in ways that promise to fundamentally alter the relationship between quality and quantity. Swarms of low-cost autonomous systems will bring a return of mass to the battlefield, undermining the effectiveness of an offset strategy based on qualitative technological superiority. In this operating environment, better technology will only be able to compensate for less mass to a point. To be sure, the United States needs to remain technologically competitive with its rivals, and by all accounts U.S. tech policy is seeking to maximize U.S. capacity in new and emerging technologies. At the same time, military planners need to recognize that innovation is neither predictable nor micromanageable.

Technology Is Not Strategy

No other nation places as much faith in technology as does the United States. The spirit of invention and discovery has powered America’s economic and geopolitical strength, and it is deeply embedded in U.S. military culture. But technology is not strategy.

Some will undoubtedly argue that the Pentagon should strive to maintain technological superiority, if only because China is doing the same in pursuit of its own military-technological edge. Nonetheless, general-purpose technologies like AI, which have a range of applications across industries, are not like a tank or fighter jet. These are “enabling technologies” akin to electricity or the steam engine, and any military advantage derived from them will be determined by how military organizations use them, rather than by access to those technologies alone. The United States did not lose the wars in Vietnam or Afghanistan because its technology was inferior.

The United States ought to remain competitive and invest in emerging technologies, as the administration’s CHIPS and IRA legislation seeks to do, but the Pentagon keeps chasing the next “shiny object,” as if technology itself were strategy rather than a tool of national power. This technological fetishism only obscures the real challenge: shaping a strategy for a world in transition at a time of unprecedented technological change.
