
Interwoven frontiers: Energy, AI, and US-China competition

Summary

The future for energy and emerging technologies is as interwoven—and at times as fraught—as the U.S.-China relationship.

Full Text

The fates of breakthrough technologies and the energy to power them are deeply interwoven, with progress at scale in one difficult to advance without the other.

Nowhere is this interconnection more consequential—and more dynamic—than in the U.S.-China relationship, which is marked by intense political tensions and competition, but also by under-appreciated technical overlap.

Given the inescapably interconnected nature of digital and energy systems, and of U.S. and Chinese progress therein, current and future U.S. administrations will need to consider both together, making conscious decisions about how technology investments may drive new frontiers in energy production, and how new energy innovations will be essential to progress in applied and cross-cutting technology priority areas like artificial intelligence (AI).

The race to develop, control, and power AI offers a particularly clear lens through which to observe these intertwined dynamics, as technological developments fuel competition, and where that competition shapes each country’s energy choices.

The policy moves Washington and Beijing make in prioritizing demand drivers like AI and in turn confronting the energy demand challenges AI creates; dealing with the security consequences of smarter grids; and investing in global market expanders like standardization all have strategic consequences for both countries’ competitiveness in energy and technology.

While the United States will face a range of choices when addressing the dual imperatives of maintaining energy security and driving technological innovation, most important to success will be recognizing the extent to which one depends on the other.

This dynamic is emblematic in AI, as this paper will explore, with similar dynamics likely to play out across technological innovations, and for energy security, on which so many sectors rely.

Putting into place a clear strategy—one that recognizes this interplay and harnesses the multipliers of private sector investment—will confer not just strategic advantage but could also offer platforms for multilateral and even bilateral cooperation, on the basis of mutual interest, as shared concerns about AI animate policy discussions in both capitals.

The AI demand driver

AI systems in their present form are notorious energy consumers, a demand burden that the U.S. and Chinese grids will increasingly bear and that may strain long-standing projections of energy demand.

While AI by no means accounts for the bulk of growing energy demand globally, its significant needs, rapid emergence, and mass diffusion offer a window into a new and notably disruptive dynamic for clean energy plans around the globe.

Straining demand.

Prior to the development of AI systems at scale, a relatively stable dynamic existed between the increasing need for energy-intensive data centers and the efficiency of the machines within those data centers.

As a result, in the decade before modern AI systems came of age (roughly 2005-2016), the energy being consumed by U.S. data centers was relatively flat, as efficiency counterbalanced the growing number of data centers.

However, as the world’s most data-intensive and data-center-operating firms like Amazon, Alphabet’s Google, and Meta’s Facebook began to seize the business benefits of advanced machine learning—the underlying science that would drive “generative” tools like ChatGPT in recent years—the necessary shift to particularly energy-intensive hardware led energy consumption to more than double in the five years that followed (2017-2023).

The demand spikes caused by AI take many forms.

Training advanced AI systems creates needs among single, large power consumers that may rapidly outstrip any ability to serve them, let alone with the clean energy they increasingly seek.

While estimates vary wildly, parsing available research and companies’ own statements suggests that training (which is to say, building for consumer use) today’s cutting-edge models has required sustained power draws on the order of tens of megawatts per model.

The “scaling laws”1 that many AI companies are using to project future demand and acquire energy capacity suggest a multi-gigawatt annual need in just a few years to develop and sustain next-generation systems.

This energy need would translate into the training of a single widely used “frontier” AI model consuming perhaps as much electricity as 5 million U.S. households use in a year.
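
As a rough, purely illustrative sanity check on that household comparison, the sketch below converts a sustained multi-gigawatt draw into annual energy and divides by an average U.S. household’s yearly consumption. The 5 GW draw and the roughly 10,500 kWh household figure are assumptions chosen for illustration, not values reported in the sources above.

```python
# Back-of-envelope check of the "millions of households" comparison.
# All inputs are illustrative assumptions, not reported figures.

SUSTAINED_DRAW_GW = 5.0                  # assumed multi-gigawatt sustained draw
HOURS_PER_YEAR = 8_760
AVG_US_HOUSEHOLD_KWH_PER_YEAR = 10_500   # rough average annual U.S. household use

annual_gwh = SUSTAINED_DRAW_GW * HOURS_PER_YEAR                      # GW * h -> GWh
household_equivalents = annual_gwh * 1e6 / AVG_US_HOUSEHOLD_KWH_PER_YEAR  # GWh -> kWh, then divide

print(f"Annual energy at {SUSTAINED_DRAW_GW:.0f} GW sustained: {annual_gwh / 1_000:.1f} TWh")
print(f"Equivalent U.S. households: {household_equivalents / 1e6:.1f} million")
```

Under these assumed inputs, the result lands in the low millions of households, broadly consistent with the comparison above.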

Increases in the efficiency of the chips used to train most AI systems, as well as in how those processors are used, could prevent a worst-case scenario of runaway consumption—but with longer “training runs” becoming the norm, those gains may have difficulty offsetting the growth in the computational intensity/duration of a training process itself.
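
A simple compounding sketch shows why efficiency gains may struggle to keep pace. The growth and improvement rates below are illustrative assumptions, not measured trends; the point is only that when compute requirements grow faster than efficiency improves, net training energy still rises quickly.

```python
# Illustrative compounding of training-compute growth vs. efficiency gains.
# Both rates are assumptions chosen only to show the dynamic, not measurements.

COMPUTE_GROWTH_PER_YEAR = 4.0     # assumed yearly growth in frontier training compute
EFFICIENCY_GAIN_PER_YEAR = 1.3    # assumed yearly gain in useful compute per joule

energy_multiplier = 1.0
for year in range(1, 6):
    energy_multiplier *= COMPUTE_GROWTH_PER_YEAR / EFFICIENCY_GAIN_PER_YEAR
    print(f"Year {year}: frontier training energy ~{energy_multiplier:,.0f}x today's level")
```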

Yet a less appreciated but also significant demand driver will be the cost of actually using AI systems to perform functions, so-called “inference” (querying an AI model, whether to answer a text-based question, generate an image, and so on).

What many would think of as a “standard” search engine inquiry, made instead with a large language model (LLM), is widely estimated to require 10 times the electricity to deliver a similar result—though, without providing details or baselines, some in the industry have claimed simple queries demand far less.
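
To see why that per-query multiplier matters at web scale, the minimal sketch below multiplies an assumed per-query energy by an assumed global daily search volume. The 0.3 Wh baseline and the 9 billion queries per day are illustrative assumptions drawn from commonly cited but unverified estimates, not figures from this article’s sources; only the roughly 10x multiplier comes from the discussion above.

```python
# Illustrative aggregate impact of a ~10x per-query energy multiplier at search scale.
# Per-query energy and daily query volume are assumptions for illustration only.

STANDARD_SEARCH_WH = 0.3     # assumed energy per conventional search query (Wh)
LLM_MULTIPLIER = 10          # the roughly 10x figure discussed above
QUERIES_PER_DAY = 9e9        # assumed global daily search-scale query volume

def daily_gwh(wh_per_query: float, queries_per_day: float) -> float:
    """Convert per-query watt-hours at a given daily volume into GWh per day."""
    return wh_per_query * queries_per_day / 1e9  # 1 GWh = 1e9 Wh

baseline = daily_gwh(STANDARD_SEARCH_WH, QUERIES_PER_DAY)
llm_backed = daily_gwh(STANDARD_SEARCH_WH * LLM_MULTIPLIER, QUERIES_PER_DAY)

print(f"Conventional search: ~{baseline:.1f} GWh/day")
print(f"LLM-backed search:   ~{llm_backed:.1f} GWh/day")
print(f"Added demand:        ~{llm_backed - baseline:.1f} GWh/day")
```

Run with those assumptions, the added demand works out to roughly a gigawatt of continuous draw.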

Yet models are capable of much more than just text generation, with capabilities such as image or video generation demanding more computation per request and, in turn, increasing the energy used to generate an output.

Precise estimates of inference cost depend not just on the query’s type and complexity but on the number of “parameters” the underlying model (such as ChatGPT or Meta’s Llama) has.

Though the landscape changes daily, as of this writing, capable AI models range from just under 10 billion parameters for the smallest, to tens to hundreds of billions for so-called “multimodal” models capable of handling and outputting images in addition to text, to an estimated more than 1 trillion for the largest such models.

Energy usage does not perfectly track the increase in parameters, but it does provide directional indications, with additional fluctuations based on factors like whether computation needs to be spread across multiple processors within data centers.
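
One way to make that directional relationship concrete is a first-order estimate tying inference energy to parameter count: a dense transformer performs roughly two floating-point operations per parameter for each generated token, and dividing by an assumed delivered hardware efficiency yields energy per token. The efficiency figure and model sizes below are assumptions for illustration, not measurements of any named model, and the rule of thumb ignores factors such as mixture-of-experts designs (which activate only a fraction of parameters per token) and the cross-processor spreading noted above.

```python
# First-order, directional estimate of inference energy vs. parameter count.
# Rule of thumb: a dense transformer needs roughly 2 FLOPs per parameter per token.
# The delivered efficiency and model sizes are illustrative assumptions only.

DELIVERED_FLOPS_PER_JOULE = 1e11   # assumed system-level efficiency, including
                                   # memory-bound decoding and facility overhead

def joules_per_token(parameters: float) -> float:
    """Approximate energy to generate one token with a dense model of this size."""
    return 2 * parameters / DELIVERED_FLOPS_PER_JOULE

for label, parameters in [("~10B-parameter model", 1e10),
                          ("~100B-parameter model", 1e11),
                          ("~1T-parameter model", 1e12)]:
    per_token = joules_per_token(parameters)
    per_response_wh = per_token * 500 / 3_600   # 500-token reply, joules -> watt-hours
    print(f"{label}: ~{per_token:.2f} J/token, ~{per_response_wh:.2f} Wh per 500-token reply")
```

Under these assumptions, the largest models land in the low single-digit watt-hours per response, roughly in the spirit of the 10-times-a-search figure cited earlier, while smaller models come in well under that.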

Increasingly, incurring that energy cost may also be inescapable for consumers, as companies embed LLM-driven results into even basic web searches.

As a result, which models become the “default” for queries at a global scale (e.g., when an iPhone user does a web search in their phone’s browser) or are even formally standardized (as discussed below) can matter immensely for aggregate AI energy consumption.

Enduring strategic focus.

A pivotal question is whether AI will remain of strategic significance, particularly in the U.S.-China relationship, on the decade-long timescale on which many energy decisions are made.

Based on current evidence, it is unlikely that the race to develop better, more capable AI systems will be diminished in the second Trump administration.

Despite some critics’ expectations of another “AI winter” or that these technologies will fail to deliver national-level economic impact, the second Trump administration’s personnel and policy choices give little indication that it anticipates a cooling of the issue in the context of foreign policy.

President Donald Trump’s appointment of several technology figures—some with significant experience working on AI in the first Trump administration—on the day-one White House team, combined with the creation of an elevated White House AI (and crypto) czar within the organizational hierarchy, suggests AI’s continuity as a high-tier policy concern.

In the administration’s opening weeks, it released an executive order stating “we can solidify our position as the global leader in AI,” which sought to review (but not wholesale stop the work of) President Joe Biden’s main AI order.

Days later, it devoted Vice President JD Vance’s first international speech entirely to the topic of AI.

Much was made of that speech’s calls to reconsider the prior administration’s safety-focused approach to AI.

Yet beyond the headlines, notable continuity prevailed.

On the characterization of the challenge, Vance pointed to how “hostile foreign adversaries have weaponized AI,” just as Biden focused on “use of AI systems by adversaries and other foreign actors.”

Both also drew attention to energy as a key vector of that AI competition, with last-minute moves by the Biden administration to try to accelerate the growth of energy infrastructure for U.S. AI firms echoing Vance’s claims that “we stand now at the frontier of an AI industry that is hungry for reliable power.” Vance continued, “AI cannot take off unless the world builds the energy infrastructure to support it,” and he even opined that “it will help us create and store new modes of power in the future.” In the geopolitical competition for AI, energy is envisaged, across administrations, as critical.

In China, President Xi Jinping has consistently cited various constructions of how “China attaches great importance to the development of AI” and must “aim at the commanding heights of future science and technology and industrial innovation,” focusing in particular on “artificial intelligence” and “new energy.”

While the tie between energy and AI development has not yet been stressed as explicitly in Beijing’s rhetoric as in Washington’s, the relationship between the two was cited in Xi’s calls for a “major strategic deployment for building a new power system” and the “empowerment of new technologies.”

Beijing’s AI push, for its part, has continued despite significant headwinds, namely the crackdowns on China’s technology sector in recent years.

Leading AI capabilities continue to emerge from researchers, startups, and established firms alike

...
