
AI is power hungry. Can new solutions, infrastructure sate the appetite and avert an electricity crunch?

Noah Gao
By some estimates, power consumption by global data centers will equal Japan's annual electricity usage by next year. Technology faces a critical challenge.

Editor's note:

This article is the second of a two-part series examining how foundational infrastructure – particularly computing power and energy use – will shape the future of artificial intelligence.

Artificial intelligence has emerged as the crown jewel of modern technology. Yet behind the promise of smarter systems and automated efficiencies lies an escalating energy challenge that threatens both the environment and global power infrastructure.

In recent years, the data arms race has evolved into a full-blown electricity contest. Data centers, which power the AI revolution, are consuming energy at an unprecedented rate.

Is Microsoft Corp's apparent retreat this year in data center expansion plans telling us something?

The tech giant confirmed this week that it is "slowing or pausing" some AI data center projects, including a US$1 billion project in Ohio.

Analysts have speculated that the retrenchment may reflect concerns about AI overcapacity and the vast amount of electricity that server farms consume.

According to the International Energy Agency, data centers worldwide consumed approximately 460 terawatt-hours of electricity in 2022, a figure expected to hit 1,000 terawatt-hours by 2026 – roughly Japan's entire annual electricity consumption.

In the US, data center electricity use could rise from 4.4 percent of national electricity use in 2023 to between 6.7 percent and 12 percent by 2028, according to McKinsey reports.

Europe faces a similar trend, with data center power demand projected to nearly triple by 2030. This escalation strains existing grids, potentially leading to instability and increased costs for everyday consumers.

CFP

The construction site for Ada Infrastructure's Docklands data center in London. Microsoft is reported to have held off on committing to the project.

The clock is ticking

One thing is apparent: The need for grid upgrades and sustainable energy solutions is now critical.

Training large language models, such as GPT-4, requires substantial energy. The training run for its predecessor, GPT-3, was estimated to consume around 1,287 megawatt-hours – roughly equal to the annual electricity usage of about 120 average American households.
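The household comparison above can be sanity-checked with simple arithmetic. The per-household figure here is an assumption based on typical US residential consumption, not a number from the article:

```python
# Sanity check on the training-energy comparison.
# Assumption: an average US household uses roughly 10,500 kWh per year
# (US EIA residential figures hover around this value).
TRAINING_MWH = 1287              # estimated GPT-3 training energy
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumption, not from the article

households = TRAINING_MWH * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent to roughly {households:.0f} households' annual usage")
```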

Over the past decade, chip power consumption has significantly increased. Modern graphics processors, like Nvidia's H100, can draw up to 700 watts per unit. When these GPUs are clustered in server racks, the collective power consumption can exceed 100 kilowatts. As chip performance continues to improve, the demands on power supplies and cooling systems escalate accordingly.
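The rack-level figure above follows from straightforward multiplication. The server counts and overhead below are illustrative assumptions for a dense AI deployment; only the 700-watt per-GPU figure comes from the article:

```python
# Back-of-the-envelope rack power estimate for a dense GPU deployment.
GPU_WATTS = 700           # Nvidia H100 peak draw per unit (from the article)
GPUS_PER_SERVER = 8       # common HGX-style configuration (assumption)
SERVERS_PER_RACK = 12     # high-density rack (assumption)
OVERHEAD_WATTS = 3_000    # CPUs, memory, networking per server (rough assumption)

per_server_watts = GPU_WATTS * GPUS_PER_SERVER + OVERHEAD_WATTS
rack_watts = per_server_watts * SERVERS_PER_RACK

print(f"Per server: {per_server_watts / 1000:.1f} kW")
print(f"Per rack:   {rack_watts / 1000:.1f} kW")  # exceeds 100 kW
```

Even with conservative overhead assumptions, a densely packed rack clears the 100-kilowatt mark the article cites.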

Ironically, while AI promises to optimize energy use across industries – potentially boosting efficiency by an estimated 20 percent through smart grid management and streamlined industrial processes – it is also contributing to a stark climate paradox.

Tech giants like Google have seen their emissions increase in tandem with AI integration across their product lines, complicating their efforts to achieve carbon neutrality by 2030.

Data centers demand extensive cooling systems that not only consume vast amounts of energy but also strain global freshwater resources; cooling accounts for roughly 1.5 percent of the world's freshwater use. Innovations such as liquid cooling have improved power usage effectiveness, yet they bring new risks of chemical leaks and environmental contamination.

Diverse strategies

The surge in AI demands has forced data centers to adopt multifaceted strategies to slash power consumption.

In the hardware domain, GPUs are significant contributors to energy use, including high-end processors like Nvidia's A100 and H100.

Memory subsystem power consumption is also increasing rapidly as data volumes expand. US companies like Nvidia and Intel are leading research into dynamic voltage regulation. Intel's adaptive computing technology, for instance, offers significant power savings with a relatively small cost premium, providing a scalable solution across server farms.

CFP

GPUs, including high-end processors like Nvidia's A100 and H100, are significant contributors to energy use.

Cooling systems remain a critical target for efficiency improvements. Traditional air cooling methods, responsible for a substantial portion of a data center's energy use, are increasingly giving way to liquid cooling innovations.

Liquid cooling capitalizes on the superior heat capacity and conductivity of fluids, achieving notable energy savings compared with conventional systems.

Chinese firm Inspur's pre-fabricated full-liquid cooling architecture, which reduces "power usage effectiveness" to as low as 1.05, illustrates this trend despite increased capital costs.
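Power usage effectiveness (PUE) is simply total facility energy divided by the energy delivered to IT equipment, so 1.0 is the theoretical ideal. A minimal illustration, with made-up consumption figures, shows why the 1.05 figure above is notable:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (1.0 is ideal)."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers: the same 1,000 kWh IT load under two cooling regimes.
it_load = 1000.0
print(pue(1050.0, it_load))  # liquid-cooled hall: 1.05 -> 5% overhead
print(pue(1500.0, it_load))  # typical air cooling: 1.5 -> 50% overhead
```

At PUE 1.05, only 5 percent of the facility's energy goes to cooling and power delivery rather than computation, versus 50 percent overhead at a PUE of 1.5.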

Data transmission is yet another realm where energy savings are achievable. Long-haul power transmission incurs losses along the way, and edge computing, by relocating data processing closer to the source, has been shown to reduce them.

Optimizing applications

Parallel to infrastructure innovations, algorithmic and application-level optimizations are rapidly reshaping energy efficiency.

Deep learning models are evolving beyond brute-force scaling. Techniques such as "dynamic sparsification," as demonstrated by models like DeepSeek, substantially reduce energy consumption compared with dense models like GPT-4.

The so-called "mixture of experts" architecture offers another compelling case. By activating only the necessary "expert" subnetworks during training, it can potentially reduce training costs by up to 70 percent, relative to dense counterparts.

China's competitive edge in AI energy infrastructure is further reinforced by its robust hardware supply chains.

With commanding shares of global photovoltaic capacity, rare earth resources and lithium battery production, Chinese manufacturers control essential components that underpin both renewable energy systems and advanced computing hardware.

Tech giants are retooling their energy strategies to support a new era of data-driven AI innovation. Confronted with soaring power demands, companies like Microsoft, Amazon, Google, OpenAI, Meta, Apple and Tesla are embracing three strategic pillars: a nuclear power renaissance, a reconfiguration of green energy assets and breakthroughs in energy storage and management.

Nuclear options

Microsoft, for instance, inked an agreement last September with Constellation Energy to revive the Three Mile Island facility in the US, securing over 800 megawatts of carbon-free power for its cloud and AI operations over the next two decades. Despite its troubled past, the restarted reactor symbolizes a commitment to reliable, low-carbon energy.

Similarly, Amazon is betting on next-generation nuclear by investing in X-energy's advanced reactors, designed to provide efficient and safe power for its data centers. Google, meanwhile, has partnered with Kairos Power to procure 500 megawatts from small modular reactors.

OpenAI is also joining the nuclear push by backing Oklo, a developer of compact fast-fission reactors that promise enhanced safety and sustainability while generating significantly less nuclear waste.

In parallel with nuclear initiatives, tech giants are overhauling their renewable energy portfolios. Google has laid out an ambitious plan to develop a 200-gigawatt global solar matrix that will not only power its data centers but also reduce millions of tons of carbon emissions annually.

Meta has already achieved 100 percent wind power for its Nordic data centers, capitalizing on the region's abundant wind resources and sophisticated grid management systems.

Apple has taken a financial approach by issuing US$1.3 billion in green bonds to fund renewable projects, driving its supply chain toward carbon neutrality. And Tesla at its gigafactories is integrating solar panels and advanced energy storage systems.

Existing energy infrastructure

Meanwhile, Chinese companies seem to be more interested in leveraging existing energy infrastructure than in building new power generation facilities. China has been the world's largest electricity producer since 2011.

At sites like Zhangbei in northern China's Hebei Province, massive wind turbines generate renewable electricity that powers data centers 150 kilometers away, supporting Alibaba and other major Internet firms. These data centers, operated by Chindata Group, run primarily on green energy.

In energy-rich regions such as Hebei and the Inner Mongolia Autonomous Region, wind and solar resources are increasingly integrated with advanced storage systems, ensuring a stable power supply and lower costs. New technologies, from lithium battery storage to AI-driven monitoring, are further strengthening grid reliability.

CFP

A patrol drone used by State Grid Corporation of China.

China is redefining energy infrastructure to power its burgeoning AI industry by combining vast transmission networks, innovative data center realignment and nuclear advancements.

The strategy is centered around key initiatives that link power-rich sources in western areas of the country to computing centers in the east.

For example, hydropower generated in southwestern China's Yunnan Province is dispatched eastward to meet the energy demands of Guangdong Province in southern China.

The strategy also calls for shifting some high power-use data sites closer to energy-rich areas. In these western hubs, a high percentage of the energy mix comes from wind and solar power.

Complementing these grid innovations, China is advancing nuclear energy. The commercial operation of the Hualong 1 reactor, built entirely on indigenous technology, signals a major leap forward.

Meanwhile, the emerging Linglong 1 small modular reactor, now in its commissioning phase, promises flexible, distributed energy support ideal for data centers.

Competitive edge

China's competitive edge lies in its agile regulatory and industrial ecosystem.

Data center approval times in China can often be faster than in the United States.

Moreover, China wields formidable control over key segments of the energy supply chain, producing most of the world's solar panels and holding dominant shares of rare earth resources and global lithium battery output.

In tandem with these infrastructural strengths, China is achieving breakthrough performance in AI hardware. One example is Huawei's "Pangu" large-scale model, which significantly lowers energy consumption.

Yet challenges persist. One is advanced chip technology, vital for AI efficiency.

Domestic semiconductor firms lag behind global leaders like Nvidia, posing a bottleneck for AI computing power. While China has made strides in energy storage, innovations in next-generation battery technologies still trail international benchmarks.

China's blueprint may not be a one-size-fits-all solution, but it certainly lights the way forward.

AI is significantly transforming the energy industry. Machine learning optimizes processes in mining and oil prospecting, exploration and extraction. Smart grids utilize AI to adjust supply to match demand. AI-driven analytics are improving energy consumption patterns, reducing energy waste across various industries.

China's AI energy strategy is ambitious, but sustained leadership is not guaranteed. Overcoming technological dependencies, fostering effective global partnerships and maintaining innovation momentum will be crucial for its position in the evolving landscape of the AI-energy nexus.


Solar panels at a fishing village in the southwestern Guangxi Zhuang Autonomous Region. China has been the world's largest electricity producer since 2011.

(The author is founder of WisePromise, a boutique advisory agency specializing in the international expansion of Chinese tech companies in the advanced hardware and energy sectors. He also serves as a geo-economic expert for several think tanks in Beijing.)

