DeepSeek called a net positive for data centers despite overcapacity worries


The global race to develop sophisticated, versatile AI models is driving a surge in development of the physical infrastructure to support them. Microsoft, Amazon, Meta and Google parent Alphabet — four of the AI industry’s biggest players — plan to invest a combined $325 billion in capital expenditures in 2025, much of it for data centers and associated hardware. 

The technology giants’ spending plans mark a 46% increase from their combined 2024 capital investments, according to Yahoo! Finance, and track a doubling in proposed data centers’ average size from 150 megawatts to 300 megawatts between January 2023 and October 2024. In December, Meta said it would build a $10 billion, 4 million-square-foot data center in Louisiana that could consume as much power as two large nuclear reactors produce.

The planned spending announcements came after the January release of the highly efficient, open-source AI platform DeepSeek, which led some experts and investors to question just how much infrastructure — from high-performance chips to new power plants and electrical equipment — the AI boom actually requires. But experts interviewed by Facilities Dive see AI efficiency gains as a net positive for data center developers, operators and customers in the long run, despite shorter-term uncertainty that could temporarily create overcapacity in some markets.

“Every technology has gotten more efficient”

DeepSeek’s late January commercial launch took many Americans by surprise, but the rise of a more efficient and capable AI model follows a well-worn pattern of technology development, said K.R. Sridhar, founder, chairman and CEO of power solutions provider Bloom Energy.

“In 2010, it took about 1,500 watt-hours to facilitate the flow of one gigabyte of information,” Sridhar said. “Last year it took about one tenth of that … [but] the total growth of traffic, and therefore the chips and energy, has been twice that amount.” 
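To make the arithmetic behind Sridhar’s point concrete, the rough sketch below assumes a tenfold efficiency gain (about 1,500 watt-hours per gigabyte falling to roughly 150) offset by a roughly twentyfold rise in traffic; the traffic multiple is an illustrative assumption chosen to match his statement that total energy use nonetheless doubled, not a figure reported in the article.

```python
# Back-of-envelope sketch of the efficiency-vs-growth dynamic Sridhar describes.
# Per-gigabyte figures come from his quote; the 20x traffic multiple is an
# assumption chosen so the totals line up with "twice that amount."

wh_per_gb_2010 = 1_500                 # ~1,500 watt-hours per gigabyte in 2010
wh_per_gb_now = wh_per_gb_2010 / 10    # "about one tenth of that" today
traffic_multiple = 20                  # assumed growth in traffic since 2010

total_energy_2010 = wh_per_gb_2010 * 1                 # normalize 2010 traffic to 1 GB
total_energy_now = wh_per_gb_now * traffic_multiple    # same unit of 2010 traffic, 20x volume

print(f"Efficiency gain: {wh_per_gb_2010 / wh_per_gb_now:.0f}x per gigabyte")
print(f"Total energy vs. 2010: {total_energy_now / total_energy_2010:.1f}x")
# -> a 10x per-unit efficiency gain, yet total energy use still roughly doubles,
#    which is why efficiency alone has not curbed data center demand.
```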

As generative AI model development has grown more capital intensive, the benefits of efficiency have increased as well, Morningstar DBRS Senior Vice President Victor Leung wrote in a Jan. 28 research note.

“DeepSeek’s approach to constructing and training its model could be viewed as manifesting this drive towards more efficient models and to be expected as they and other companies seek to reduce AI deployment costs,” Leung said. Continued efficiency improvements could moderate demand for data center capacity over the longer term, Leung added.

As evidenced by the tech giants’ recent announcements, the biggest AI players are unlikely to pull back on planned investments on DeepSeek’s account, Ashish Nadkarni, group vice president and general manager for worldwide infrastructure research at IDC, told Facilities Dive. 

For companies like Meta and Microsoft, the drive to build as much AI capacity as quickly as possible is “existential,” with Apple the only major tech company to buck the trend so far, Nadkarni said. Those companies, and others like Elon Musk’s xAI, are using huge amounts of proprietary data to train differentiated foundational models useful in building emerging “agentic AI” tools and other future innovations, he added.

Less training, more inference and a cooling rethink?

Despite the scale of their planned and in-progress AI investments, the big tech companies “remain at the starting line,” with commercial payoff uncertain and likely still years away, Nadkarni said.

Imminent advances that allow more cost-effective AI deployments could “stabilize the AI industry in the long run” by reducing the risk of a capital spending pullback that could lead to “a burst investment bubble in AI-related sectors,” Leung wrote.

A reduction in the computing power needed to train new models could eventually tip the balance of new data center deployments toward smaller “inference” data centers, which serve user requests rather than model development and refinement, experts say. That would mean proportionally fewer developments on the scale of Meta’s two-gigawatt Louisiana facility or Compass Datacenters’ planned campus in suburban Chicago, which could eventually host five massive data centers.


