December 27, 2024

A large corridor with supercomputers in the server room data center.

Studio Luza | E+ | Getty Images

The boom in artificial intelligence is driving a shift in the way data centers operate, with European developers under pressure to lower water temperatures in their energy-hungry facilities to accommodate the higher-powered chips of companies such as tech giant Nvidia.

Artificial intelligence is expected to drive a 160% increase in data center power demand by 2030, according to Goldman Sachs research, and that rise in energy consumption could undermine Europe’s decarbonization goals, as the specialized chips used by artificial intelligence companies push up energy use in the data centers where they are deployed.

High-performance chips, known as graphics processing units, or GPUs, are critical for training and deploying large language models, a type of artificial intelligence. These GPUs pack computing power densely and give off more heat, which ultimately requires colder water to keep the chips reliably cool.

Andrey Korolenko, chief product and infrastructure officer at Nebius, said artificial intelligence workloads can draw 120 kilowatts of power in just one square meter of data center space, equivalent to the power consumption and heat output of about 15 to 25 houses.

“It’s very intensive, and from a cooling perspective, you need different solutions,” he said.
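As a rough sanity check on that comparison, the back-of-envelope arithmetic below works out the per-house draw the equivalence implies; the 120-kilowatt and 15-to-25-house figures are Korolenko’s, while the division itself is only illustrative.

```python
# Back-of-envelope arithmetic behind Korolenko's comparison. The 120 kW and
# 15-25 house figures come from the article; the implied per-house draw is
# simply the quotient and is illustrative, not a measured value.
rack_power_kw = 120                 # power drawn per square meter of AI data center floor
houses_low, houses_high = 15, 25    # Korolenko's equivalence range

per_house_high = rack_power_kw / houses_low    # 8.0 kW per house
per_house_low = rack_power_kw / houses_high    # 4.8 kW per house
print(f"Implied draw per house: {per_house_low:.1f}-{per_house_high:.1f} kW")
```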

The problem we face with chipmakers is that AI is now a space race dominated by the U.S. market, where land rights, energy access and sustainability are relatively low on the priority list, and market dominance is key.

Michael Winterson

EUDCA Chairman

Michael Winterson, chairman of the European Data Centre Association (EUDCA), warned that lower water temperatures would eventually “fundamentally return us to the unsustainable conditions of 25 years ago.”

“The problem we face with chipmakers is that artificial intelligence is now a space race dominated by the U.S. market, where land rights, energy access and sustainability are relatively low on the priority list, and market dominance is key,” Winterson told CNBC.

U.S. chip designers are asking major European suppliers to lower water temperatures to accommodate their hotter artificial intelligence chips, said Herbert Radlinger, managing director of NDC-GARBE.

“This was shocking news because initially everyone on the engineering side wanted to run higher temperatures with liquid cooling,” he told CNBC, referring to liquid cooling, a technology said to be more effective than more traditional methods.

“A discussion about evolution”

Energy efficiency is high on the European Commission’s agenda, with a target of reducing energy consumption by 11.7% by 2030; in some countries that figure is two or three times higher.

Winterson said lowering water temperatures was “fundamentally incompatible” with the EU’s recent energy efficiency directive, which creates a dedicated database for data centers of a certain size to publicly report their power consumption. EUDCA has been lobbying Brussels to consider these sustainability issues.

Energy management company Schneider Electric is in regular contact with the EU on the topic. Steven Carlini, a vice president at Schneider Electric who leads its work on artificial intelligence and data centers, said much of the recent discussion has focused on different ways of securing “primary power” for artificial intelligence data centers, on how to integrate them with utilities, and on the potential for more cooperation between companies.

European Commission energy officials have also held talks with Nvidia about energy consumption and data center usage, including power usage effectiveness and chip efficiency.

CNBC has reached out to Nvidia and the European Commission for comment.


“Cooling is the second largest energy consumer in data centers after the IT load,” Carlini told CNBC in emailed comments. “Energy consumption will increase, though PUE (power usage effectiveness) may not, as water temperatures are lowered and the chillers have to work harder.”
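For context, PUE is the ratio of a facility’s total energy use to the energy consumed by the IT equipment itself. The minimal sketch below uses made-up numbers, not figures from Schneider Electric or the article, to show how absolute cooling energy can climb while that ratio holds roughly steady if the IT load grows even faster.

```python
# A minimal sketch of the PUE metric, using made-up numbers for illustration.
# PUE (power usage effectiveness) = total facility power / IT equipment power,
# so 1.0 would be the theoretical ideal with zero cooling or other overhead.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Ratio of total facility power to the power drawn by the IT load."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Conventional hall: 1,000 kW of IT load, 350 kW of cooling, 100 kW other overhead.
print(f"{pue(1_000, 350, 100):.2f}")    # 1.45

# Denser AI hall: the IT load quadruples and the chillers work harder on colder
# water, roughly tripling cooling energy, yet the ratio actually edges down.
print(f"{pue(4_000, 1_100, 300):.2f}")  # 1.35
```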

Schneider Electric customers deploying Nvidia’s Blackwell GB200 superchips require water temperatures of 20 to 24 degrees Celsius, or 68 to 75 degrees Fahrenheit, Carlini said.

That compares with about 32 degrees Celsius for existing liquid cooling, or the roughly 30 degrees Celsius water supply temperature Meta recommends for its hardware, he added.

Ferhan Gunen, vice president of U.K. data center operations at Equinix, told CNBC that Equinix has been discussing artificial intelligence with its customers.

“They want to increase server density, that is, they want to have higher-power chips, or they want more servers,” she said, adding that the shift is not “clear-cut.”

“This is really a discussion about evolution,” Gunen said.

Nvidia declined to comment on the cooling requirements of its chips. When it announced the new Blackwell GPU platform earlier this year, the company said the architecture would enable organizations to run real-time generative artificial intelligence on large language models at up to 25 times lower cost and energy consumption than earlier technology.

Liquid cooling requires “reconfiguration,” Gunen explained, adding that new data centers are already set up for the technology. “Yes, higher density means more power consumption, which means more cooling needs. But technology is changing, so you do things differently. That’s why it all needs to be balanced,” she said.


Efficiency race

Nebius, which has about $2 billion in cash on its balance sheet following its split from Russia’s Yandex, says it will be among the first to make the Nvidia Blackwell platform available to customers in 2025. It has also announced plans to invest more than $1 billion in artificial intelligence infrastructure in Europe by the middle of next year.

Nebius’ Korolenko said liquid cooling is the “first step” and that the cost of ownership will be worse initially and then improve over time.

“There’s a big push for delivery, but at the same time, when you scale up, you want to have the ability to choose, be economical without sacrificing too much. Power efficiency is important for running costs. It’s always a high priority,” Korolenko said.

Even before the surge in demand for artificial intelligence applications, Europe’s data center industry was struggling to keep pace with the broader growth of the digital industry.

Sicco Boomsma, managing director in ING’s TMT team, said market players are “very sensitive to electricity,” and that European operators are focused on infrastructure, while U.S. operators are more focused on expanding their assets in Europe where electricity is available.

“There are also a large number of data center operators from the United States who are working together to ensure that their data center infrastructure meets various EU goals, such as carbon neutrality, energy efficiency, water use and maintaining biodiversity.”

“It’s a race to prove that their knowledge is leading to super-efficient infrastructure,” he said.
