Thanks to the artificial intelligence boom, companies are building new data centers at a rapid pace. That translates into enormous demand for the power needed to run and cool the servers inside. There are now growing concerns about whether the U.S. can generate enough electricity for widespread adoption of AI, and whether its aging grid can handle the load.
“If we don’t start thinking about this power problem differently now, we’re never going to realize this dream,” said Dipti Vachani, head of Arm’s automotive business. The chip company’s low-power processors have become increasingly popular with hyperscalers such as Google, Microsoft, Oracle and Amazon, precisely because they can reduce electricity use in data centers by up to 15%.
Nvidia’s latest AI chip, Grace Blackwell, incorporates an Arm-based CPU, which the company says can run generative AI models on 25 times less power than the previous generation.
“Saving every last bit of power is a fundamentally different design than trying to maximize performance,” Vachani said.
This strategy of reducing power consumption by improving compute efficiency, often referred to as “more work per watt,” is one answer to the AI energy crisis. But it isn’t enough on its own.
A ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging a smartphone.
This problem is not new. A 2019 estimate found that training one large language model produces as much CO2 as five gas-powered cars over their lifetimes.
The hyperscalers building data centers to accommodate this massive power draw are also seeing their emissions spike. Google’s latest environmental report said the company’s greenhouse gas emissions rose nearly 50% from 2019 to 2023, in part because of data center energy consumption, although it also said its data centers are 1.8 times as energy efficient as a typical data center. Microsoft’s emissions rose nearly 30% from 2020 to 2024, also due in part to data centers.
In Kansas City, where Meta is building an AI-focused data center, power demand is so high that plans to close a coal-fired power plant have been put on hold.
On July 8, 2024, hundreds of Ethernet cables connected server racks at the Vantage data center in Santa Clara, California.
Katie Tarasoff
Chasing power
There are more than 8,000 data centers worldwide, with the highest concentration in the United States, and thanks to artificial intelligence, that number will be significantly higher by the end of the decade. Boston Consulting Group estimates that demand for data centers will grow 15%-20% annually through 2030, when they are expected to account for 16% of total U.S. electricity consumption. That is up from 2.5% before the release of OpenAI’s ChatGPT in 2022, and is equivalent to the power used by about two-thirds of U.S. homes.
CNBC visited a data center in Silicon Valley to learn how the industry is coping with this rapid growth and where enough power can be found to make it happen.
“We suspect that the demand we’ll see from AI-specific applications will be as much as or more than we’ve seen historically from cloud computing,” said Jeff Tench, executive vice president of North America and Asia Pacific at Vantage Data Centers.
Many large technology companies contract with firms like Vantage to house their servers. Vantage’s data centers typically use upward of 64 megawatts of power, the equivalent of tens of thousands of homes, Tench said.
“Many of these are occupied by a single customer who leases the entire space. When we think about AI applications, those numbers can grow significantly, into the hundreds of megawatts,” Tench said.
CNBC visited Vantage in Santa Clara, California, long one of the country’s hot spots for data center clusters because of its proximity to customers with heavy data needs. Nvidia’s headquarters is visible from the rooftop. Tench said growth in Northern California is “slowing” because of a “lack of power from the utilities in the area.”
Vantage is building new campuses in Ohio, Texas and Georgia.
“The industry itself is looking for places that have either immediate access to renewables (wind or solar) and other infrastructure that can be leveraged, whether it’s part of an incentive program to convert a coal-fired plant to natural gas or, increasingly, ways to take power from nuclear facilities,” Tench said.
Vantage Data Centers is expanding a campus outside Phoenix, Arizona, to provide 176 megawatts of capacity.
Vantage Data Centers
Strengthen the grid
Even if enough power can be generated, the aging grid is often unable to handle the load. The bottleneck comes in moving electricity from where it is generated to where it is consumed. One solution is to add hundreds or thousands of miles of transmission lines.
“It’s very expensive and time-consuming, and sometimes the cost is passed on to residents because of rising utility bills,” said Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside.
A $5.2 billion effort to expand transmission lines to an area of Virginia known as “Data Center Alley” has met opposition from local ratepayers who don’t want to see their bills rise to fund the project.
Another solution is to use predictive software to reduce failures in one of the weakest links in the grid: transformers.
“All the electricity generated has to go through a transformer,” said VIE Technologies CEO Rahul Chaturvedi, adding that there are 60 million to 80 million of them in the United States.
Transformers are also an average of 38 years old, making them a common cause of power outages, and replacing them is expensive and slow. VIE makes a small sensor that attaches to transformers to predict failures and determine which ones can handle more load, so that load can be shifted away from transformers at risk of failing.
Chaturvedi said business volume has tripled since the launch of ChatGPT in 2022 and is expected to double or triple again next year.
VIE Technologies CEO Rahul Chaturvedi holds up a sensor in San Diego on June 25, 2024. VIE installs these units on aging transformers to help predict and reduce grid failures.
VIE Technologies
Cooling servers
Generative AI data centers are projected to need 4.2 billion to 6.6 billion cubic meters of water withdrawal by 2027 to keep cool, according to Ren’s research. That is more than the total annual water withdrawal of half of the U.K.
“Everyone is worried about AI being energy intensive. That’s solvable once we stop being such idiots about nuclear, right? Water is the fundamental limiting factor for what’s coming with AI,” said Tom Ferguson, managing partner at Burnt Island Ventures.
Ren’s research team found that every 10 to 50 ChatGPT prompts use roughly the contents of a standard 16-ounce water bottle.
Most of the water is used for evaporative cooling, but Vantage’s Santa Clara data center has large air conditioning units that cool the building without pumping water.
Another solution is using liquid to cool chips directly.
“For many data centers, that requires a lot of retrofitting. In our case at Vantage, about six years ago we deployed a design that allows us to tap into cold-water circulation on the data hall floor,” said Vantage’s Tench.
Companies like Apple, Samsung and Qualcomm have touted the benefits of on-device artificial intelligence, which keeps power-hungry queries off the cloud and out of power-strapped data centers.
“We’re going to have as much artificial intelligence as these data centers can support. It may be less than people aspire to. But ultimately, there are a lot of people working to find ways to ease some of these supply constraints,” Tench said.