Taming the Power Beast: AI Efficiency vs. Data Center Growth
- Published | 20 March 2024
The ever-growing appetite of Artificial Intelligence (AI) for data processing is driving a boom in data centers, raising concerns about their surging energy consumption and environmental impact.
Nearly all the world’s internet traffic travels through data centers. These are large, increasingly “hyperscale” buildings that house computing machines and related equipment. If we want to understand how increased computing demand will impact electricity demand, data centers seem like an important place to start.
Increasingly sophisticated AI services offered by hyperscale, public-cloud data center providers mean that power requirements are likely to rocket in the coming years. While existing hyperscale data centers typically need 10–14 kW per rack, this is likely to rise to 40–60 kW for AI-ready racks equipped with resource-hungry GPUs. As a result, overall data center power consumption across the US is likely to reach 35 GW by 2030, up from 17 GW in 2022.
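As a rough sanity check on what that rack-density jump means at the facility level, the back-of-envelope sketch below compares a conventional layout against AI-ready racks. The rack count is a hypothetical figure; the per-rack values are midpoints of the ranges quoted above:

```python
# Back-of-envelope facility power comparison. The rack count is
# hypothetical; per-rack figures are midpoints of the quoted ranges.

RACKS = 500                     # assumed facility size

conventional_kw_per_rack = 12   # midpoint of the 10-14 kW range
ai_ready_kw_per_rack = 50       # midpoint of the 40-60 kW range

conventional_mw = RACKS * conventional_kw_per_rack / 1000
ai_ready_mw = RACKS * ai_ready_kw_per_rack / 1000

print(f"Conventional facility: {conventional_mw:.1f} MW")
print(f"AI-ready facility:     {ai_ready_mw:.1f} MW")
print(f"Increase:              {ai_ready_mw / conventional_mw:.1f}x")
```

At these midpoints, the same floor space draws roughly four times as much power once the racks are filled with GPUs.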
Role of AI and ML in the Evolution of Data Centers
AI and ML adoption requires larger data centers with more computing power, storage, GPUs, power, and cooling than traditional facilities. In Northern Virginia, the largest data center market in the world at 3,400 MW, availability is running at just 0.2 percent. Upcoming developments in the state include a 72 MW campus in Leesburg being built by Stack Infrastructure; announced in December 2023, it is due to start coming online in the first half of 2025. Availability in the Bay Area around San Francisco is 0.5 percent, while in Dallas-Fort Worth it is 1.9 percent and in Phoenix, Arizona, 3.8 percent. The year to the end of September 2023 saw data center sales worth $1.2bn recorded in the US, down 46 percent year-on-year.
“Fundamentally, supporting accelerating AI/ML adoption requires more power and cooling than much of the existing data center inventory can accommodate,” the report said. “Not all existing data centers lend themselves to retrofitting, catalyzing demand for new product in both existing and emerging markets.”
Huge, popular models like ChatGPT signal a trend of large-scale AI, boosting some forecasts that predict data centers could draw up to 21% of the world's electricity supply by 2030.
The Lincoln Laboratory Supercomputing Center is developing techniques to help data centers rein in energy use. These range from simple but effective changes, like power-capping hardware, to novel tools that can stop AI training early. Crucially, the center has found that these techniques have minimal impact on model performance. In the wider picture, its work is mobilizing green-computing research and promoting a culture of transparency.
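Power-capping is easy to experiment with on NVIDIA hardware. The minimal sketch below uses the NVML Python bindings (`pip install nvidia-ml-py`) to cap the first GPU; the 250 W value is an arbitrary example, not a figure from the Lincoln Laboratory work, and setting limits typically requires root privileges:

```python
# Minimal sketch of GPU power-capping via NVIDIA's NVML bindings.
# The 250 W cap is an example value; setting limits usually needs root.
import pynvml

CAP_WATTS = 250

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    # NVML reports power limits in milliwatts.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    cap_mw = max(min_mw, min(CAP_WATTS * 1000, max_mw))  # clamp to allowed range

    pynvml.nvmlDeviceSetPowerManagementLimit(handle, cap_mw)
    print(f"Power limit set to {cap_mw / 1000:.0f} W "
          f"(allowed range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```

The same cap can be applied from the command line with `nvidia-smi -pl <watts>`, which is often the simpler route for a fleet-wide rollout.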
Training an AI model, the process by which it learns patterns from huge datasets, requires graphics processing units (GPUs), which are power-hungry hardware. As one example, the GPUs that trained GPT-3 (the precursor to ChatGPT) are estimated to have consumed 1,300 megawatt-hours of electricity, roughly equal to that used by 1,450 average U.S. households per month.
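That household comparison is straightforward arithmetic. The sketch below reproduces it, assuming an average U.S. household uses roughly 900 kWh per month (an assumed figure, not one stated in the estimate itself):

```python
# Reproducing the household comparison above. The per-household
# figure of ~900 kWh/month is an assumed U.S. average.

training_mwh = 1300               # estimated GPT-3 training energy
household_kwh_per_month = 900     # assumed average U.S. household use

households = training_mwh * 1000 / household_kwh_per_month
print(f"~{households:.0f} households' monthly electricity use")
# prints ~1444, matching the ~1,450 figure quoted above
```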
Training is just one part of an AI model's emissions. The largest contributor to emissions over time is model inference, or the process of running the model live, like when a user chats with ChatGPT. To respond quickly, these models use redundant hardware, running all the time, waiting for a user to ask a question.
The Northeastern University team created an optimizer that matches a model with the most carbon-efficient mix of hardware, such as high-power GPUs for the computationally intense parts of inference and low-power central processing units (CPUs) for the less-demanding aspects. Using this optimizer can decrease energy use by 10% to 20% while still meeting the same "quality-of-service target" (how quickly the model can respond).
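The optimizer's internals aren't published here, but the core idea can be sketched as a small search: for each stage of inference, pick the hardware option that minimizes energy while the end-to-end latency stays within a quality-of-service budget. Every stage name, device option, and number below is hypothetical:

```python
# Toy energy-aware hardware selector. For each inference stage, choose
# the option that minimizes energy while total latency stays within a
# quality-of-service (QoS) budget. All numbers are hypothetical.

# per option: (energy in joules/request, latency in milliseconds)
STAGE_OPTIONS = {
    "tokenize":    {"cpu": (0.5, 4),    "gpu": (2.0, 1)},
    "transformer": {"cpu": (40.0, 600), "gpu": (12.0, 35)},
    "postprocess": {"cpu": (0.3, 3),    "gpu": (1.5, 1)},
}
QOS_BUDGET_MS = 50

def pick_hardware(stages, budget_ms):
    # Greedy: start from the lowest-energy choice per stage, then keep
    # upgrading whichever stage buys the most latency per extra joule
    # until the QoS budget is met.
    choice = {s: min(opts, key=lambda h: opts[h][0]) for s, opts in stages.items()}

    def latency():
        return sum(stages[s][choice[s]][1] for s in stages)

    while latency() > budget_ms:
        candidates = []
        for s, opts in stages.items():
            cur_e, cur_l = opts[choice[s]]
            for h, (e, l) in opts.items():
                if l < cur_l:  # only strictly faster options
                    candidates.append(((cur_l - l) / max(e - cur_e, 1e-9), s, h))
        if not candidates:
            raise RuntimeError("QoS budget unreachable")
        _, s, h = max(candidates)
        choice[s] = h
    return choice

plan = pick_hardware(STAGE_OPTIONS, QOS_BUDGET_MS)
energy = sum(STAGE_OPTIONS[s][h][0] for s, h in plan.items())
print(plan, f"energy/request: {energy:.1f} J")
```

With these toy numbers the result matches the intuition in the paragraph above: the GPU handles the compute-heavy transformer stage, while the light tokenize and postprocess stages stay on low-power CPUs.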
This tool is especially helpful for cloud customers, who lease systems from data centers and must select hardware from among thousands of options. AI servers being installed in data centers are often equipped with multiple GPUs, usually supplied by Nvidia. Each GPU can consume up to about 400 watts of power, so one AI server can draw around 2 kilowatts, whereas a regular cloud server uses 300 to 500 watts. By one industry estimate, the power consumption of a cluster built on Nvidia's new GH200 servers is roughly two to four times that of a regular cluster of the same physical size.
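Those figures imply a rough per-server multiplier, worked through below; the GPU count and non-GPU overhead are assumptions for illustration:

```python
# Rough AI-server vs. regular-server power comparison using the
# figures above. GPU count and non-GPU overhead are assumptions.

gpus_per_server = 4         # assumed GPU count per AI server
watts_per_gpu = 400         # "up to about 400 watts" per GPU
overhead_watts = 400        # assumed draw of CPUs, memory, NICs, fans

ai_server_w = gpus_per_server * watts_per_gpu + overhead_watts
regular_server_w = 400      # midpoint of the 300-500 W range

print(f"AI server: ~{ai_server_w} W")    # ~2 kW, as quoted above
print(f"Regular server: ~{regular_server_w} W")
print(f"Ratio: ~{ai_server_w / regular_server_w:.0f}x")
```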
The US hosts a third of the world's 8,000 data centers and their energy consumption is growing significantly, due in no small part to the increased demands of AI. Tech companies like Microsoft, Alphabet, and Amazon are under increasing pressure to play a more active role in generating renewable energy and working on energy efficiency measures to keep their data centers running.
Move Towards Sustainability
The move towards sustainability is already happening. Microsoft is buying renewable energy and taking other steps toward its goal of being carbon negative by 2030, and it further aims to power all facilities with 100% renewable energy by 2025. AWS's cloud division is also shifting the backup power generators at its European data centers from diesel to hydrotreated vegetable oil (HVO). HVO requires no modification to the generators and remains stable in cold winter temperatures, so it can be adopted across regions without operational changes. With AI drawing attention to the way computing happens globally, it's a good time to consider the energy usage of data centers, and how to bring them into the cloud-based future.
Immersion Cooling Systems for AI Data Centers
AI can automatically control data center equipment to improve consumption efficiency and reduce power expenditure. It can also manage the data center's power balance, combating cooling and performance degradation by placing workloads in the most cost-effective energy zones. Moreover, AI-based security setups are able to analyze potential security threats and prevent malicious attacks.
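A heavily simplified version of that workload-placement logic: given candidate energy zones with a carbon intensity and spare capacity, assign each job to the cleanest zone that still has headroom. All zone names and figures below are hypothetical:

```python
# Toy carbon-aware workload placer: assign each job to the zone with
# the lowest carbon intensity that still has spare capacity.
# Zone names and all numbers are hypothetical.

zones = {
    # zone: (carbon intensity in gCO2/kWh, spare capacity in kW)
    "zone-a": (120, 30),
    "zone-b": (300, 80),
    "zone-c": (450, 200),
}

jobs = [("training", 25), ("inference", 10), ("batch-etl", 40)]  # (name, kW)

def place(jobs, zones):
    capacity = {z: cap for z, (_, cap) in zones.items()}
    placement = {}
    for name, kw in sorted(jobs, key=lambda j: -j[1]):  # largest first
        # among zones with headroom, pick the lowest-carbon one
        fits = [z for z in zones if capacity[z] >= kw]
        if not fits:
            raise RuntimeError(f"no zone can host {name}")
        best = min(fits, key=lambda z: zones[z][0])
        placement[name] = best
        capacity[best] -= kw
    return placement

print(place(jobs, zones))
```

Production systems layer in live electricity prices, grid carbon signals, and cooling telemetry, but the greedy shape of the decision is the same.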
Data centers have taken a lot of heat because of their increased appetite for power consumption and negative environmental profile. It’s a cause of concern for data center operators, but one that immersion cooling can address on many levels.
GRC’s patented immersion cooling system can easily handle the demands of high-performance, high-density deployments. It also delivers vastly superior cooling capabilities with greatly reduced power consumption compared with air cooling systems.
GRC’s solution allows you to put more computing in the rack and save valuable space in your data center. Moreover, GRC’s single-phase immersion cooling system has a simplified design that helps you eliminate the operational and maintenance costs of complex components, such as chillers, air handlers, and humidity control systems.
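The savings case is often framed in terms of Power Usage Effectiveness (PUE), the ratio of total facility power to IT power. The quick comparison below uses assumed PUE values of roughly 1.5 for traditional air cooling and 1.05 for immersion cooling; these are illustrative figures, not numbers GRC publishes here:

```python
# PUE comparison: facility power = IT power * PUE. Both PUE values
# are assumptions for illustration (air ~1.5, immersion ~1.05).

it_load_kw = 1000           # hypothetical IT load

air_cooled_pue = 1.5
immersion_pue = 1.05

air_total = it_load_kw * air_cooled_pue
imm_total = it_load_kw * immersion_pue

print(f"Air-cooled facility draw: {air_total:.0f} kW")
print(f"Immersion facility draw:  {imm_total:.0f} kW")
print(f"Cooling/overhead savings: {air_total - imm_total:.0f} kW "
      f"({(air_total - imm_total) / air_total:.0%} of total)")
```

Under these assumptions, the same 1 MW of IT load draws 450 kW less at the facility meter, which is where the reduced operational cost of eliminating chillers and air handlers shows up.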
Conclusion
Almost every industry is now looking for new AI functionality that can streamline processes and improve results. In this new digital landscape, data centers are uniquely positioned to both provide and benefit from AI applications.
Training and delivering AI requires enormous amounts of computing power and data storage. Both future and traditional data centers will provide these functionalities as the backbone of a tech-driven world. But this increased demand also requires that data centers themselves use new technologies—such as AI systems—to provide a more effective, secure, and efficient service.
Contact Us:
BlueWeave Consulting & Research Pvt. Ltd.
+1 866 658 6826 | +1 425 320 4776 | +44 1865 60 0662