As digital transformation gains momentum across all sectors and power-hungry AI applications proliferate, demand for data services is rising rapidly worldwide.
According to the International Energy Agency, data centers account for about 1% of the world's electricity demand. McKinsey projects that data center power demand will reach 35 gigawatts by 2030, up from 17 gigawatts in 2022.
As Mark Garner, senior vice president of Secure Power Europe at Schneider Electric, explained, AI is emerging as a transformative force, changing the way data is processed, analyzed and used.
“The AI market is predicted to reach a staggering USD 407 billion by 2027, and the technology continues to revolutionize many industries, with an expected annual growth rate of 37.3% from 2023 to 2030,” he says.
“The AI market is poised for further growth thanks to the generative AI (Gen AI) boom. 97% of business owners say ChatGPT will help them streamline communications, generate website copy, translate information, and more. While we believe it will benefit organizations, its rapid adoption will undoubtedly require greater investment and infrastructure for AI-powered solutions than ever before.”
Meeting the demands of this new AI-powered world comes with challenges.
“Data centers serve as critical infrastructure that supports the AI ecosystem,” Garner said. “AI requires a lot of power, but AI-powered data analytics can move data centers closer to net zero and play an active role in addressing sustainability challenges.”
Here, Garner explores four key AI characteristics and trends that shape data center physical infrastructure challenges: power, racks, cooling, and software management.
How to deal with the rise of power-hungry AI applications
As Garner explains, power, cooling, racking, and physical infrastructure are core to a successful data center.
“Storing and processing data to train machine learning (ML) and large language models (LLMs) is steadily increasing energy consumption,” he says. “For example, researchers estimate that training GPT-3 consumed 1,287 megawatt-hours of electricity and produced 552 tons of CO2, equivalent to driving 123 gasoline-powered cars for one year. In addition, data centers are adopting higher-density racks that accommodate more servers in smaller spaces, further increasing power requirements.
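The equivalence quoted above is easy to sanity-check. The energy and emissions figures come from the article itself; the per-car figure is an assumption based on the EPA's widely cited estimate of roughly 4.6 metric tons of CO2 per typical passenger car per year:

```python
# Sanity-check of the GPT-3 training figures quoted above.
# Inputs from the article, except car_tons_per_year, which is an
# assumption based on the EPA's ~4.6 t CO2/car/year estimate.
energy_mwh = 1287         # estimated energy to train GPT-3
emissions_t = 552         # estimated CO2 emissions, metric tons
car_tons_per_year = 4.6   # assumed per-car annual emissions

# Implied grid carbon intensity in kg CO2 per kWh
grid_intensity = (emissions_t * 1000) / (energy_mwh * 1000)

# How many car-years of driving the emissions correspond to
cars_equivalent = emissions_t / car_tons_per_year

print(f"Implied grid intensity: {grid_intensity:.3f} kg CO2/kWh")
print(f"Car-years equivalent:   {cars_equivalent:.0f}")
```

The result lands near 120 car-years, consistent with the article's "123 cars" figure given rounding in the underlying per-car estimate.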
“So how do we meet the growing power demands of AI while minimizing the impact on the planet? Data centers are continually evolving to meet the growing power demands of AI clusters. Improving power distribution systems and energy efficiency within data centers ensures that power is delivered to servers with minimal losses. When designing and managing a data center, operators should focus on energy-efficient hardware and software while diversifying power sources to provide the secure and abundant power needed for AI growth.
“By adding advanced power distribution units (PDUs), intelligent management, and highly efficient power systems alongside renewable energy sources, data centers can reduce both energy costs and carbon emissions. However, the extreme rack power density of AI training servers can create additional issues beyond power consumption. For example, cooling can also pose complex challenges for operators.”
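The energy-efficiency improvements Garner describes are commonly tracked with Power Usage Effectiveness (PUE): total facility power divided by the power actually reaching IT equipment, where 1.0 is the theoretical ideal. A minimal sketch, using hypothetical facility numbers chosen purely for illustration:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (all power reaches IT gear);
    lower values are better. Cooling, power distribution losses,
    and lighting account for the overhead above 1.0.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw


# Hypothetical facility: 1,200 kW total draw, 1,000 kW reaching IT racks.
# The 200 kW overhead is cooling, distribution losses, etc.
print(f"PUE: {pue(1200, 1000):.2f}")
```

Reducing distribution losses or cooling overhead lowers the numerator without changing the IT load, which is exactly where more efficient PDUs and power systems pay off.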