the shocking truth behind AI’s energy crisis — and what’s being done to solve it!

Artificial Intelligence is everywhere now, bringing remarkable advancements and innovative solutions across industries. But there’s a catch: AI’s rapid growth demands unprecedented levels of energy, placing a massive burden on data centers worldwide. Governments and local communities are starting to worry too: as more businesses integrate AI applications, the energy needed to power and cool these systems is set to skyrocket, and addressing those needs is becoming urgent. Let’s explore the predictions, how this impacts companies, and possible solutions for the future. Without further ado, let’s dive in!

power-hungry AI

AI workloads are extremely resource-intensive, particularly as large language models (LLMs) like the one behind ChatGPT grow larger and more complex. For example, OpenAI’s ChatGPT is estimated to consume roughly 1 gigawatt-hour (GWh) of electricity per day, equivalent to the energy needed to power 33,000 US homes for a day, and a single query consumes on average 15 times more energy than a regular Google search. And that’s just one LLM; demand for energy in data centers is expected to soar as more AI applications come online. By 2030, AI workloads are projected to account for nearly 70% of data center capacity according to McKinsey, further amplifying the strain on global energy grids. With predictions that data centers will account for 15-25% of all net new European power demand by 2030, finding sustainable solutions is an urgent matter.

Source: McKinsey, “AI power: Expanding data center capacity to meet growing demand”
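Those headline numbers are easy to sanity-check. Here’s a quick back-of-envelope sketch in Python (the ~1 GWh/day and ~30 kWh per home per day figures are the rough estimates quoted above, not measured values):

```python
# Back-of-envelope check: does ~1 GWh/day really equal ~33,000 US homes?
# Assumption: an average US household uses roughly 30 kWh per day.
CHATGPT_DAILY_KWH = 1e6   # 1 GWh expressed in kWh
HOME_DAILY_KWH = 30       # rough US average, kWh per home per day

homes = CHATGPT_DAILY_KWH / HOME_DAILY_KWH
print(f"~{homes:,.0f} homes")  # ~33,333, consistent with the figure above
```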

Data centers not only consume large amounts of power for computation but also require substantial cooling to maintain optimal temperatures for hardware. Cooling can account for up to 40% of a data center’s total energy use, and AI’s intensive tasks only exacerbate this issue. Companies are exploring more efficient cooling methods, such as liquid cooling, which uses less energy than traditional air cooling, and are deploying AI to optimize energy consumption and improve energy efficiency.
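A useful way to reason about cooling overhead is Power Usage Effectiveness (PUE), the industry’s standard ratio of total facility energy to IT energy. Here’s a minimal sketch; the numbers are illustrative assumptions based on the 40% cooling share cited above, not measurements from any real facility:

```python
# Power Usage Effectiveness: total facility energy / IT equipment energy.
# An ideal facility scores 1.0; cooling overhead pushes the ratio higher.
def pue(it_kwh: float, cooling_kwh: float, other_kwh: float = 0.0) -> float:
    return (it_kwh + cooling_kwh + other_kwh) / it_kwh

# Illustrative split with cooling at 40% of total energy use:
print(round(pue(it_kwh=55, cooling_kwh=40, other_kwh=5), 2))  # 1.82
# If liquid cooling cut the cooling load to 10% of the old total:
print(round(pue(it_kwh=55, cooling_kwh=10, other_kwh=5), 2))  # 1.27
```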

The rapid rise in energy consumption is throwing a wrench into the climate goals that major tech companies had set before the AI boom. In 2020, Google committed to running entirely on carbon-free energy around the clock by 2030, while Microsoft pledged to become carbon-negative within the same timeframe.

However, Microsoft has since faced setbacks: it reported last year that its greenhouse gas emissions had risen by roughly 30% since 2020, driven largely by its ambitious push into AI. As Microsoft President Brad Smith put it, “In 2020, we unveiled what we called our carbon moonshot. That was before the explosion in artificial intelligence. So, in many ways, the moon is five times as far away as it was in 2020 if you think of our forecast for the expansion of AI and its electrical needs.”

Countries are also stepping in by promoting policies encouraging the use of green energy in data centers. In some regions, such as Northern Europe, data centers are increasingly powered by wind and solar energy. Governments are also incentivizing the use of energy-efficient technologies through tax breaks and subsidies for companies that prioritize sustainability in their data center operations.

energy-efficient solutions for the future

Tech companies are actively exploring solutions to improve efficiency and reduce their carbon footprints. AI proponents believe their technology could help tackle climate change by addressing inefficiencies. For instance, Google reported in 2016 that its DeepMind AI reduced its data center cooling energy usage by up to 40%.

Additionally, NVIDIA has introduced new GPUs that it says are 25 times more energy-efficient than the previous generation for LLM workloads.
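Taken together, compute-side and cooling-side gains compound for a fixed workload. A rough sketch of the arithmetic (the energy split and improvement factors below are illustrative assumptions, not vendor data):

```python
# How compute and cooling gains combine for the same fixed workload.
# Illustrative assumptions: 60% of energy goes to compute, 40% to cooling;
# new GPUs cut compute energy 25x, smarter cooling trims cooling energy 40%.
compute_share, cooling_share = 0.60, 0.40
gpu_gain, cooling_cut = 25.0, 0.40

new_energy = compute_share / gpu_gain + cooling_share * (1 - cooling_cut)
print(f"Same workload now needs {new_energy:.0%} of the baseline energy")  # 26%
```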

Another solution being explored is small modular reactors (SMRs), which range from roughly one-hundredth to one-third of the capacity of their traditional counterparts. They offer a clean and modular way to power data centers: “It’s not just building one large reactor. You can put multiple units on one site to get to scale. Think of them like Lego blocks,” said Brian Gitt, head of business development at Oklo, a startup building SMRs. Overall, the IAEA estimates that about 80 commercial SMR designs are in development worldwide.
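The “Lego blocks” idea translates into simple capacity math. As a sketch (both the module size and the campus demand are illustrative assumptions; real SMR designs range from tens to a few hundred megawatts):

```python
import math

# Illustrative sizing: how many SMR modules could power a data center campus?
# Assumptions: a 1 GW campus and 77 MW of electrical output per module.
CAMPUS_DEMAND_MW = 1_000
MODULE_OUTPUT_MW = 77

modules = math.ceil(CAMPUS_DEMAND_MW / MODULE_OUTPUT_MW)
print(f"{modules} modules")  # 13 modules under these assumptions
```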

However, some experts caution that these gains might be offset by the Jevons Paradox, where increased efficiency leads to greater overall consumption of a resource. Industry leaders are also pushing for renewable energy solutions, but the transition is slow, as many new data centers still rely on fossil fuels. The intermittency of renewable sources poses yet another challenge for computing loads that need steady, round-the-clock power.
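The Jevons Paradox is easy to put into numbers: if each unit of AI work gets cheaper, total usage can grow faster than efficiency improves. A toy illustration (both growth factors are made up to show the mechanism, not forecasts):

```python
# Jevons Paradox in one line: total energy = workload x energy per task.
# Illustrative assumptions: 5x efficiency gain, but cheaper AI drives 8x usage.
energy_per_task = 1.0 / 5.0   # each task now needs a fifth of the energy
workload_growth = 8.0         # demand response to cheaper, faster AI

print(f"Total energy: {workload_growth * energy_per_task:.1f}x baseline")  # 1.6x
```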

To tackle these issues, regulations are being considered globally, with Singapore and the European Union moving towards sustainability standards for data centers. In the U.S., legislative efforts are also underway, such as a bill introduced by Senator Ed Markey to study AI’s environmental impact and a House hearing on AI’s energy usage earlier this year. This means companies will have to adapt and find sustainable solutions sooner rather than later.

conclusion

As AI continues to evolve, so too will its demand for computing power and energy. For data centers and MSPs, this presents a unique opportunity to innovate and find efficient, sustainable ways to handle growing workloads. While AI’s computational and energy needs can seem overwhelming, the integration of circular IT solutions, innovative scaling practices, and renewable energy sources can help alleviate some of this pressure. By proactively addressing these challenges now, businesses can not only meet AI’s future demands but also position themselves as leaders in sustainability and innovation. In this rapidly changing ecosystem, adapting is not just a choice; it’s a necessity. Feel free to contact us for any advice or needs regarding your infrastructure; we are always happy to help!

stay tuned! 

Enjoying this article? Follow expert insights, industry trends, and more in our quarterly newsletter!
