Archived copies of the article:
Big article, but a great read! Some key excerpts:
This isn't simply the norm of a digital world. It's unique to AI, and a marked departure from Big Tech's electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers. Given the direction AI is headed (more personalized, able to reason and solve complex problems on our behalf, and everywhere we look), it's likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.
Let's say you're running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You'd use about 2.9 kilowatt-hours of electricity: enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.
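For anyone who wants to sanity-check those comparisons, here is a rough sketch. The per-device efficiency figures are my own assumptions (typical e-bike and EV consumption, a mid-size microwave), not numbers from the article:

```python
# Sanity-check of the article's 2.9 kWh comparisons.
# All per-device figures below are assumed, not taken from the article.
total_kwh = 2.9               # article's estimate for the whole fundraiser scenario

ebike_wh_per_mile = 29        # assumed: typical e-bike consumption
ev_wh_per_mile = 290          # assumed: average electric vehicle consumption
microwave_kw = 0.8            # assumed: mid-size microwave power draw

ebike_miles = total_kwh * 1000 / ebike_wh_per_mile   # -> 100.0 miles
ev_miles = total_kwh * 1000 / ev_wh_per_mile         # -> 10.0 miles
microwave_hours = total_kwh / microwave_kw           # -> 3.625 hours

print(f"{ebike_miles:.0f} e-bike miles, {ev_miles:.0f} EV miles, "
      f"{microwave_hours:.2f} microwave-hours")
```

With those assumed efficiencies, the "over 100 miles on an e-bike", "around 10 miles in an EV", and "over three and a half hours of microwave" comparisons all line up with 2.9 kWh.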
One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query, an estimate that, as discussed, makes lots of assumptions that can't be verified. Still, they calculated about 0.3 watt-hours, or 1,080 joules, per message. This falls in between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).
One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it'd mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI's other products, like video generators, and those of all the other AI companies and startups.
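The billion-queries-a-day arithmetic checks out if you assume an average US household uses roughly 10,500 kWh per year (my assumption; the article gives only the result):

```python
# Back-of-envelope check of the "109 GWh = 10,400 homes" claim.
wh_per_query = 0.3            # Epoch AI's per-message estimate, quoted above
queries_per_day = 1e9
days_per_year = 365

annual_gwh = wh_per_query * queries_per_day * days_per_year / 1e9  # Wh -> GWh
home_kwh_per_year = 10_500    # assumed: average US household consumption

homes_powered = annual_gwh * 1e6 / home_kwh_per_year  # GWh -> kWh, then per-home

print(f"{annual_gwh:.1f} GWh/year, ~{homes_powered:,.0f} homes")
```

That comes out to about 109.5 GWh/year and roughly 10,400 homes, matching the figures in the excerpt.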
But here's the problem: These estimates don't capture the near future of how we'll use AI. In that future, we won't simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing us toward a world where AI "agents" perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called "reasoning models" that work through tasks logically but have been found to require 43 times more energy for simple problems, or "deep research" models that spend hours creating reports for us. We will have AI models that are "personalized" by training on our data and preferences.
By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That's more than all electricity currently used by US data centers for all purposes; it's enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles, over 1,600 round trips to the sun from Earth.
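The sun comparison is easy to verify, assuming a mean Earth-sun distance of about 93 million miles (my figure; the article states only the result):

```python
# Checking the "over 1,600 round trips to the sun" comparison.
miles_driven = 300e9            # article's emissions-equivalent driving distance
earth_sun_miles = 93e6          # assumed: mean Earth-sun distance

round_trips = miles_driven / (2 * earth_sun_miles)
print(round(round_trips))       # roughly 1,600 round trips
```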
the most useful takeaways are the current statistics; predictions that AI will keep spreading because of demand are largely unfounded at this point. but even at current hype-bubble usage it's an absurd amount of energy.
Quite the read. There are lots of unknowns with any technological development throughout history, and as the article points out, we don't yet even know where we are on the energy demand curve from AI.
Something that confuses me is that geothermal is mentioned only once. These companies have the money to site datacenters near EGS plants or even build their own, grid connection optional, and pay upfront capex instead of power bills for the life of the center.
This would admittedly require long-term thinking, which shareholders are completely uninterested in when infrastructure investments ding their dividends and buybacks.