One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

  • Flying Squid@lemmy.world

    And nobody seems to give a shit. Even people who would normally give a shit about this sort of thing. Even people who do things like denounce Bitcoin mining’s waste of energy (and I agree) are not talking about the energy and water waste from AI systems.

    That article says that OpenAI uses 6% of Des Moines’ water.

    Meanwhile-

    According to Colorado State University research, nearly half of the 204 freshwater basins they studied in the United States may not be able to meet the monthly water demand by 2071.

    https://abcnews.go.com/US/parts-america-water-crisis/story?id=98484121

    And nobody seems to give a shit.

    • Nudding@lemmy.world

      Lots of people give a shit; they’re just not in any sort of position to do anything about it.

      We won’t treat climate change seriously until we get a significant climate related mass casualty event in North America.

        • Flying Squid@lemmy.world

          Giving a shit about the horse barn after someone’s already let out all the horses doesn’t really make a difference.

        • tal@lemmy.today

          gulf streams collapse

          Nah, there were some people worried about it, but it won’t happen.

          https://en.wikipedia.org/wiki/Gulf_Stream

          The possibility of a Gulf Stream collapse has been covered by some news publications. The IPCC Sixth Assessment Report addressed this issue specifically, and found that based on model projections and theoretical understanding, the Gulf Stream will not shut down in a warming climate. While the Gulf Stream is expected to slow down as the Atlantic Meridional Overturning Circulation (AMOC) weakens, it will not collapse, even if the AMOC were to collapse. Nevertheless, this slowing down will have significant effects, including a rise in sea level along the North American coast, reduced precipitation in the midlatitudes, changing patterns of strong precipitation around Europe and the tropics, and stronger storms in the North Atlantic.

            • FaceDeer@kbin.social

              Hey now. Rule one of this community is “Be civil - Attack the argument, not the person.” What does any of this genocide stuff have to do with the argument being discussed here?

              • Flying Squid@lemmy.world

                The discussion here? Nothing. The person trying to have a discussion with me? Everything. Because yesterday, he said that I support genocide both due to the fact that I will do anything to keep Trump out of office to save my queer daughter’s life and due to the fact that I exist. And now he wants to chat with me as if he never said that.

                No one did anything about it in this community when he said those things. And, believe it or not, the topic of discussion was not genocide.

                Why should I let someone who was that hateful towards me act like nothing was ever said a day later?

                • agent_flounder@lemmy.world

                  I’m with you on this 100% and not just because my own queer daughter’s life is on the line.

                  If we are gonna play the ‘be civil’ card, play it on everyone. Certainly don’t overlook people accusing others of genocide. You’d think that didn’t need to be said but shrug

                • FaceDeer@kbin.social

                  Because it’s against the sub’s rules to attack him personally.

                  Why not block him? You’ll never see him again.

              • agent_flounder@lemmy.world

                If you think the decision on who to vote for in the 2024 US Presidential election is morally black and white, and that the genocide of Palestinians is by far the most important issue, then we lack a common basis for any worthwhile discussion.

                Everything I have heard from you and others who bring up the same talking points suggests that is precisely the case.

    • bleistift2@feddit.de

      I guess it depends on how you use chatbots. If you’re just too lazy to click on the first Google result you get, it’s wasteful to bother ChatGPT with your question. On the other hand, for complex topics, a single answer may save you quite a lot of googling and following links.

      • Flying Squid@lemmy.world

        Oh, well as long as it saves you from Googling it’s okay that it’s a massive ecological disaster. My mistake.

        • FaceDeer@kbin.social

          That’s the opposite of what he said. That sort of usage isn’t what ChatGPT is good for; it’s best to use it for other kinds of things.

            • FaceDeer@kbin.social

              Feel free not to, I guess. But again, that wasn’t the point of my comment. You took bleistift2’s statement the opposite way it was intended. ChatGPT’s not intended as a replacement for a search engine, so evaluating it on that basis is misleading.

            • MagicShel@programming.dev

              That’s just like… your opinion, man.

              AI is going to be an important tool in the future. Decrying it as bad is similar to folks saying investing in green energy was stupid because, without economies of scale, it was expensive and inefficient.

              Computers are using more energy. Instead of turning them off, let’s find ways to produce energy less destructively, such as nuclear which would benefit EVs and all energy usage.

        • brbposting@sh.itjust.works

          I mean an argument could be made here, right? Just thinking theoretically.

          Maxim: we want to be as eco-friendly as possible.

          Per a given task, understand the least environmentally-taxing way to accomplish the goal.

          Task requires one, two, or three/four DuckDuckGo searches? DDG away.

          Task requires five DDG searches, OR one LLM query? Language model it is.

          (LLM may well rarely be the answer there, of course, just laying out the theory!)
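
          Just to put rough numbers on that maxim, here’s a minimal Python sketch. The Wh-per-query figures are placeholder assumptions, not measurements; the only thing taken from the excerpt at the top is the rough 4-5x ratio between a generative-AI query and a conventional search.

          ```python
          # Sketch of the "pick the least energy-hungry tool per task" maxim.
          # WH_PER_SEARCH is an assumed placeholder; only the ~4-5x ratio between
          # a conventional search and a generative-AI query comes from the excerpt.
          WH_PER_SEARCH = 0.3                      # assumed Wh per conventional search
          WH_PER_LLM_QUERY = 4.5 * WH_PER_SEARCH   # ~4-5x a search, per the excerpt

          def cheaper_tool(searches_needed: int, llm_queries_needed: int = 1) -> str:
              """Return whichever option is estimated to use less energy for the task."""
              search_cost = searches_needed * WH_PER_SEARCH
              llm_cost = llm_queries_needed * WH_PER_LLM_QUERY
              return "search engine" if search_cost <= llm_cost else "LLM"

          print(cheaper_tool(searches_needed=3))   # -> search engine
          print(cheaper_tool(searches_needed=5))   # -> LLM
          ```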

    • cm0002@lemmy.world

      Bitcoin was wasteful with little benefit, but AI has the potential to benefit humanity at large. Maybe ChatGPT itself isn’t a great example of that, but their research has gone on to spur lots of advancements in AI, advancements that have allowed AI to make all sorts of breakthroughs in areas like medicine.

      • MrMcGasion@lemmy.world

        Yeah, but LLMs like ChatGPT and the like aren’t where that advancement is being made. LLMs are driving investment in the technology, but they’re mostly a useless investor target that just happens to run on the same hardware that can be used for useful AI-powered research. Sure, it’s pushing the hardware advancement forward maybe 10-15 years faster than it might have otherwise happened, but it’s coming with a lot of wasteful baggage as well, because LLMs are the golden boy investors want to throw money at.

      • agent_flounder@lemmy.world

        True, the benefit actually exists here (how much is open for debate).

        On the other hand, we should be doing full alarm bells and running around in a panic ramping down every use of energy possible before we leave our 100 surviving progeny a lifeless rock to live on. But humans don’t work that way. By the time we are all on board it will be 100 years too late, unfortunately.

    • Flumpkin@slrpnk.net

      Why the heck does it use so much water? It sounds like a very inefficient and stupid design to not have a closed loop.

    • masterspace@lemmy.ca

      Honest question, why is AI bad but TVs aren’t? What’s the environmental cost of millions of people watching Netflix? Using Instagram? Playing video games? Using search engines?

      If you wanna get mad at people using computers for their environmental costs why are you starting with AI?

      Bitcoin gave legitimate reason for environmental concern: the algorithm was literally based on proof of wasting energy, and that would scale up over time. AI is not like that.

        • masterspace@lemmy.ca

          TCL, Sony, Vizio, LG, Samsung, literally all of them easily do in the course of manufacturing them, not to mention the ongoing water usage of all the servers streaming your TV shows.

          Again, how is AI different from literally any other popular computer activity? The more popular it is, the greater its environmental cost.

          • Flying Squid@lemmy.world

            Really? Which specific city?

            Or do you not understand that taking 6% of one specific city’s water is very different from taking that same amount of water distributed around the world?

            Also, should AI not be criticized for wasting water? Just TVs? Are there other industries where wasting large amounts of water should be ignored?

            Maybe any company using up 6% or more of a city’s municipal water system shouldn’t be allowed to do so regardless of what industry they’re in. What do you think?

              • Flying Squid@lemmy.world

                Ah, got it, companies can do as much ecological damage as they want to and it’s the regulators’ fault if nothing is done about it. Also, people shouldn’t get mad at corporations for wanting to do that ecological damage just because they’re allowed to.

                • masterspace@lemmy.ca

                  Why aren’t you mad at video games? Are you protesting Nintendo and Sony? Their consoles consume far more power than ChatGPT.

            • masterspace@lemmy.ca

              Which specific city is ChatGPT getting its water from?

              Here’s a hint: there isn’t one; that’s referring to its overall usage, all around the world. It runs in Azure data centers where it is a tiny fraction of their overall compute load and water usage.

  • Dr. Dabbles@lemmy.world

    It’s no secret; people just don’t care. Manufacturers publish power and cooling data on spec sheets, but because people are easily wowed by pure garbage masquerading as breakthroughs and “future”, they simply ignore the costs and push ahead. Add in the fact that most “AI” startups are actual scams, and you’ve got a corporate incentive to pretend this isn’t doing permanent damage too.

  • cm0002@lemmy.world

    Within years, large AI systems are likely to need as much energy as entire nations.

    That doesn’t sound like they’re taking future hardware optimizations into account; we won’t be using GPUs for this purpose forever (as much as Nvidia would like that to be true lol).

    • FaceDeer@kbin.social

      Not to mention that increasing usage of AI means AI is producing more useful work in the process, too.

      The people running these AIs are paying for the electricity they’re using. If the AI isn’t doing enough work to make it worth that expense they wouldn’t be running them. If the general goal is “reduce electricity usage” then there’s no need to target AI, or any other specific use for that matter. Just make electricity in general cost more, and usage will go down. It’s basic market forces.

      I suspect that most people raging about AIs wouldn’t want their energy bill to shoot up, though. They want everyone else to pay for their preferences.

      • El Barto@lemmy.world

        Not that you don’t have a point, but there’s this theory, paradox or law or something (the name escapes me at the moment), which says that when technology advances, so do requirements. So what’s going to happen is that when hardware is 100x more efficient, the fucking corporations will use 100x more, and nothing gets solved on the pollution front.

        I am betting on renewable energy as the best way to combat the environmental issues we’re facing.

        Also, “making electricity cost more” doesn’t sound like basic market forces.

    • Dr. Dabbles@lemmy.world

      Any power saved by hardware design improvements will be consumed by adding more transistors. You will not be seeing a power consumption decrease. Manufacturers of this hardware have been giving talks for the past two years calling for literal power plants to be built co-resident with datacenters.

    • andrew_bidlaw@sh.itjust.works

      I’m not sure future optimization wouldn’t bring more demand. At least, that’s what my hardware and apps have shown over a couple of decades. If another startup had the ability to train with an additional billion or trillion parameters, I’m sure they would. It also leaves a wider window for poor optimization.

    • itsJoelle@lemmy.world

      That was my thought too. I heard a take that we may shift away from GPUs to purpose-built processing units as a way to continue making progress now that we’re getting pretty small on the silicon scale. Neural-net accelerators may be one of these special “PUs” we see.

        • Jose A Lerma@lemmy.world

          How much power does the brain consume? In bagels. How many bagels does ChatGPT consume?

          The great thing about math is that it’s interchangeable.

          https://rpsc.energy.gov/energy-data-facts#collapse-accordion-25-3

          The average U.S. household used about 77 million British thermal units (Btu) in 2015

          https://www.inchcalculator.com/convert/british-thermal-unit-to-kilocalorie/

          The energy in kilocalories is equal to the energy in british thermal units multiplied by 0.252164.

          33,000 households x 77,000,000 Btu/household x 0.252164 kcal/Btu = 640,748,724,000 kcal

          https://www.webmd.com/diet/health-benefits-bagels

          One plain medium-sized bagel –  about 100 grams – has about 264 calories

          https://www.healthline.com/nutrition/kcal-vs-calories

          Instead, the terms calories — capitalized or not — and kcal are used interchangeably and refer to the same amount of energy

          640,748,724,000 kcal / 264 kcal/bagel = 2,427,078,500 bagels

          Your homework is finding out how much energy the brain consumes in bagels
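
          The whole conversion chain fits in a few lines of Python if anyone wants to double-check it (same figures as the links above; this is just a sanity check of the arithmetic, not a new estimate):

          ```python
          # Re-running the bagel conversion above (figures from the linked sources).
          households = 33_000              # homes ChatGPT reportedly matches
          btu_per_household = 77_000_000   # average US household energy use, 2015 (Btu)
          kcal_per_btu = 0.252164          # Btu -> kcal
          kcal_per_bagel = 264             # one plain medium bagel

          total_kcal = households * btu_per_household * kcal_per_btu
          bagels = total_kcal / kcal_per_bagel

          print(f"{total_kcal:,.0f} kcal")   # ~640,748,724,000 kcal
          print(f"{bagels:,.0f} bagels")     # ~2,427,078,500 bagels
          ```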

          • El Barto@lemmy.world

            I appreciate your answer, but I already asked that question. I’ll try to answer it myself, but let’s just say that you did the easy part.

            Also, your answer doesn’t have a unit of time. Is that what ChatGPT consumes per hour, per minute, per week?

            • Jose A Lerma@lemmy.world

              The original article doesn’t specify a unit of time:

              Most experts agree that nuclear fusion won’t contribute significantly to the crucial goal of decarbonizing by mid-century to combat the climate crisis. Helion’s most optimistic estimate is that by 2029 it will produce enough energy to power 40,000 average US households; one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes.

              Based on context clues, it’s probably consumption per year

              • El Barto@lemmy.world

                Ok, I’ll do it my way (though yours was interesting!):

                This article says that an average U.S. house consumes about 30 KWh per day. Let’s round it down to 24 KWh, so we can say that 1 household consumes 1 KW per hour.

                According to this article, 1 Joule = 0.238902957619 kcal. And 1 Watt = 1 Joule (per second.) So, 1 KW = 1,000 Joules per second, or about 239 kcals per second. You said that 1 bagel is about 264 kcals, so to simplify things, let’s round it down to 239 kcals (and yes, food calories are really kcals - go figure), so 1 KW = 1 bagel.

                So, 1 household consumes 1 bagel per second, or 3,600 bagels per hour. In my opinion, that sounds excessive, so maybe my math is not the best. But let’s assume I did everything correctly.

                So, ChatGPT consumes the equivalent of the energy consumed by 33,000 homes. So, ChatGPT consumes 3,600 bagels times 33,000 = 118,800,000 bagels per hour. That’s almost 119 million bagels per hour!

                You came up with 2.4 billion bagels, but we don’t know if that’s per hour, per day or what. Let’s divide both numbers and see if that gives us a clue: 2.4 billion divided by 119 million is roughly 20, which is close-ish to 24. So chances are, your calculations are bagels per day.

                Again, that’s a lot of bagels!!!

                Edit: As for the brain, this site says that the brain consumes 20 watts, or 0.0056 KW per hour. We established that 1 KW = 1 bagel, so 0.0056 KW is, well, 0.0056 bagels. If we multiply that by 3,600, we get 20.16. So the brain consumes 20 bagels per hour. That can’t be right. I wish I could eat as many as 20 bagels per hour just to power my brain - I’d be a happy man!

                But anyway, 20 bagels per hour is definitely a lot less than 119 million bagels per hour.

                Oh well, I did my best.

                • Jose A Lerma@lemmy.world

                  That’s some good diligence!

                  It looks like the ecoflow values are lower:

                  https://www.inchcalculator.com/convert/kilowatt-to-btu-per-hour/

                  Since one kilowatt is equal to 3,412.14245 btu per hour

                  30 KWh/day x 365 days x 3,412 Btu/KWh = 37,361,400 Btu

                  Which is half the value I found for 2015. Does ecoflow have more current data and houses are twice as efficient? Maybe. They’re also trying to sell something, so maybe it’s based on data from their products. They don’t mention where they got it from.

                  The welovecycling conversion is off by 1000 (maybe the kilocalorie threw them off?)

                  https://www.inchcalculator.com/convert/joule-to-kilocalorie/

                  Since one kilocalorie is equal to 4,184 joules

                  1 kcal = 4,184 J so 1 J = 1/4,184 kcal = 0.00023900573613 kcal

                  Otherwise, your math was right, just off by 3 zeros, so a household is more like 3.6 bagels per hour.

                  The nist site also doesn’t specify a unit of time, but if it is 20 watts/hour (Wh) we’d only need to move it 3 places for KWh, or 0.020 KWh.

                  Too many conversions can introduce errors, so we can go from KWh to kcal directly:

                  https://www.inchcalculator.com/convert/kilowatt-to-kilocalorie-per-hour/

                  Since one kilowatt is equal to 860.420815 kilocalories per hour

                  0.020 kW x 860 kcal/h per kW = 17.2 kcal/h

                  Which, yeah, is not much of a bagel per hour. Keep in mind that the daily recommended calories for an average adult is 2000 kcal.

                  All in all, this was a fun thought experiment, so thanks for looking into it further!
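
                  Pulling the corrected factors into one place, a rough Python sketch of the comparison (it assumes the 30 kWh/day household figure and the 264 kcal bagel from earlier, so the outputs are only as good as those inputs):

                  ```python
                  # Corrected bagels-per-hour comparison using the factors above.
                  KCAL_PER_KWH = 860.420815   # 1 kWh = 860.42 kcal
                  KCAL_PER_BAGEL = 264        # one plain medium bagel

                  def bagels_per_hour(avg_power_kw: float) -> float:
                      """Express a continuous power draw as bagels of food energy per hour."""
                      return avg_power_kw * KCAL_PER_KWH / KCAL_PER_BAGEL

                  print(f"brain:     {bagels_per_hour(0.020):.2f} bagels/hour")             # 20 W -> ~0.07
                  print(f"household: {bagels_per_hour(30 / 24):.1f} bagels/hour")           # ~4.1
                  print(f"ChatGPT:   {bagels_per_hour(33_000 * 30 / 24):,.0f} bagels/hour") # ~134,000
                  ```

                  The household number lands a bit above the 3.6/hour mentioned above because it uses 30 kWh/day and 264 kcal per bagel rather than the rounded 24 kWh/day and 239 kcal.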

  • geissi@feddit.de

    Tbf, talking about the environmental costs of generative AI is just framing.
    The issue is the environmental cost of electricity, no matter what it is used for.
    If we want this to be considered in consumption then it needs to be part of the electricity price. And of course all other power sources, like combustion motors, need to also price in external costs.

    • 🔰Hurling⚜️Durling🔱@lemmy.world

      It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI; the AI server has higher power requirements (the margin may not be as wide as in my first comparison, but there is one). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.

      • geissi@feddit.de

        An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI

        And if external costs are priced into the cost of electricity then that will be reflected in the cost of operating these devices.
        Also, there are far more data servers than servers running AI, which increases the total effect they have.

  • FaceDeer@kbin.social

    It’s consuming the energy equivalent of 33,000 homes, okay. Is it doing work equivalent to 33,000 people or more? Seems likely to me.

    • kromem@lemmy.world

      Exactly my thought too.

      For so long human progress has been limited by population size.

      A large part of the reason we leapt so far over the past century was the huge increase in population size, which allowed for greater subspecialization.

      But that population growth is unsustainable if not already well past the practical limit.

      If we can successfully get to a point where we have exponential gains in productivity unseen in human history while also decoupling that progress from the massive resources it would require from more humans, we might be able to outpace the collective debts we’ve racked up as a species without obsessive focus on procreation (like Musk) as necessary to burden the next generation with our fuck ups.

      33,000 homes is way less than I’d have thought given the degree to which it is being used. And the promise of future hardware revisions, like photonics, means we’ll be looking at exponential decreases in energy consumption while also seeing increases in processing power.