• Wildmimic@anarchist.nexus · 10 hours ago

    OP, this statement is bullshit. You can make about 5 million ChatGPT requests for the emissions of ONE flight.
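
    For anyone who wants to sanity-check that ratio, here’s a rough back-of-the-envelope sketch. The per-request energy, grid carbon intensity, and per-passenger flight emissions are my own ballpark assumptions, not measured values:

```python
# Rough sanity check: how many ChatGPT requests equal one flight's CO2?
# Every input is a ballpark assumption, not a measured value.
WH_PER_REQUEST = 3       # assumed upper-bound energy per request (Wh)
KG_CO2_PER_KWH = 0.4     # assumed average grid carbon intensity
FLIGHT_KG_CO2 = 1000     # assumed per-passenger CO2 for one long-haul flight (kg)

kg_per_request = (WH_PER_REQUEST / 1000) * KG_CO2_PER_KWH
print(f"{kg_per_request * 1000:.1f} g CO2 per request")
print(f"~{FLIGHT_KG_CO2 / kg_per_request:,.0f} requests per flight")
# -> ~1.2 g per request and roughly 800,000 requests per flight;
#    with the 0.3 Wh/request figure cited below it's closer to 8 million,
#    so "millions of requests per flight" is the right ballpark.
```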

    I’m gonna quote my old post:

    I had this discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the covid lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.
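
    If anyone wants to see how a number like that falls out, here’s a rough reconstruction. The request volume, per-request energy, grid intensity, and covid-year reduction are my own ballpark assumptions, not the original inputs:

```python
# Reconstructing the "0.0017% of the covid reduction" order of magnitude.
# All inputs are rough public ballparks, not official numbers.
REQUESTS_PER_DAY = 1e9   # assumed ChatGPT query volume
WH_PER_REQUEST = 0.3     # assumed optimized per-request energy (Wh)
KG_CO2_PER_KWH = 0.4     # assumed grid carbon intensity
COVID_DROP_T = 2.6e9     # approx. global CO2 drop in 2020 (tonnes)

yearly_kwh = REQUESTS_PER_DAY * 365 * WH_PER_REQUEST / 1000
yearly_t = yearly_kwh * KG_CO2_PER_KWH / 1000
print(f"~{yearly_t:,.0f} t CO2/yr")
print(f"~{yearly_t / COVID_DROP_T:.4%} of the covid-year reduction")
# -> ~44,000 t/yr and ~0.0017% with these assumptions
```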

    The exact energy costs are not published, but 3 Wh/request is the upper limit for ChatGPT-4 from what we know (and that’s in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost per request to 4 Wh. That’s nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - set against that massive usage, it’s clear that saving here is not an effective lever for anyone interested in reducing climate impact, or you’d have to start scolding everyone who runs their microwave 10 seconds too long.
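
    On the training point, here’s a minimal sketch of how amortizing a one-time training run changes the per-request figure; the training energy and lifetime request count are illustrative assumptions, not published numbers:

```python
# Amortizing a one-time training cost over every request the model serves.
# Training energy and lifetime request count are illustrative assumptions.
TRAINING_GWH = 50          # assumed one-time training energy (GWh)
LIFETIME_REQUESTS = 50e9   # assumed requests served over the model's lifetime
INFERENCE_WH = 3           # upper-bound per-request figure from the post (Wh)

training_wh = TRAINING_GWH * 1e9 / LIFETIME_REQUESTS   # 1 GWh = 1e9 Wh
print(f"training adds ~{training_wh:.1f} Wh/request")
print(f"total: ~{INFERENCE_WH + training_wh:.0f} Wh/request")
# -> roughly +1 Wh, i.e. ~4 Wh/request, in line with the figure above
```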

    Even compared to other online activities that use data centers, ChatGPT’s power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!
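
    A rough sketch of that streaming comparison; the per-hour streaming figure is a commonly cited ballpark and an assumption on my part:

```python
# One hour of video streaming vs. ChatGPT requests, very roughly.
STREAMING_WH_PER_HOUR = 80   # assumed data-centre + network energy per hour of streaming (Wh)
WH_PER_REQUEST = 3           # upper-bound per-request figure from above

print(f"1 h of streaming ~= {STREAMING_WH_PER_HOUR / WH_PER_REQUEST:.0f} requests")
# -> roughly 25-30 requests, or hundreds with the 0.3 Wh/request figure
```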

    Water is about the same story, although the positioning of data centers in the US sucks. The used water doesn’t disappear, though - it’s mostly returned to the rivers or evaporated. Total water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10 and 25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
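
    To put the water numbers side by side (the per-request and per-burger figures are the ones quoted above; only the unit conversions are mine):

```python
# Water comparison: cooling water per request vs. lifecycle water for a burger.
ML_PER_REQUEST = 25         # upper end of the quoted 10-25 ml/request range
GALLONS_PER_BURGER = 600    # quoted lifecycle water for one hamburger
US_YEARLY_GALLONS = 58e12   # quoted total US water use per year
L_PER_GALLON = 3.785

burger_ml = GALLONS_PER_BURGER * L_PER_GALLON * 1000
print(f"1 burger ~= {burger_ml / ML_PER_REQUEST:,.0f} requests' worth of cooling water")
print(f"US yearly use ~= {US_YEARLY_GALLONS * L_PER_GALLON / 1e12:.0f} trillion liters")
# -> ~90,000 requests per burger; ~220 trillion liters/yr, matching the figure above
```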

    Read up here!

    • brucethemoose@lemmy.world · 6 hours ago

      If you want to look at it another way: assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt, 24/7/365…

      Added up, it’s not that much power, or water.

      That’s unrealistic, of course, but that’s literally the physical cap of what humanity can produce at the moment.
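
      A sketch of how that ceiling estimate is structured; every number here is a placeholder assumption purely to show the shape of the calculation, not TSMC’s actual output:

```python
# "Physical cap" sketch: treat all leading-edge wafer output as AI accelerators.
# Every number is a placeholder assumption; only the structure of the estimate matters.
WAFERS_PER_YEAR = 1.5e6   # assumed leading-edge 300 mm wafer starts per year
DIES_PER_WAFER = 60       # assumed good dies per wafer for a large accelerator
WATTS_PER_DIE = 700       # assumed board power per accelerator at full tilt

cap_gw = WAFERS_PER_YEAR * DIES_PER_WAFER * WATTS_PER_DIE / 1e9
print(f"~{cap_gw:.0f} GW of new accelerators per year of wafer output")
# The point: the ceiling is set by wafer supply, not by how many prompts people type.
```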

    • Reygle@lemmy.world (OP) · 7 hours ago

      If you only include chatbots, your numbers look good. Sadly, reality isn’t limited to chatbots.

        • Reygle@lemmy.world (OP) · 3 hours ago

          Image/video generation and analysis (them scraping the entire public internet) consume far, far more than someone asking an AI “grok, is this true”.

          • lets_get_off_lemmy@reddthat.com · 35 minutes ago

            Do you have a source for this claim? I see this report from Google and this MIT Tech Review article, which say image/video generation does use a lot more energy than text generation.

            Taking the data from those articles, we get this table:

            | AI Activity | Source | Energy Use (per prompt) | Everyday Comparison |
            | --- | --- | --- | --- |
            | Median Gemini text prompt | Google report | 0.24 Wh | Less energy than watching a 100 W TV for 9 seconds. |
            | High-quality AI image | MIT article | ~1.22 Wh | Running a standard microwave for about 4 seconds. |
            | Complex AI text query | MIT article | ~1.86 Wh | Roughly equivalent to charging a pair of wireless earbuds for 2-3 minutes. |
            | Single AI video (5 sec) | MIT article | ~944 Wh (0.94 kWh) | Nearly the same energy as running a full, energy-efficient dishwasher cycle. |
            | “Daily AI habit” | MIT article | ~2,900 Wh (2.9 kWh) | A bit more than an average US refrigerator consumes in a full 24-hour period. |
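
            To double-check the comparison column, here’s a small conversion script; the appliance wattages are assumptions on my part:

```python
# Convert each per-prompt figure into seconds of a familiar appliance.
# Appliance wattages are assumptions (100 W TV, 1100 W microwave).
TV_W, MICROWAVE_W = 100, 1100
prompts_wh = {
    "Gemini text prompt": 0.24,
    "AI image": 1.22,
    "complex text query": 1.86,
    "5-sec AI video": 944,
}
for name, wh in prompts_wh.items():
    tv_s = wh * 3600 / TV_W
    mw_s = wh * 3600 / MICROWAVE_W
    print(f"{name}: {wh} Wh ~ {tv_s:.0f} s of TV or {mw_s:.1f} s of microwave")
# -> 0.24 Wh ~ 9 s of TV; 1.22 Wh ~ 4 s of microwave, matching the table
```
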
            • MangoCats@feddit.it · 47 seconds ago

              Another way of looking at this: a “Daily AI Habit” in your table is about the same as driving a Tesla 10 miles…
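
              For the curious, the arithmetic behind that, with the EV efficiency as my own assumption:

```python
# "Daily AI habit" expressed as miles of EV driving.
DAILY_HABIT_KWH = 2.9     # from the table above
EV_KWH_PER_MILE = 0.28    # assumed typical real-world Tesla consumption

print(f"~{DAILY_HABIT_KWH / EV_KWH_PER_MILE:.0f} miles of driving")
# -> roughly 10 miles, as stated
```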

      • brucethemoose@lemmy.world · 6 hours ago

        I’m not sure what you’re referencing. Imagegen models are not much different, especially now that they’re moving to transformers/MoE. Video gen models are chunky indeed, but more rarely used, and they usually have much smaller parameter counts.

        Basically anything else in machine learning uses an order of magnitude less energy, at least.