• Phoenixz@lemmy.ca · 2 days ago

    Of course it should be covered in solar panels but so should most roofs everywhere but this single roof would be less than a drop in the bucket.

    A square meter solar panel gives you about 100 watts while the sun is at its highest point, and only when aimed directly at the sun. Averaged over the entire day, you'll typically get a fraction of that.

    Meanwhile, these servers use multiple CPUs that each draw around 200 watts. A single server can draw between 1 and 5 kilowatts, and a single rack can carry dozens of those servers, so you'd need way, waaaayyy more solar panels to make up for all of that.

    Again, not saying they shouldn’t. All buildings should have solar panel roofs, but for this one building it won’t do much; the difference would be a blip.
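A quick back-of-envelope sketch of the estimate above. The peak W/m² and 1–5 kW per server are the commenter's figures; the daily-average fraction and servers-per-rack count are my own assumptions, not measurements:

```python
# Rough check of the "way more panels than the roof" claim.
PANEL_PEAK_W_PER_M2 = 100   # ~100 W/m^2 at solar noon (commenter's figure)
DAILY_AVG_FRACTION = 0.2    # "a fraction of that" averaged over 24 h (assumed)
SERVER_W = 3_000            # midpoint of the 1-5 kW range
SERVERS_PER_RACK = 24       # "dozens" (assumed)

rack_w = SERVER_W * SERVERS_PER_RACK                      # 72 kW, continuous
avg_w_per_m2 = PANEL_PEAK_W_PER_M2 * DAILY_AVG_FRACTION   # 20 W/m^2 average
m2_per_rack = rack_w / avg_w_per_m2

print(f"~{m2_per_rack:,.0f} m^2 of panels to offset one rack")  # ~3,600 m^2
```

Even with generous assumptions, a single rack needs thousands of square meters of panels, which is the point being made.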

    • PagPag@lemmy.world · 1 day ago

      When’s the last time you looked into this?

      I just went fully off grid and I have a relatively large house and workshop.

      The panels I used, which are great but aren’t the absolute best on the market, come out to about 231 W per square meter.

      I have a 39 kW system installed just for my house. It’s overkill, yeah, but I plan for the future (telling the regional power monopoly to go fuck themselves for the next 30 years).

      Covering one of these centers with solar would absolutely make a huge impact. Not only by providing power during the day but also with keeping the building cooler.

      For reference, the panels I have (65 of them), coupled with a 100 kWh battery bank:

      https://www.runergy.com/wp-content/uploads/download/DH156N8-30F.pdf
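For what it's worth, the system numbers above are internally consistent. A quick sketch (panel area here is derived from the quoted 231 W/m², not read off the linked datasheet):

```python
SYSTEM_W = 39_000       # 39 kW array
NUM_PANELS = 65
QUOTED_W_PER_M2 = 231   # commenter's figure

w_per_panel = SYSTEM_W / NUM_PANELS              # 600 W per panel
area_per_panel = w_per_panel / QUOTED_W_PER_M2   # ~2.6 m^2 each
total_area = area_per_panel * NUM_PANELS         # ~169 m^2 of roof

print(f"{w_per_panel:.0f} W/panel, ~{total_area:.0f} m^2 total")
```

600 W per panel is right in line with current residential modules, so the 231 W/m² figure is plausible.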

      • RisingSwell@lemmy.dbzer0.com · 1 day ago

        Even at over double the other guy’s estimate of power per area, it isn’t even touching the requirements of major data centres. What it takes to run a normal house is tiny; they likely have servers that individually draw more power than my entire household, and they have hundreds if not thousands of those servers.

        Do it anyway, because solar is the closest thing to free power we have, but it isn’t gonna cover the building.

        • PagPag@lemmy.world · 1 day ago

          Well, of course. Which is why I mentioned it making a significant impact, not a full offset. Full offset wouldn’t be feasible without the solar build being as large a scope as the data center construction itself, not even considering storage requirements.

          The unfortunate likelihood is extremely high that these projections (currently taking shape) were well understood, and accepted, at the time.

          It’s a win-win if you’re the owner of the server farm who had closed door discussions with the power company beforehand. I mean the citizens don’t win, but when has this ever been a concern?

          If it was in their best interests financially, it would be included in the financial model before construction. My guess is that it was more appealing to just cut deals with various players.

    • Tja@programming.dev · 1 day ago

      A square meter of solar gives you over 200 watts for many hours of the day in realistic conditions in Europe/Canada, more in the US or tropical countries.

    • boonhet@sopuli.xyz · edited · 2 days ago

      You’re right about the general idea, but I think you’re even underestimating the scale here.

      I don’t think these servers will be doing much on CPU; they’ll be on GPUs. HPE will sell you a 48-rack-unit behemoth with 36 Blackwell GB200s, for a total of 72 GPUs and 36 CPUs. The CPUs are actually negligible here, but each of the 36 units uses a total of 2,700 watts (a single GPU itself is supposedly 1,200 watts, so that would make the CPU 300 watts?).

      36 * 2.7 = 97.2 kilowatts per rack. Put just a hundred of these in a data center and you’re talking over 10 megawatts once cooling and everything is factored in. At 100 W/m², that’s what, 100,000 m^2 of solar panels for 100 racks?

      You’d want them to be running most of the time too; idle hardware is just a depreciating asset. Say they run 75% of the time: 0.75 * 10 MW * 24 * 365 = 65,700 MWh, which I won’t even convert to gigawatt hours, to keep this simple. The average American household uses about 11 MWh of electrical energy per year, so a single AI-focused data center without even all that many racks uses as much power as ~6,000 households. They’re building them all over the country, and in reality I think they’re actually way bigger than what I described. It’s putting a significant dent in the power grid, to the point AI companies should be required to commission nuclear power plants before being allowed to build their data centers.
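The arithmetic above checks out. A sketch using the same round numbers (the 10 MW with cooling and the 11 MWh/household figure are the commenter's, not measured values):

```python
RACK_KW = 36 * 2.7            # 97.2 kW per 48U rack of GB200 units
DC_MW = 10                    # ~100 racks plus cooling, rounded
UTILIZATION = 0.75            # assumed duty cycle
HOUSEHOLD_MWH_PER_YEAR = 11   # rough US average

annual_mwh = UTILIZATION * DC_MW * 24 * 365   # 65,700 MWh per year
households = annual_mwh / HOUSEHOLD_MWH_PER_YEAR

print(f"{RACK_KW:.1f} kW/rack, {annual_mwh:,.0f} MWh/yr "
      f"~= {households:,.0f} households")
```

At the parent comment's 100 W/m² peak figure, offsetting even the 10 MW instantaneous draw would take on the order of 100,000 m² of panels, far more than one roof.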