• howrar@lemmy.ca · 16 hours ago

    I don’t see why they can’t be resold. As long as there’s a market for new AI hardware, there will continue to be a market for the older stuff. You don’t need the latest and greatest for development purposes, or for workloads that scale horizontally.
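
    As a rough illustration of the dev-use point: framework code doesn’t care whether the device is current-gen, so an older datacenter card works fine as a local development target. A minimal sketch, assuming PyTorch; the model and layer sizes are placeholders, not anything from this thread:

        # Minimal sketch: develop against whatever GPU is present, old or new.
        # The Linear layer is a stand-in for real work; device 0 is assumed.
        import torch

        device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
        if device.type == "cuda":
            print("developing on:", torch.cuda.get_device_name(0))

        model = torch.nn.Linear(512, 512).to(device)  # placeholder model
        x = torch.randn(8, 512, device=device)
        y = model(x)  # identical code path on a years-old V100 or a new B200
        print(y.shape)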

    • mojofrododojo@lemmy.world · 16 hours ago

      I didn’t say they couldn’t be resold; they simply won’t have as wide a potential market as a generic GPU would. But think about it for a sec: you’ve got thousands of AI-dedicated GPUs going stale whenever a datacenter gets overhauled or goes bust.

      that’s gonna put a lot more product on the market that other datacenters aren’t going to touch - no one puts used hardware in their racks - so who’s gonna gobble up all this stuff?

      not the gamers. who else needs this kind of stuff?

      • Passerby6497@lemmy.world · 3 hours ago (edited)

        no one important puts used hardware in their racks

        FTFY. Just about every MSP I’ve worked for has cut corners and gone with 2nd-hand (or possibly grey-market) hardware to save a buck, including the ones who colo in “real” data centers. I would not be surprised to find that we’re onboarding these kinds of cards to build a bespoke AI platform for our software customers here in a few years.

      • addie@feddit.uk · 11 hours ago

        I’m not sure they’re even going to be useful for gamers. Datacenter GPUs require a substantial external cooling solution to stop them from just melting. I believe Nvidia’s new stuff is liquid-cooled only, so even with an HVAC unit next to your l33t gaming PC, that won’t be sufficient.

        • mojofrododojo@lemmy.world · 9 hours ago

          not just those constraints - good luck getting a fucking video signal out of ’em when they literally don’t have HDMI/DP or any other display connectors.
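
          For what it’s worth, you can see this from software: NVML (the library behind nvidia-smi, exposed in Python via the pynvml module) reports whether a physical display is attached, and compute-only datacenter parts report none. A small sketch, assuming an NVIDIA driver and the nvidia-ml-py package are installed:

              import pynvml  # pip install nvidia-ml-py

              pynvml.nvmlInit()
              for i in range(pynvml.nvmlDeviceGetCount()):
                  handle = pynvml.nvmlDeviceGetHandleByIndex(i)
                  name = pynvml.nvmlDeviceGetName(handle)
                  if isinstance(name, bytes):  # older pynvml returns bytes
                      name = name.decode()
                  try:
                      # NVML_FEATURE_ENABLED (1) if a physical display is attached;
                      # headless datacenter cards report 0 or don't support the query
                      mode = pynvml.nvmlDeviceGetDisplayMode(handle)
                  except pynvml.NVMLError:
                      mode = "unsupported"
                  print(f"{name}: display attached = {mode}")
              pynvml.nvmlShutdown()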

      • jkercher@programming.dev · 13 hours ago

        Also depends on how hard the AI workloads run them. A good chunk of the graphics cards that were used for mining came out on life support, if not completely toasted. Games generally don’t run the piss out of them like that 24/7, and many games are still CPU-bound.
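
        If you were actually vetting a second-hand card, NVML exposes the counters you’d want to eyeball first. A hedged sketch, again assuming the nvidia-ml-py package; what counts as a red flag is a judgment call, not something the API decides for you:

            # Quick health read-out before trusting a used card. Assumes an
            # NVIDIA driver + pynvml; device 0 and the checks shown are illustrative.
            import pynvml

            pynvml.nvmlInit()
            handle = pynvml.nvmlDeviceGetHandleByIndex(0)

            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            print(f"temp={temp}C  gpu_util={util.gpu}%  mem_util={util.memory}%")

            # Lifetime uncorrected ECC errors are a red flag on used datacenter
            # parts (consumer cards usually lack ECC and raise NVMLError here).
            try:
                ecc = pynvml.nvmlDeviceGetTotalEccErrors(
                    handle,
                    pynvml.NVML_MEMORY_ERROR_TYPE_UNCORRECTED,
                    pynvml.NVML_AGGREGATE_ECC,
                )
                print(f"lifetime uncorrected ECC errors: {ecc}")
            except pynvml.NVMLError:
                print("ECC counters not supported on this card")

            pynvml.nvmlShutdown()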