• Thorry84@feddit.nl · 17 hours ago

    There is another factor in this which often gets overlooked. A LOT of the money invested right now is for the Nvidia chips and products based around them. As many gamers are painfully aware, these chips devalue very quickly. With the progress of technology moving so fast, what was once a top-of-the-line unit gets outclassed by mid-tier hardware within a couple of years. After 5 years its usefulness is severely diminished, and after 10 years it is hardly worth the energy to run them.

    This means the window for return on investment is a lot shorter than usual in tech. For example, when creating a software service, there would be an upfront investment for buying the startup that created the software. Then some scaling investment in infrastructure and such. But after that it turns into a steady state where the input of money is a lot lower than the revenue from the customer base that was grown. This allows returns to keep coming in for many years after that initial investment and growth phase.

    With this AI shit it works a bit differently. If you want to train and run the latest models in order to remain competitive in the market, you need to continually buy the latest hardware from Nvidia. As soon as you start running on older hardware, your product gets left behind, and with all the competition out there users would be lost very quickly. It’s very hard to see how the trillions of dollars invested now are ever going to be recovered within the span of five years. Especially in a time where so many companies are dumping their products for very low prices and sometimes even for free.
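    The argument above is really a payback-window calculation. Here's a rough back-of-envelope sketch of it; all numbers (the ~35%/year depreciation rate, the revenue figures) are made-up illustrations, not real industry data:

```python
# Back-of-envelope: how fast must revenue arrive to recoup GPU spend
# before the hardware loses most of its competitive value?
# All numbers are illustrative assumptions, not real figures.

def residual_value(cost: float, annual_decay: float, years: int) -> float:
    """Value left after `years` of geometric depreciation."""
    return cost * (1 - annual_decay) ** years

def payback_year(cost: float, annual_revenue: float, horizon: int = 10):
    """First year cumulative revenue covers the upfront cost, or None."""
    total = 0.0
    for year in range(1, horizon + 1):
        total += annual_revenue
        if total >= cost:
            return year
    return None

cost = 100.0   # upfront hardware spend, arbitrary units
decay = 0.35   # assumed ~35%/year loss of competitive value

print(round(residual_value(cost, decay, 5), 1))  # ~11.6 left after 5 years
print(payback_year(cost, annual_revenue=15.0))   # breaks even in year 7
print(payback_year(cost, annual_revenue=9.0))    # never, within 10 years
```

    The point of the toy model: at a steep decay rate the hardware is nearly worthless by year 5, so revenue that only breaks even in year 7 is arriving after the asset that generated it has already had to be replaced.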

    This bubble has to burst and it is going to be bad. For the people who were around when the dotcom bubble burst, this is going to be much worse than that ever was.

    • mojofrododojo@lemmy.world · 17 hours ago

      yeah datacenters never really aged well, and making them gpu dependent is going to mean they age like hot piss. and since they’re ai-dedicated gpus, they can’t even resell them lol.

      all this investment, for what? so some chud can have a picture of taylor swift with 4 tits?

      fucking idiots

      • howrar@lemmy.ca · 16 hours ago

        I don’t see why they can’t be resold. As long as there’s a market for new AI hardware, there will continue to be a market for the older stuff. You don’t need the latest and greatest for development purposes, or for workloads that scale horizontally.

        • mojofrododojo@lemmy.world · 16 hours ago

          I didn’t say they couldn’t be resold, they simply won’t have as wide a potential user market as a generic GPU would. But think about it for a sec: you’ve got thousands of AI-dedicated GPUs going stale whenever a datacenter gets overhauled or goes bust.

          that’s gonna put a lot more product on the market that other datacenters aren’t going to touch - no one puts used hardware in their racks - so who’s gonna gobble up all this stuff?

          not the gamers. who else needs this kind of stuff?

          • Passerby6497@lemmy.world · 3 hours ago

            no one important puts used hardware in their racks

            FTFY. Just about every MSP I’ve worked for has cut corners and gone with 2nd-hand (or possibly grey market) hardware to save a buck, including the ones who colo in “real” data centers. I would not be surprised to find that we’re onboarding these kinds of cards to make a bespoke AI platform for our software customers here in a few years.

          • addie@feddit.uk · 11 hours ago

            I’m not sure that they’re even going to be useful for gamers. Datacenter GPUs require a substantial external cooling solution to stop them from just melting. I believe NVidia’s new stuff is liquid-cooled only, so even if you’ve got an HVAC next to your l33t gaming PC, that won’t be sufficient.

            • mojofrododojo@lemmy.world · 9 hours ago

              not just those constraints, good luck getting a fucking video signal out of 'em when they literally don’t have hdmi/dp or any other connectors.

          • jkercher@programming.dev · 13 hours ago

            Also depends how hard the AI runs them. A good chunk of the graphics cards that were used for mining came out on life support, if not completely toasted. Games generally don’t run the piss out of them like that 24/7, and many games are still CPU bound.

    • panda_abyss@lemmy.ca · 16 hours ago

      They’ll write this off as a loss and offset their corporate taxes

      Also, China is a great example that you do not need all the latest hardware, but it does help