“I’ve been saving for months to get the Corsair Dominator 64GB CL30 kit,” one beleaguered PC builder wrote on Reddit. “It was about $280 when I looked,” said u/RaidriarT. “Fast forward today on PCPartPicker, they want $547 for the same kit? A nearly 100% increase in a couple months?”

  • humanspiral@lemmy.ca · 2 hours ago

    Indeed awful. AFAIU, it’s stockpiling of HBM memory ahead of next-generation “GPUs”, rather than existing product volume. I’m especially disgusted that prices in non-tariff countries are higher than in the US. If this were to last, GDDR5/6 computer RAM would start to make sense. NVIDIA has been behind starving competing platforms (usually AMD) of memory supply in the past.

    This pricing means both huge inflation and a huge drop in sales, because we have to wait for it to get sensible again.

  • Aceticon@lemmy.dbzer0.com · 6 hours ago

    Doesn’t Windows 11 in practice require even more memory than Windows 10 to operate with decent performance?

    Meanwhile my Linux gaming PC seems to actually use less memory than back when it was a Windows machine.

    • chiliedogg@lemmy.world · 4 hours ago

      My work laptop was upgraded to Windows 11 and performance has severely suffered.

      As someone who usually uses 3 monitors (sometimes 4) and does GIS, it’s an issue.

      • LiveLM@lemmy.zip · 3 hours ago

        Are you using display scaling?
        On my work laptop, dragging a window from one monitor to the next would make it “snag” at the border as it struggled and stuttered trying to change the scale.
        It looked soooo fucking stupid I couldn’t believe my eyes

        • Brkdncr@lemmy.world · 2 hours ago

          You can tell just by the fan noise or by monitoring CPU usage. Windows 11 regularly uses more CPU.

          I’m not sure how much of that is the problem, because they changed other things too, like the Start menu, and it’s less responsive.

          Single-app performance is about the same as on 10.
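
          If anyone wants numbers instead of fan noise, here’s a minimal sketch using Python’s psutil (an assumption on my part - it needs `pip install psutil`, and the sample count and interval are arbitrary) that logs average CPU utilisation, so a Windows 10 and a Windows 11 machine can be compared on the same hardware while idle:

          ```python
          # Rough CPU-load sampler: run it while the machine is otherwise idle,
          # on both Windows versions, then compare the printed averages.
          import psutil

          SAMPLES = 60        # roughly one minute of data
          INTERVAL_S = 1.0    # seconds per sample

          readings = [psutil.cpu_percent(interval=INTERVAL_S) for _ in range(SAMPLES)]
          print(f"average CPU: {sum(readings) / len(readings):.1f}%  "
                f"peak: {max(readings):.1f}%")
          ```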

        • HugeNerd@lemmy.ca · 3 hours ago

          Many such moments with the PCs at work. I can’t wait to retire and never have to deal with anything modern again. What absolute dogshit and slop computers have become.

    • Buddahriffic@lemmy.world · 16 minutes ago

      I don’t think the bubble bursting will slow AI that much; it’ll just be a round of hot potato ending. The losers will lose their money, and others will come in hoping to be profitable, since they can skip a bunch of R&D costs.

      AI is overhyped, but just like the internet after the dotcom bubble burst, it’s not going anywhere.

      Plus I suspect that this time it will be a dollar collapse rather than a stock market collapse, which would mean prices would go up even more.

    • Dremor@lemmy.world · 5 hours ago

      It always does. I remember buying my DDR4 RAM for 200€, and two months later the same kit was 550€. Some months later it was back at 220€.

  • utopiah@lemmy.world · 5 hours ago (edited)

    Genuine question here, for a “normal” computer user, say somebody who:

    • browses the Web
    • listens to music, plays videos, etc.
    • sometimes plays video games, even 2025 AAAs, and already has a relatively recent midrange GPU, say something from e.g. 2020
    • even codes something of a normal size, let’s say up to Firefox size (which is huge)

    … which of those tasks requires more than, say, 32 GB?

    • Devjavu@lemmy.dbzer0.com · 3 hours ago

      If by normal you mean average, they don’t even really need 16 GB.
      Creative work can gobble up RAM, and heavy multitasking does as well.
      So it’s more in the digitally productive professional or hobbyist cases that a person needs such amounts.

      For development, high amounts of RAM can be useful for all sorts of stuff - it’s not just compiling, but also testing - though 32 GB is often enough.
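
      For anyone unsure whether their own workload ever approaches 32 GB, a rough sketch using Python’s psutil (an assumption here - it needs a `pip install psutil`) prints current usage and the biggest consumers, which is usually enough to answer the question:

      ```python
      # Snapshot of how much RAM is in use right now, plus the five biggest consumers.
      # Assumes `pip install psutil`; the "top 5" cutoff is arbitrary.
      import psutil

      vm = psutil.virtual_memory()
      print(f"total {vm.total / 2**30:.1f} GiB | used {vm.used / 2**30:.1f} GiB "
            f"({vm.percent}%) | available {vm.available / 2**30:.1f} GiB")

      procs = []
      for p in psutil.process_iter(["name", "memory_info"]):
          mem = p.info["memory_info"]
          if mem is not None:  # skip processes we aren't allowed to inspect
              procs.append((mem.rss, p.info["name"] or "?"))

      for rss, name in sorted(procs, reverse=True)[:5]:
          print(f"{rss / 2**30:6.2f} GiB  {name}")
      ```

      If “available” never gets anywhere near zero during a heavy session, extra RAM is unlikely to change much.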

    • HubertManne@piefed.social · 2 hours ago

      The worst part is I’ve always been stingy about upgrading my RAM, figuring I could always do it later, so I run real lean. I only recently went to 16 GB after running 8 GB for a long while, but I’m pretty sure 32 GB would help a lot.

  • SabinStargem@lemmy.today · 13 hours ago

    As far as RAM goes, it will become a good thing: it gives companies an incentive to invest in developing higher-capacity, faster RAM and motherboards with enough bandwidth to handle it.

    The next big generation of hardware will be much better, IMO, simply because the companies will have to compete on their merits. The downside is not having enough supply right now, but once the logistics and the tech are in place, even non-AI people will benefit.

    • Frezik@lemmy.blahaj.zone · 5 hours ago

      No, it won’t. The DRAM market is dominated by three companies, and they’ve colluded before. They get their wrist slapped by some government body, they promise not to do it again, and then they wait a few years and do it again.

      • boonhet@sopuli.xyz · 5 hours ago (edited)

        They’ll also have an electrical outage the millisecond demand starts to go down, so they’ll have to sell the old stock at inflated prices first before restarting production, oopsie-woopsie

    • plyth@feddit.org · 6 hours ago

      The downside is not having enough supply right now, but once the logistics and tech is in place, even non-AI people will benefit.

      Have you forgotten that they agreed to reduce production to stabilize prices? Capacity is not the real bottleneck.

  • lightnsfw@reddthat.com · 16 hours ago (edited)

    Who’s bewildered? Of course this was going to happen. Everything enjoyable about life is being ruined. It’s not surprising at all.

  • Bongles@lemmy.zip · 20 hours ago

    This seems like an appropriate place for me to bitch:

    Two months ago I bought a new pre-built PC. It should’ve had 64 GB of RAM but came with 32 GB. They said the sticks they used were out of stock, so they gave me a credit for $100 USD. I spent the $100 on another 32 GB of what I thought was the exact same RAM. I fucked up and bought a slightly higher speed, so they wouldn’t work together after I tried for an afternoon. I also checked the correct listing I should’ve bought, but it was more expensive, at about $125.

    I gave up and decided I’d just buy the faster RAM again when it came back in stock, rather than return it and get the correct one. It went out of stock in the time it took for my order to arrive, so I figured I’d just wait.

    TWO MONTHS later, it never came back in stock, but an almost identical pair, with slightly different timings, is in stock right now at $216. If I’d had any idea this was coming in just two months, I could’ve bought 64 GB at once and started fresh, or corrected my mistake by returning what I bought.

    So I guess I’ll continue waiting, but hey, at least Notepad has Copilot in it.

      • Bongles@lemmy.zip · 2 hours ago

        I’ll bet there is, but I gave it a solid few hours without really knowing what I was doing

    • SailorFuzz@lemmy.world · 5 hours ago

      I just don’t see the value of having 64 GB of RAM. Not for the conventional user, not for gamers, not for the average power user either. Maybe there’s a need if you’re doing a lot of video editing and large file manipulation… but like… I would argue that MOST people, unless they’re trying to play AAA games while streaming and gooning, don’t need more than 16 GB.

      I have 32 GB and I’ve never topped it out. And yeah, Windows eats a lot (I really need to give up the ghost and migrate to Linux), but even still, at 32 GB I don’t even get close. 64 GB is just going to be a lot of unused space. A bigger number doesn’t mean better. I doubt you’d even notice unless you fall into the previously mentioned category of users.

      • Bongles@lemmy.zip · 2 hours ago

        At this point, I originally paid for 64 GB, so I’m trying to get there out of principle, and I’d be there if I’d paid more attention.

        I like to dick around with my nerd stuff, and I did have the PC lock up because I used all of the 32 GB of RAM, but I suspect 64 GB would’ve had the same issue in that particular instance.

        In either case, my last PC buy was a decade ago, when 16 GB was more than you’d need. If prices are reasonable again, getting myself to 64 GB would just make sure I’m set for another decade, probably.

    • timhayes1991@lemmy.zip · 13 hours ago

      I always thought RAM of different speeds worked together, just running at the speed of the slowest stick.
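
      That is broadly how it works at JEDEC speeds, but mismatched kits will often refuse to train at their XMP/EXPO profile, which sounds like what happened above. On Linux, a rough sketch like this (assuming Python 3 and root access, since dmidecode needs it; field names differ slightly between dmidecode versions, hence the fallback) shows each stick’s rated speed versus the speed it actually runs at:

      ```python
      # Print each installed DIMM's rated speed vs. the speed it trained at.
      # Requires root because it shells out to `dmidecode --type memory`.
      import re
      import subprocess

      out = subprocess.run(
          ["dmidecode", "--type", "memory"],
          capture_output=True, text=True, check=True,
      ).stdout

      def field(block, name):
          m = re.search(rf"^\s*{name}:\s*(.+)$", block, re.MULTILINE)
          return m.group(1).strip() if m else None

      for block in out.split("\n\n"):
          if "Memory Device" not in block or "No Module Installed" in block:
              continue
          rated = field(block, "Speed")
          configured = (field(block, "Configured Memory Speed")
                        or field(block, "Configured Clock Speed"))
          print(f"{field(block, 'Locator')}: rated {rated}, running at {configured}")
      ```

      If the “running at” value is lower than both sticks’ rated speed, the modules have fallen back to a slower common profile.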

  • Tim_Bisley@piefed.social · 1 day ago

    AI increases my power utility bill
    AI takes my water
    AI increases the price of GPUs
    AI increases the price of RAM
    AI makes my search results worse and slower
    AI is inserted into every website, app, program, and service making them all worse

    All so businesses and companies can increase productivity, reduce staff, and then turn around and increase prices to customers.

    • Capricorn_Geriatric@lemmy.world · 5 hours ago

      All so businesses and companies can increase productivity, reduce staff, and then turn around and increase prices to customers.

      As if. The only thing AI is to businesses is a lost bet. And they don’t like losing, so they’re betting even more, hoping some shiny “AGI” starts existing if they throw enough money and other resources at the AI bandwagon.

    • Credibly_Human@lemmy.world · 5 hours ago

      Let’s also not forget that a bunch of them want to completely replace lightning-fast, simple UIs with AI so they don’t need their own programmers, and your experience getting things done becomes outright painful.

      • XLE@piefed.social · 18 hours ago

        It’s a win-win with staff layoffs. Businesses that want to lay people off have a convenient scapegoat and AI companies receive undeserved praise.

        A win-win for everyone but the employees, of course.

      • Passerby6497@lemmy.world · 24 hours ago

        Seriously. The amount of time spent handholding the AIs no one asked to use more than offsets any supposed productivity gains.

        • HubertManne@piefed.social · 2 hours ago

          What cracks me up are the things they’re doing that cost more (both money and resources) but don’t help at all. Like, instead of “press this for this or that for that”, you get an LLM which can only send you to the exact same options. Or adding it to IDEs, where by and large it does stuff you could already get with a variety of add-ons.

        • Aceticon@lemmy.dbzer0.com · 6 hours ago

          IMHO, most people’s perception of their time usage tends to be heavily skewed toward the time taken in the main task - say, creating the code of a program - rather than the secondary (but nonetheless required before completion) tasks like fixing the code.

          Notice how so many coders won’t do the proper level of analysis and preparation before actually starting to code - they want to feel like they’re “doing the work”, which for them is the coding part, whilst the analysis doesn’t feel like “doing the work” for a dev. So they prematurely dive into coding and end up screwed by things like going down a non-viable implementation route, or missing some important requirement detail with huge implications for the rest, which would have been caught during analysis.

          (I also think that’s the reason why, even without AI, people will use stupid “time savers” in the main task, like short variable names, that then screw them in secondary tasks like bug-fixing or adding new requirements later, because it makes it far harder to figure out what the code is doing.)

          AI speeds up what people feel is the main task - creating the code - but that’s de facto more than offset by the time lost on the supposedly secondary work, which doesn’t feel as much like “doing the work” and so doesn’t get counted the same.

          This is why, when it actually gets measured independently and properly - by people who aren’t just trusting their own feeling of “how long it took”, or who, thanks to experience, actually measure the total time taken including support activities rather than trusting their subjective perception - it turns out that, at least in software development, AI actually slightly reduces productivity.

          • HubertManne@piefed.social · 2 hours ago

            I disagree here. You can try to build everything out theoretically and then only realize the issues once you have a massive thing that doesn’t work. This is literally waterfall vs. agile; iteration is a proper way to do things. As far as AI goes, I’m not sure it speeds up coding compared with companies just purchasing the add-ons and programs that can already speed it up. I have seen so many companies push back on purchasing JetBrains, but it seems like AI spend is unlimited.

            • Aceticon@lemmy.dbzer0.com · 1 hour ago (edited)

              I think you’re confusing doing analysis before coding with doing all analysis before coding.

              If you do Agile properly (so including Use Cases with user prioritization and user feedback - the whole system, not just the fashionable bits like stand-up meetings followed by claiming “we do Agile development”), you do analysis before development as part of evaluating how long it will take to implement the requirements contained in each Use Case. In fact, this part of Agile actually pushes people to properly think through the problem - i.e. do the fucking analysis - before they start coding, just in bite-sized, easy-to-endure blocks.

              Further, in determining which Use Cases depend on which Use Cases, you’re doing a form of overall, system-level analysis.

              Also, you definitely need some level of overall upfront analysis no matter what: have a go at developing a mission-critical, high-performance system on top of a gigantic dataset while taking a purist “we only look at Use Cases individually and ignore the system-level overview” approach (thus ignoring the general technical needs that derive from the size of the data, data integrity, and performance requirements), and let me know how well it goes when, halfway through the project, you figure out that your system architecture of a single application instance with a database that can’t handle distributed transactions can’t actually deliver on those requirements.

              You can refactor code and low-level design in a reasonable amount of time, but refactoring system-level design is a whole different story.

              Of course, in my experience only a handful of shops out there do proper Agile for large projects: most just do the kiddie version - follow the herd by doing famous “agile practices” without actually understanding the process, how it all fits together, and which business and development environments it’s appropriate for and which it is not.

              I could write a fucking treatise about people thinking they’re “doing Agile” when in fact they’re just doing a Theatre of Agile where all they do is act out the most famous bits.

  • carrylex@lemmy.world · 23 hours ago

    Nice article, but the numbers are a lot lower here in the EU.

    While there is some price increase, it’s currently more around 50%, not 100%.

    The selected kit is also extremely expensive (350€, up from ~300€) - similar kits are available for a lot less (270€, up from ~180€) - so I doubt many people were buying it in the first place.

    I also think it’s not entirely AI-related; more likely there’s another RAM price-fixing scandal happening right now. Pretty much the same thing we’re seeing today happened in 2017-2018.

      • 46_and_2@lemmy.world · 5 hours ago

        Bruh, my whole mid-to-high range gaming PC costs 850 to 2K euro. What is the intended use of such an expensive RAM kit? Is it LLMs again?

        • SaveTheTuaHawk@lemmy.ca · 5 hours ago

          Scientific applications. My lab has a PC with 196 GB of RAM for processing 3D and 4D microscopy voxel datasets.
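
          For a sense of scale, the raw footprint of a voxel stack adds up fast; here’s a tiny sketch with made-up, purely illustrative dimensions (not the lab’s actual dataset):

          ```python
          # Back-of-the-envelope RAM footprint for a 4D (x, y, z, time) voxel stack.
          # The dimensions and 16-bit voxel size below are hypothetical examples.
          def footprint_gib(x, y, z, timepoints, bytes_per_voxel):
              return x * y * z * timepoints * bytes_per_voxel / 2**30

          # e.g. a 1024 x 1024 x 512 volume over 100 timepoints at 2 bytes per voxel:
          print(f"{footprint_gib(1024, 1024, 512, 100, 2):.0f} GiB")  # -> 100 GiB
          ```

          And that’s before any intermediate copies made during processing.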

          • boonhet@sopuli.xyz · 5 hours ago (edited)

            Assuming you don’t need the absolute tightest timings and highest speed, you can get 192 GB from Corsair for “just” 660 euros where I live, still pretty far from 2,000 euros. The speed and timings are the same as the 1,300-euro kit, also from Corsair; it’s just that the cheaper kit has no RGB.

            So at 2K EUR I’m assuming it’s going to be either more than 192 GB (in which case, is that even a desktop motherboard or are we talking about servers?) or some super-high-speed RAM.
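
            For comparison, a quick euro-per-GB calculation on those figures (a sketch using only the prices mentioned above; the 2,000 EUR kit is left out because its capacity isn’t stated):

            ```python
            # Quick euro-per-GB comparison using the figures quoted above.
            kits = {
                "192 GB Corsair, no RGB": (660, 192),
                "192 GB Corsair, RGB": (1300, 192),
            }
            for name, (price_eur, capacity_gb) in kits.items():
                print(f"{name}: {price_eur / capacity_gb:.2f} EUR/GB")
            ```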