“I’ve been saving for months to get the Corsair Dominator 64GB CL30 kit,” one beleaguered PC builder wrote on Reddit. “It was about $280 when I looked,” said u/RaidriarT. “Fast forward to today on PCPartPicker, they want $547 for the same kit? A nearly 100% increase in a couple months?”
Just in time for all those Windows PCs that MS is trying to force people to upgrade.
Indeed awful. AFAIU, it’s the stockpiling of HBM memory ahead of next-generation “GPUs”, rather than existing product volume, that’s driving this. I’m especially disgusted that prices in non-tariff countries are higher than in the US. If this were to last, GDDR5/6 computer RAM would start to make sense. NVIDIA has been behind starving memory supplies for competing platforms (usually AMD) in the past.
This pricing means both huge inflation and a huge drop in sales, because we have to wait for it to get back to sensible.
Doesn’t Windows 11 in practice require even more memory than Windows 10 to operate with decent performance?
Meanwhile my Linux gaming PC seems to actually use less memory than back when it was a Windows machine.
My work laptop was upgraded to Windows 11 and performance has severely suffered.
As someone who usually uses 3 monitors (sometimes 4) and does GIS, it’s an issue.
Oh yeah. If you’ve got 4 GB ram, win 11 is going to absolutely skullfuck that machine.
Are you using scaling?
On my work laptop dragging a window from one monitor to the next would make them “snag” in between borders as they struggled and stuttered trying to change the scale.
It looked soooo fucking stupid I couldn’t believe my eyes.
You can tell just by the fan noise or by monitoring CPU usage. Win 11 regularly uses more CPU.
I’m not sure exactly what’s worse, because they changed other things too, like the Start menu, and it’s less responsive.
Single app performance is about the same as 10.
Many such moments with the PCs at work. I can’t wait to retire and never have to deal with anything modern again. What absolute dogshit and slop computers have become.
soon the bubble will burst and RAM will be so cheap! (I hope)
I don’t think the bubble bursting will slow AI that much; it’ll just be the round of hot potato ending. The losers will lose their money and others will come in hoping to be profitable, since they can skip a bunch of R&D costs.
AI is overhyped, but just like the internet after the dotcom bubble burst, it’s not going anywhere.
Plus I suspect that this time will be a dollar collapse rather than stock market collapse, which would mean prices would go up even more.
Buy one RAM and you get two GPUs thrown in for free!
It always does. I remember buying my DDR4 RAM for 200€, and two months later the same kit was 550€. Some months later it was back at 220€.
RAM used to cost 8x that. It’s the reason why HP cases have locks.
RAM prices fluctuate but always return to normal pretty fast, at least compared to gpus and cpus
Genuine question here, for a “normal” computer user, say somebody who :
- browses the Web
- listens to music, play videos, etc
- sometimes plays video games, even 2025 AAAs, and already has a relatively recent midrange GPU, say something from around 2020
- even codes something of a normal size, let’s say up to Firefox size (which is huge)
… which task actually requires more than, say, 32Go?
If by normal you mean average, they don’t even really need 16GB.
Creative work can gobble up RAM; heavy-ass multitasking does as well.
So it’s more in the digitally productive professional or hobbyist cases where you need such amounts as a person.
For development, high amounts of RAM can be useful for all sorts of stuff, not just compiling but also testing, though 32 is often enough.
You probably don’t, but 32GB of DDR5 is minimum $200 right now for the slowest.
God fucking damn it… Now they want our RAM?!
I need that for all my Chrome tabs and pet protogen, you fucking clankers!
The worst part is I have always been stingy about upgrading my RAM, figuring I could always do it later, so I run real lean. I only recently went to 16 and ran at 8 for a long while, but I’m pretty sure 32 would help a lot.
Far as RAM goes, it will become a good thing: it gives companies incentive to invest in developing bigger RAM, more speed, and motherboard bandwidth big enough to handle it.
The next big generation of hardware will be much better IMO, simply because the companies will have to compete by their merits. The downside is not having enough supply right now, but once the logistics and tech is in place, even non-AI people will benefit.
No, it won’t. The DRAM market is dominated by three companies, and they’ve colluded before. They get their wrist slapped by some government body, they promise not to do it again, and then they wait a few years and do it again.
They’ll also have an electrical outage the millisecond demand starts to go down, so they’ll have to sell the old stock at inflated prices first before restarting production, oopsie-woopsie
The downside is not having enough supply right now, but once the logistics and tech is in place, even non-AI people will benefit.
Have you forgotten that they agreed to reduce production to stabilize prices? Capacity is not the real bottleneck.
Who’s bewildered? Of course this was going to happen. Everything enjoyable about life is being ruined. It’s not surprising at all.
Wow, I just checked: the laptop kit I bought a month ago is 50% more.
This seems like an appropriate place for me to bitch:
2 months ago I bought a new pre-built PC. It should’ve had 64GB of RAM but had 32GB. They said the sticks they used were out of stock, so they gave me a credit for $100 USD. I spent the 100 on 32GB more of what I thought was the exact same RAM. I fucked up and bought a slightly higher speed, so they wouldn’t work together, even after I tried for an afternoon. I also checked the correct listing I should’ve bought, but it was more expensive, at about $125.
I gave up and decided I’d just buy the faster ram again when it came back, rather than return it and get the correct one. It went out of stock in the time it took me to get my order so I figured I’d just wait.
2 MONTHS later, it never came back in stock, but an almost identical pair with slightly different timings is in stock right now at $216. If I had any idea this was coming in just 2 months, I could’ve bought 64GB at once and started fresh, or corrected my mistake by returning what I bought.
So I guess I’ll continue waiting, but hey, at least Notepad has Copilot in it.
Could be a BIOS setting that allows your RAM to work at a lower speed.
I’ll bet there is, but I gave it a solid few hours without really knowing what I was doing
I just don’t see the value of having 64gb of RAM. Not for the conventional user, not for gamers, not for the average power user either. Maybe there’s a need if you’re doing a lot of video editing and large file manipulation… but like… I would argue that MOST people, unless they’re trying to play AAA games while streaming and gooning don’t need more than 16gb
I have 32gb and I’ve never topped it out. And yea, Windows eats a lot (I really need to give up the ghost and migrate to Linux) but even still, 32gb, and I don’t even get close. 64gb is just going to be a lot of unused space. Bigger number doesn’t mean better. I doubt you’d even notice unless you fall into the previously mentioned category of users.
At this point, I originally paid for 64 so I’m trying to get there out of principle, and I’d be there if I paid more attention.
I like to dick around with my nerd stuff, and I did have the PC lock up because I used all of the 32GB of RAM, but I suspect 64 would’ve had the same issue in that particular instance.
In either case, my last pc buy was a decade ago, where 16GB was more than you’d need. If prices are reasonable again, getting myself to 64GB would just make sure I’m set for another decade, probably.
Just curious, what do you actually use which requires more than 32Go?
At the moment, not much. I’m trying to learn more and more and do more and more, really it’s just that I paid for 64, but then the seller and I both fucked up and I’m still at 32 now, which bugs me. Then at that same exact time frame, the prices skyrocketed.
Sorry to hear, hopefully you’ll get a solution soon. Thanks for the clarification
Hey what’s the Go unit? Is that short for GiB?
Gigaoctet; an octet is the same thing as a byte: https://en.wikipedia.org/wiki/Octet_(computing)
Thank you!
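For anyone else wondering about the units: an octet is just a byte, so Go = GB. The only real gotcha is decimal (GB) vs binary (GiB) prefixes, which is why OS tools often report a smaller number than the sticker. A quick sketch in Python (the 64 here is just an illustrative size, not any specific kit):

```python
# SI (decimal) prefixes: 1 GB = 10**9 bytes; "Go" (gigaoctet) is the same size.
# IEC (binary) prefixes: 1 GiB = 2**30 bytes, which is what most OS tools report.
GB = 10**9
GIB = 2**30

advertised = 64 * GB            # capacity sold as "64 GB" / "64 Go"
shown_by_os = advertised / GIB  # the same capacity expressed in GiB

print(f"64 GB = {shown_by_os:.2f} GiB")  # ~59.60 GiB
```

(RAM kits are actually the one place where the sticker is conventionally binary already, so a “64GB” DDR5 kit really holds 64 GiB; drives are the ones sold in decimal GB.)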
I always thought ram of different speeds worked together, they just were run at the speed of the slowest stick.
I thought so too, but after a few hours I gave up.
In theory yes. In practice it depends on how finicky your motherboard is.
Dang, now I know why Micron stock has gone wild.
AI increases my power utility bill
AI takes my water
AI increases the price of GPUs
AI increases the price of RAM
AI makes my search results worse and slower
AI is inserted into every website, app, program, and service, making them all worse.
All so businesses and companies can increase productivity, reduce staff, and then turn around and increase prices to customers.
So companies can “appear” to increase productivity
Increase productivity?
All so businesses and companies can increase productivity, reduce staff, and then turn around and increase prices to customers.
As if. The only thing AI is to businesses is a lost bet. And they don’t like losing, so they’re betting even more, hoping some shiny “AGI” starts existing if they throw enough money and resources at the AI bandwagon.
Let’s not also forget that a bunch of them want to completely replace lightning-fast, simple UIs with AI so they don’t need their own programmers, and your experience doing things is outright painful.
Isn’t capitalism a blast?
Right, what you said, but I’d remove the part where it increases productivity.
It’s a win-win with staff layoffs. Businesses that want to lay people off have a convenient scapegoat and AI companies receive undeserved praise.
A win-win for everyone but the employees, of course.
Seriously. The amount of time spent handholding the AIs no one asked to use more than offsets any supposed productivity gains.
What cracks me up is the things they’re doing that cost more (both money and resources) but don’t help at all. Like, instead of “press this for this or that,” you get an LLM which can only send you to the exact same options. Or adding it to IDEs, where by and large it does stuff you could already get with a variety of add-ons.
IMHO, most people’s perception of time usage tends to be heavily skewed toward the main task, say, writing the code of a program, rather than the secondary (but nonetheless required-before-completion) tasks like fixing that code.
Notice how so many coders won’t do the proper level of analysis and preparation before actually starting to code: they want to feel like they’re “doing the work,” which for them is the coding part, while analysis doesn’t feel like “doing the work” for a dev. So they prematurely dive into coding and end up screwed by things like going down a non-viable implementation route, or missing some important requirement detail with huge implications for everything else, which analysis would have caught.
(I also think that’s the reason why, even without AI, people do stupid “time savers” in the main task, like using short variable names, that then screw them in secondary tasks like bug-fixing or adding new requirements later, because they make it far harder to figure out what the code is doing.)
AI speeds up what people feel is the main task, creating the code, but that’s de facto more than offset by time lost on supposedly secondary work that doesn’t feel as much like “doing the work,” so it doesn’t get counted the same.
This is why, when it actually gets measured independently and properly, by people who aren’t just trusting their own feeling of “how long it took” (or who, thanks to experience, measure the total time taken including support activities, rather than trusting their subjective perception), it turns out that, at least in software development, AI actually slightly reduces productivity.
I disagree here. You can try to build everything out theoretically and then only realize the issues once you have a massive thing that does not work. This is literally waterfall vs. agile; iteration is a proper way to do things. As far as AI goes, I’m not sure it speeds up coding versus companies simply purchasing the add-ons and programs that can already speed it up. I’ve seen so many companies push back on purchasing JetBrains, yet AI spend seems unlimited.
I think you’re confusing doing analysis before coding with doing all analysis before coding.
If you do Agile properly (so including Use Cases with user prioritization and user feedback, the whole system, not just the fashionable bits like stand-up meetings followed by claiming “we do Agile development”), you do analysis before development as part of evaluating how long it will take to implement the requirements contained in each Use Case. In fact, this part of Agile actually pushes people to properly think through the problem, i.e. do the fucking analysis, before they start coding, just in bite-sized, easy-to-endure blocks.
Further, in determining which Use Cases depend on which, you’re doing a form of overall, system-level analysis.
Also, you definitely need some level of overall upfront analysis no matter what: have a go at developing a mission-critical, high-performance system on top of a gigantic dataset with a purist “we only look at Use Cases individually and ignore the system-level overview” approach (thus ignoring the general technical needs derived from data size, data integrity, and performance requirements), and let me know how well it goes when, halfway through the project, you figure out that your system architecture of a single application instance, with a database that can’t handle distributed transactions, can’t actually deliver on those requirements.
You can refactor code and low level design in a reasonable amount of time, but refactoring system level design is a whole different story.
Of course, in my experience only a handful of shops out there do proper Agile for large projects: most just do the kiddie version and follow the herd with the famous “agile practices,” without actually understanding the process, how it all fits together, and which business and development environments it is appropriate for and which it is not.
I could write a fucking treatise about people thinking they’re “doing Agile” whilst in fact they’re just doing a Theatre Of Agile, where all they do is play at it by acting out the most famous bits.
Yeah I am beyond ready for the next new fad.
Cannibalism.
The next fad will be Thunderdome. Are you ready to go beyond that?
Nice article but the numbers are a lot lower here in the EU.
While there is some price increase, it’s currently more like 50%, not 100%.
The selected kit is also extremely expensive (350€, was ~300€); similar kits are available for a lot less (270€, was ~180€), so I doubt anyone was buying it in the first place.
I also think it’s not completely AI-related; more likely another RAM price-fixing scandal is happening right now. Pretty much the same thing we see today happened in 2017–2018.
Europe is also hit hard. The kit I was about to buy increased from 850 to 2K euro…
Bruh, my whole mid-to-high range gaming PC costs 850 to 2K euro. What is the intended use of such an expensive RAM kit? Is it LLMs again?
Not LLM in this case actually. Virtualization mostly with qubes OS
Scientific applications. My lab has a PC running 196GB RAM for processing 3D and 4D microscopy voxel datasets.
Sounds more like the applications require some serious optimizations to me…
Assuming that you don’t need the absolute tightest timings and highest speed, you can get 192 GB from Corsair for “just” 660 euros where I live, pretty far still from 2000 euros. The speed and timings are the same as the 1300 euro kit, also from Corsair, it’s just that the cheaper kit has no RGB.
So at 2k EUR I’m assuming it’s going to be either more than 192 GB (in which case, is that even a desktop motherboard or are we talking about servers?) or some super high speed RAM.
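Just to put those quoted prices on a per-gigabyte basis (numbers taken straight from the comments above, not verified anywhere):

```python
# Rough EUR/GB for the two 192 GB Corsair kits mentioned above
# (price in EUR, capacity in GB, as quoted in this thread).
kits = {
    "192 GB, no RGB": (660, 192),
    "192 GB, RGB":    (1300, 192),
}
for name, (price, gigs) in kits.items():
    print(f"{name}: {price / gigs:.2f} EUR/GB")
# no-RGB kit: ~3.44 EUR/GB; RGB kit: ~6.77 EUR/GB
```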
First crypto miners came for my video cards, then AI came for my DIMMs…
Remember that shitcoin that made HDD prices shoot up like crazy just because?
Now we just need a new nonsensical fad to drive cases, motherboards, CPUs, and PSUs up and we’ll have come full circle.
The next tech-sector grift will probably go for our network adapters or some shit…
I’ve got a stack of old 14.4 modems; I’ll sell them if they’ll grift on that for a bit.
Remember when “Chia coin” came for our HDDs too?
https://www.tomshardware.com/news/hard-drive-prices-skyrocket-asia-scalpers-making-bank
Next the tech-bros will come for you, and there will be nobody left to speak up.
Laughs in ddr4…
I’m still gaming on a 15-year-old CPU and DDR3, lol. Playing a decent amount of new games. The only one that won’t run is Alan Wake 2, and that may be because of Linux.
AMD FX 8-core.

It also got affected.
Although it’s ambiguous how much of this is due to AI data centres and how much is the natural ramp-down of DDR4 production.
DDR3 also increased substantially in price a few years after DDR4 became available.
Is DDR3 affected too?
God fucking damnit, can’t I even be poor in peace, JFC!