• 1984@lemmy.today · 6 points · 1 hour ago

    I'm on Linux and it requires just as much memory as it did in 2018. No problem here.

    • pHr34kY@lemmy.world · 1 point · 46 minutes ago

      I upgraded mine from 16GB to 32GB two years ago because RAM was cheap. I didn’t really need it, and have probably never hit 16GB usage anyway.

      Meanwhile my work Windows laptop uses 16GB at idle after first login.

      Windows has always been wasteful with computing resources, and everyone just pays for it.
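
      For what it’s worth, a quick way to sanity-check the Linux side of that comparison (a sketch that assumes a Linux box, since it reads `/proc/meminfo` directly):

      ```shell
      # Rough sketch (Linux only): RAM actually in use, ignoring
      # reclaimable page cache, straight from /proc/meminfo.
      awk '/^MemTotal/ {t=$2} /^MemAvailable/ {a=$2}
           END {printf "%.1f GB in use of %.1f GB total\n", (t-a)/1048576, t/1048576}' /proc/meminfo
      ```

      `MemAvailable` already accounts for cache the kernel can reclaim, so this is closer to “real” usage than the raw used column in `free`.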

      • NotMyOldRedditName@lemmy.world · 2 points · 38 minutes ago

        I wish I had a 32GB RAM laptop.

        I can have 3 development IDEs open at once, and with all the browser tabs open and a few other programs here and there, it’s stretching the limits on my Mac.

        • pHr34kY@lemmy.world · 2 points · 27 minutes ago (edited)

          I have 32GB on my Windows laptop, and it can’t do three at once.

          Running the backend (Java) and the frontend (React Native) in IntelliJ uses 29GB of RAM, so I have to run Android on real hardware over ADB+USB. Running an Android emulator pushes it over the edge.

          Also: laptops are shit. On my Windows machine, the turbo time limit (Tau) is set so low that the cores are throttled straight after boot, because the cooling is rubbish. It almost never hits full speed, and it can’t survive more than 40 minutes on a full battery. It might as well be a NUC.

          • NotMyOldRedditName@lemmy.world · 1 point · 19 minutes ago (edited)

            Yeah, Macs are definitely more efficient with their RAM.

            I’ll have Android Studio open for my main work, IntelliJ IDEA for all the backend work, and Xcode when I need to tweak some iPhone things. (edit: usually it’s just 2 of the 3, but sometimes it’s all 3)

            I also mainly use real devices for testing, and opening emulators when all 3 are open can be a problem; it’s so annoying opening and closing things.
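
            One knob that can help when several JetBrains IDEs are open at once is capping each IDE’s own JVM heap via Help → Edit Custom VM Options (this edits e.g. idea.vmoptions or studio.vmoptions; the values below are illustrative, not recommendations):

            ```
            # idea.vmoptions / studio.vmoptions (illustrative values)
            -Xms512m
            -Xmx2048m
            ```

            This only bounds the IDE process itself, not Gradle daemons or emulators, but it keeps one runaway IDE from eating the whole machine.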

  • Jhex@lemmy.world · +58 / −2 · 5 hours ago

    This article sucks… I think they felt the need to excuse AI lest they upset their corporate masters.

    While it’s easy to point the finger at AI’s unquenchable memory thirst for the current crisis, it’s not the only reason.

    Followed by:

    DRAM production hasn’t kept up with demand. Older memory types are being phased out, newer ones are steered toward higher margin customers, and consumer RAM is left exposed whenever supply tightens.

    Production has not kept up with demand… demand being supercharged by AI purchases

    …newer ones are steered towards higher margin customers… again AI

    consumer RAM is left exposed whenever supply tightens… because of AI

    • samus12345@sh.itjust.works · 24 points · 3 hours ago

      Just about all electronics older than a year or so have gone up. Even a Switch, which came out 9 years ago, costs more to buy now than it did then!

    • Asmodeus_Krang@infosec.pub · 31 points · 5 hours ago

      It’s truly mental. I don’t think I could afford to build my PC at the same spec today with RAM and SSD prices being what they are.

      • tempest@lemmy.ca · 13 points · 3 hours ago

        I have 128 GB of DDR5 memory in my machine. I paid 1400 for my 7900 XTX, which I thought was crazy, and now half my RAM is worth that.

        Never thought I would see the day where the graphics card was not the most expensive component.

  • blitzen@lemmy.ca · 68 points · 6 hours ago

    Apple over here not raising their RAM prices because they’ve always been massively and unjustifiably inflated. Now, they’re no longer unjustifiably inflated.

    • totesmygoat@piefed.ca · 6 points · 3 hours ago

      They also buy allotments months in advance. Just waiting to see how much Apple will charge soon.

    • mushroommunk@lemmy.today · +35 / −1 · 5 hours ago

      I dunno. “AI companies bought literally everything” seems like an unjustifiable reason still.

      • blitzen@lemmy.ca · 28 points · 5 hours ago

        Perhaps. I guess my point is they’re no longer as out of line with the rest of the market. Comment meant as a backhanded “compliment” toward Apple.

    • eli@lemmy.world · 1 point · 3 minutes ago

      Same here, running my 3700X with a 3080.

      I should’ve pulled the trigger on the 9800X3D last year like I wanted, but thought it was just too expensive.

      Welp.

      • Wildmimic@anarchist.nexus · 2 points · 25 minutes ago

        I agree. I recently swapped out my aging 6-core 2600X for a 16-core 5950X and upgraded from a 3070 Ti to a 5070 (well, actually more of a sidegrade: my VRAM was simply too small). Before this, my system already had a few years where there wasn’t much difference regarding gaming. In the current configuration, and at the glacial speed gaming has been developing, I’d say I have a decade before upgrades are really needed.

      • garretble@lemmy.world · 4 points · 3 hours ago

        Honestly, my PC at this point plays FFXIV and that’s basically it. And I’m OK with that.

    • potoooooooo ✅️@lemmy.world · +9 / −1 · 5 hours ago (edited)

      Don’t worry, I suspect Cloud Terminals will be super cheap. You won’t even need that ol’ thing anymore!

  • palordrolap@fedia.io · 13 points · 5 hours ago

    The DDR4 sticks I got 18 months ago now cost 300-400% of the price they were, so it’s not just DDR5.

    … and I just realised the title doesn’t actually mean “DDR5 prices”, but that was an easy misinterpretation on my part, so I guess I’ll post this anyway.

      • Zorsith@lemmy.blahaj.zone · 2 points · 4 hours ago

        My standard cycle for building is 5 years, and I just built (probably overspecced) with a 9800X3D and a 9070 XT. I’m sitting pretty for a while, IMO.

        • Railcar8095@lemmy.world · 2 points · 4 hours ago

          Same CPU, but a 3070 RTX, 64 gigs of RAM, and a 4TB 990 Pro NVMe. Looking back, I bought at the very beginning of the hike, and all because of a very good RAM discount.

          Might change the GPU for something more Linux-friendly if prices go down, or go second hand, but my plan is for this build to last 5-10 years, honestly.

      • 9point6@lemmy.world · 10 points · 6 hours ago

        I more meant now they’re not being made because Micron recently killed the Crucial brand to focus supply towards data center customers

        • Prove_your_argument@piefed.social · 2 points · 4 hours ago

          I’m well aware, but everybody knows the HBM demand will dry up eventually, and the consumer market will then be worth profiting from again.

          They just want to manipulate the consumer market to maximize margins. If they can get memory prices to stay at 200-300% for a while, they can up the prices they charge and raise margins to stratospheric heights not seen before in the consumer market. All manufacturers jump on stuff like this when they can get away with it.

          Memory module makers still order from Micron directly for their own branded chips. Those margins will increase for all parties. AI data center demand is like Christmas for the entire industry. No pricing is transparent and every vendor is profiteering.