Hi, I’m building a water-cooled Unix server for my homelab. I don’t want to buy overpriced pre-mixes from EKWB or Aquatuning. What cooling solution do datacenters use for water cooling?

What is the chemical solution? Does anyone know?

  • gray@pawb.social · 1 day ago

    There’s basically no reason ever to do water cooling on a home system unless you’re trying to do overclocking.

    Air is cheaper, more reliable, and typically quieter because you don’t need pumps.

  • the_q@lemmy.zip · 1 day ago

    I hate that tech YouTubers have made people think they need liquid cooling and RGB to do anything PC related. You don’t need to liquid cool your homelab. Period.

    • tal@lemmy.today · 1 day ago (edited)

      You don’t, but it’s considerably quieter to use a liquid cooler on current high-end CPUs because of the amount of heat they dissipate. My current CPU has a considerably higher TDP than my last desktop’s. I finally broke down and put an AIO cooler on the new one, and the fans on the radiator can run at a much lower speed than the ones on my old CPU’s heatsink, because the radiator is a lot larger than anything hanging directly off the CPU and can dump heat to the air far more readily.

      The GPU on that system, which doesn’t use liquid cooling, spans multiple slots and needs a supporting rail to hold its weight, because its huge heatsink hangs off a PCIe slot that was never intended to carry that kind of load, and its fans spin up far harder when it heats up.

      The amount of power involved these days is getting pretty high. My early PCs could manage with entirely passive cooling, just a heatsink. Today, the above CPU dumps 250W and the above GPU 400W. I have a small space heater in the same room that, on low, runs at 400W.
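
      For a rough sense of scale, here’s a back-of-the-envelope sketch of how much water or air it takes to move that 650 W; the delta-T values are assumed, not measured:

      ```python
      # Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)
      q_watts = 250 + 400                       # CPU + GPU from above
      dt = 10                                   # assumed delta-T in kelvin

      # Water: c_p ~ 4186 J/(kg*K), density ~ 1 kg/L
      m_dot_water = q_watts / (4186 * dt)       # kg/s
      print(f"water: {m_dot_water * 60:.2f} L/min")   # ~0.93 L/min

      # Air: c_p ~ 1005 J/(kg*K), density ~ 1.2 kg/m^3
      m_dot_air = q_watts / (1005 * dt)         # kg/s
      cfm = (m_dot_air / 1.2) * 2118.88         # m^3/s -> CFM
      print(f"air:   {cfm:.0f} CFM")            # ~114 CFM
      ```

      Roughly a litre per minute of water carries the same heat as a hundred-odd CFM of air, which is why a big radiator can get away with such slow fans.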

      Frankly, if I had a convenient mounting point in the case for the radiator, with the benefit of hindsight, I’d seriously have considered sticking an AIO liquid-cooled GPU in there — there are a few manufacturers that do those. The GPU is a lot louder than the CPU when both are spun up.

      I will kind of agree on the RGB LEDs, though. It’s getting obnoxiously difficult to find desktop hardware that doesn’t have those. My last build, I was having difficulty finding DIMMs that didn’t have RGB LEDs; not normally a component that I think of anyone wanting to make visible.

      I’m kind of wondering whether we’ll get to the point where one just has a standard attachment point for liquid and just hooks the hardware’s attachment into a larger system that circulates fluid. Datacenters would become quiet places.

  • jubilationtcornpone@sh.itjust.works · 1 day ago (edited)

    What cooling solution do datacenters use for water cooling?

    They typically don’t. The servers are air cooled and the room is conditioned with a Liebert or similar HVAC system. Liquid cooling servers is not practical or warranted for most situations.

      • Glitchvid@lemmy.world · 1 day ago

        Expanding on that, direct water cooling becomes more common as rack power density rises.

        Once you get into 35 kW+ racks it becomes the only way to get that much heat out. Lots of GPU compute racks are water cooled by default now; the El Capitan supercomputer, for example, is cooled entirely through direct liquid interfaces.

  • Meron35@lemmy.world · 17 hours ago

    Water cooling is typically much more complex and expensive than air cooling, and is mainly attractive because of space limitations. The same applies to data centers. IBM’s mainframes have a liquid-cooled version aimed mainly at users who want to get the most out of their data center space before upgrading sites. These ship without coolant and simply ask the user to “just add water,” i.e. demineralised/distilled water.

    Sure Mainframe ain’t dead, but what about that toilet water? | Aussie Storage Blog - https://aussiestorageblog.wordpress.com/2021/04/07/sure-mainframe-aint-dead-but-what-about-that-toilet-water/

  • qupada@fedia.io · 1 day ago

    In ours, the coolant is referred to as “PG25” (distilled water with 25% propylene glycol, plus corrosion inhibitors and other additives). It’s widely available, and pre-mixed so it just gets poured straight in.
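
    If you’d rather mix something PG25-like yourself than buy a drum, the split is simple arithmetic. A minimal sketch, which deliberately ignores the inhibitor package; you’d still dose that per whatever additive you actually buy:

    ```python
    # Approximate a PG25-style mix: 25% propylene glycol by volume in
    # distilled water. Corrosion inhibitors/biocide are NOT modelled here.
    loop_volume_l = 1.5                      # example loop volume in litres

    glycol_l = 0.25 * loop_volume_l          # propylene glycol share
    water_l  = loop_volume_l - glycol_l      # distilled/demineralised water

    print(f"{glycol_l:.2f} L glycol + {water_l:.2f} L distilled water")
    ```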

    Your problem is going to be quantity. It might be cheaper per unit, but buying less than a 200 litre drum (if not a 1000 litre IBC) will prove to be a challenge.

    I’d suggest a rethink, honestly.

  • Romkslrqusz@lemmy.zip · 1 day ago

    I don’t have a direct answer to your question, but expensive premixes aren’t really necessary outside of achieving a certain aesthetic.

    Distilled water, biocide and corrosion inhibitors. Make sure your metals are all compatible.
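
    If you’re unsure about a pairing, a crude check against an anodic-index table works as a first pass. The values below are illustrative only; verify against a real galvanic series before trusting any combination:

    ```python
    # Approximate anodic-index values in volts (illustrative figures).
    ANODIC_INDEX = {
        "nickel": 0.30,
        "copper": 0.35,
        "brass": 0.40,
        "steel": 0.85,
        "aluminium": 0.90,
    }

    def compatible(metal_a: str, metal_b: str, limit: float = 0.25) -> bool:
        """Rule of thumb: keep the anodic-index gap under ~0.25 V."""
        return abs(ANODIC_INDEX[metal_a] - ANODIC_INDEX[metal_b]) <= limit

    print(compatible("copper", "nickel"))     # True  - typical blocks/fittings
    print(compatible("copper", "aluminium"))  # False - the classic loop mistake
    ```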

  • 51dusty@lemmy.world · 21 hours ago (edited)

    is liquid cooking cooling really necessary? critical data centers I have worked in use swamp coolers. cheaper, more efficient, more reliable, uses same water as your house…

    edit: d’oh! :)

    • nocturne@sopuli.xyz · 1 day ago (edited)

      Are you sure they use swamp coolers? Also known as evaporative coolers, they add moisture into dry air, making the air they are cooling very humid and only slightly cooler.

      • ramielrowe@lemmy.world · 1 day ago

        Practically all even semi-modern DCs are built for the servers themselves to be air cooled. That air is cooled via a heat exchanger with a separate, isolated chiller and cooling tower. The chiller is essentially the swamp cooler, but it’s isolated from the servers.

        There are cases where servers are directly liquid cooled, but it’s mostly just the recent Nvidia GPUs and niche things like high-frequency-trading and crypto ASICs.

        All this said… For the longest time I water cooled my home lab’s compute server because I thought it was necessary to reduce noise. But, with proper airflow and a good tower cooler, you can get basically just as quiet. All without the maintenance and risk of water, pumps, tubing, etc.

        • 51dusty@lemmy.world · 23 hours ago

          a chiller is not a swamp cooler.

          picture a fan with a wet sponge in front of it… that is a swamp cooler.

          • ramielrowe@lemmy.world · 18 hours ago (edited)

            Yea, it’s the combo of the chiller and cooling tower that’s analogous to a swamp cooler. The cooling tower provides the evaporative cooling. The difference is that rather than directly cooling the environment around the cooling tower, the chiller allows indirect cooling of the DC via heat exchange. An isolated chiller providing that heat exchange is why the humidity inside the DC isn’t affected by the evaporative cooling. And sure, humidity differs between the hot and cold aisles, but that’s just a function of temperature and relative humidity. No moisture is exchanged into the DC to cool it.

            Edit: Turns out I’m a bit misinformed. Apparently in dry environments that can deal with the added moisture, DCs are built that indeed use simple direct evaporative cooling.
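
            The hot/cold aisle difference falls straight out of the psychrometrics: same moisture content, higher saturation pressure. A minimal sketch using the Magnus approximation, with example temperatures assumed:

            ```python
            import math

            # Magnus approximation for saturation vapour pressure (hPa)
            def e_sat(t_c: float) -> float:
                return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

            t_cold, rh_cold = 20.0, 60.0   # cold-aisle air (example numbers)
            t_hot = 35.0                   # same air after the servers heat it

            rh_hot = rh_cold * e_sat(t_cold) / e_sat(t_hot)
            print(f"hot-aisle RH: {rh_hot:.0f}%")   # ~25%, no moisture added
            ```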

      • 51dusty@lemmy.world · 23 hours ago

        yes. I programmed and integrated swamp coolers at Amazon data centers. when the cool air hits the hot aisles the humidity goes down.

        • nocturne@sopuli.xyz · 23 hours ago

          That is awesome! I had no idea swamp coolers could cool that well. The one in my shop can barely drop the temp 10-15 degrees below outside (on a good day). Sorry for doubting you; I’m so used to people outside of arid climates not knowing what a swamp cooler is.

          • 51dusty@lemmy.world · 22 hours ago

            10-15 degrees is all you need to keep a “cold aisle” at 85 °F, most places, on the worst day.
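
            The standard direct-evaporative model makes that easy to sanity-check; the pad effectiveness and design-day temperatures below are assumed, not from any particular site:

            ```python
            # Supply air approaches the wet-bulb temperature.
            # Typical media pads run ~0.7-0.9 effectiveness.
            def supply_temp_f(dry_bulb: float, wet_bulb: float, eff: float = 0.85) -> float:
                return dry_bulb - eff * (dry_bulb - wet_bulb)

            # Assumed hot, dry design day: 105F dry bulb, 70F wet bulb.
            print(supply_temp_f(105, 70))   # 75.25F - well under an 85F cold aisle
            ```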

            IIRC Amazon figured out that individual components could actually run hotter within an acceptable replacement window.

            higher equipment replacement is more than offset by the fact that they don’t have to do refrigerant-based cooling, which makes daily operation ridiculously cheap… no pumps or complicated mechanical devices to produce cooling, no people with special skills to maintain them, etc.

            • nocturne@sopuli.xyz · 21 hours ago (edited)

              That 10-15 (in my case) relies on no extra heat. When I have game nights it gets pretty toasty inside with 5 or so extra bodies in the shop.

              My last experience with a server room was in 2002 or 2003, and the rooms were kept in the mid to low 60s.

              • 51dusty@lemmy.world · 22 hours ago

                no doubt.

                I remember when you’d put a jacket on before you went in the halls… but now everyone wears shorts.

      • morbidcactus@lemmy.ca · 1 day ago (edited)

        Industrial cooling towers are usually evaporative in my experience. Smaller ones are large fans moving air over a stack of slats that the return water is sprayed or piped over; the water then collects in a basin for recirculation. Larger ones, AFAIK (like what you’d see at power plants), operate on the same idea. Top-ups and water chemistry are all automated.

        Those systems have operation-wide cooling loops that individual pieces of equipment tap into. Some equipment uses the water directly (you see that with things like industrial furnaces), but for smaller or sensitive equipment you’ll see heat exchangers, and even then the server & PLC rooms were all air cooled; the air-con units for them were tied into the cooling water loops, though.

        From a maintenance POV, though, it’s way easier to air cool. I’ve totally seen motor drive racks with failed cooling fans that had really powerful external blowers rigged up to keep them going until the next maintenance window. Yeah, industrial POV, but it’s a similar idea.

        • 51dusty@lemmy.world · 23 hours ago

          critical data centers use swamp coolers because they don’t have to treat the water or expose it to contamination from outside. they use straight domestic water… super cheap.

          if the conductivity gets too high, they dump the basin and fill with fresh… rinse and repeat.
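
          the control logic really is about that simple; here’s a toy sketch with a made-up setpoint (the real thing lives in a PLC/BMS, not Python):

          ```python
          CONDUCTIVITY_LIMIT_US = 1500   # uS/cm, illustrative setpoint only

          def basin_control(conductivity_us: float) -> str:
              if conductivity_us > CONDUCTIVITY_LIMIT_US:
                  return "dump basin, refill with fresh domestic water"
              return "keep recirculating"

          print(basin_control(900))    # keep recirculating
          print(basin_control(1800))   # dump basin, refill with fresh domestic water
          ```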

    • over_clox@lemmy.world · 1 day ago

      liquid cooking?

      Haha, stupid sexy autocorrect, or honest typo, either way I got a good chuckle!

  • lakeeffect@lemmy.world · 1 day ago

    I have no idea what a data center would use, and I haven’t overclocked since the 90s, but Water Wetter is what I used then.

  • IsoKiero@sopuli.xyz · 1 day ago

    If you are willing to deal with potential galvanic corrosion, condensation, leaks, replacing the fluid every now and then and so on, I suppose you could use red or yellow radiator coolant from your local car service shop. It has all the properties you’ll want from a coolant, but as others have already mentioned, it’s not really worth the hassle. At least if you’re not running something really power hungry, like a GPU farm. And even with liquid you have the very same problem as air: you need the heat to go somewhere. So either have very long pipes (and pumps to match them) so you can put the radiator in a different room or outside, or big fans to move the hot air away from the radiator.