Liquid Cooling High-End Servers Direct to Chip, Rear Door, and Immersion Cooling

  • Added 15. 06. 2024
  • Main site article: www.servethehome.com/liquid-c...
    STH Merch on Spring: the-sth-merch-shop.myteesprin...
    STH Top 5 Weekly Newsletter: eepurl.com/dryM09
    STH Forums: forums.servethehome.com/
    We get into cooling next-gen servers. Specifically, with the help of Supermicro, we are able to show off direct to chip liquid cooling, rear door heat exchanger, and liquid immersion cooling solutions.
    ----------------------------------------------------------------------
    Timestamps
    ----------------------------------------------------------------------
    00:00 Introduction
    04:40 Rear Door Heat Exchanger
    07:18 Liquid Immersion Cooling
    13:15 Direct to Chip Liquid-Water Cooling
    15:47 Liquid Cooling Performance and Power Consumption
    21:45 Implications for 2022 Servers and Beyond
    22:53 Quick STH Update
    ----------------------------------------------------------------------
    Other STH Content Mentioned in this Video
    ----------------------------------------------------------------------
    - CXL Primer: • CXL in Next-gen Server...
    - GB Era Server CPUs: • Why Server CPUs Are Go...
    - EDSFF: • EDSFF E1 and E3 to Rep...
    - PhoenixNAP Data Center Tour: • A Fun Data Center Tour...
    - STH Blue Door Studio Tour: • Ultimate Zoom and YouT...
  • Science & Technology

Comments • 181

  • @tommihommi1
    @tommihommi1 Před 2 lety +117

    All that knowledge collected by people doing custom loop overclocking is finally paying off :D

    • @RickJohnson
      @RickJohnson Před 2 lety +10

      Funny how usually server tech trickles down into the home PC market, but this is a trend trickling the other way - just like how racing innovations trickle down into daily driven cars!

    • @suntzu1409
      @suntzu1409 Před 2 lety +2

      @@RickJohnson Basically it's evolving, but backwards

    • @Felix-ve9hs
      @Felix-ve9hs Před 2 lety +12

      @@suntzu1409 When you think about it, PC watercooling started with enthusiasts cooling their Athlons and Pentiums with car radiators, garden hoses, and aquarium pumps, so it's even funnier xD

    • @edwardallenthree
      @edwardallenthree Před 2 lety +7

      @@Felix-ve9hs exactly! My first water cooling rig was an Athlon with an industrial pump, a five gallon bucket, and a huge heater core. (Early 2000s). Ok, I didn't know what I was doing, but it worked. Ironically, with the valves in the back, it looked like these solutions too.

    • @suntzu1409
      @suntzu1409 Před 2 lety +5

      @@Felix-ve9hs I guess trains, industry, and factories using water cooling since like 1900 (or earlier) makes it even funnier XD

  • @lels3618
    @lels3618 Před 2 lety +43

    We NEED to start using all of this heat! The city of Lucerne in Switzerland is building a data center inside a retired bunker that will heat part of the city - I think that's pretty epic :>

    • @axiom1650
      @axiom1650 Před 2 lety +10

      Pretty common in the north to put the waste heat into the district heating network. The DC gets paid per unit of heat delivered, so it's a logical/economical thing to do.

    • @Primoz.r
      @Primoz.r Před 2 lety +2

      Not all waste heat can be used; the temperature of the heat is a big factor as well. Data center heat is not particularly high in temperature...

    • @night951521
      @night951521 Před 2 měsíci +1

      @@Primoz.r That very much depends on the cooling system used. An immersion cooling system can run at very high CHW temperatures, with the return reaching as high as 50degC, which is close to standard LTHW temperatures. In fact it would be too high if the end user were using underfloor heating for their heating.
      Even with traditional systems, exporting CHW at 28degC and then using heat pumps to lift the temperature provides a big benefit compared with heating mains water, which starts at approximately 10degC.
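
      A minimal sketch of why that head start matters, assuming an idealized (Carnot) heat pump and a 70°C district-heating supply; the 28°C and 10°C source temperatures come from the comment above, while the 70°C target and the "50% of Carnot" factor are illustrative assumptions:

          # Ideal (Carnot) heat-pump COP for lifting waste heat to a 70 C supply.
          T_supply = 70 + 273.15           # K, assumed district-heating supply temperature
          for t_source_c in (28, 10):      # C: data-center CHW return vs. mains water
              T_source = t_source_c + 273.15
              cop_carnot = T_supply / (T_supply - T_source)
              cop_real = 0.5 * cop_carnot  # assume a real machine reaches ~50% of Carnot
              print(f"{t_source_c} C source: Carnot COP {cop_carnot:.1f}, realistic ~{cop_real:.1f}")

      Under these assumptions the warmer data-center return cuts the electricity needed per unit of delivered heat by roughly 30% compared with starting from mains water.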

  • @gamershadow1
    @gamershadow1 Před 2 lety +19

    I’ve always been super interested in server watercooling. Thank you for making this!

  • @PalCan
    @PalCan Před 2 lety +4

    Your excitement is contagious, man. Thanks for bringing the bleeding edge in tech to our fingertips.

  • @TheLaurentDupuis
    @TheLaurentDupuis Před 2 lety +13

    Immersion cooling was used by the old Cray supercomputers of the '80s.

  • @joanf
    @joanf Před 2 lety +6

    It feels amazing to open the video and just enjoy some super cool water cooling, always been a custom loop guy and now this knowledge can be applied to servers. Just love it! Keep up with the amazing job you’re doing! 👍

  • @excitedbox5705
    @excitedbox5705 Před 2 lety +3

    Liquid cooling makes so much sense and I think in the future we will see much more direct to outside liquid cooling. It makes no sense putting several space heaters in a room and then using your AC to cool the room. It is much more efficient to have an outdoor radiator and dump the heat directly outside.

  • @CheapSushi
    @CheapSushi Před 2 lety +30

    So glad STH exists to show off this stuff to us regular enthusiasts. This is damn awesome. IBM's Z System liquid-cooled 5GHz CPUs were what got me interested a long time ago in maxing performance. I wanted to be as cool as those mainframes. Imagine if STH could do a tour of IBM's Z systems!
    BTW, were those Noctua fans on the rear? Haha, saw that they were brown(ish?)

  • @SIC66SIC66
    @SIC66SIC66 Před 2 lety +6

    I work with a start-up company, together with Dell and Shell, to deploy immersion cooling systems. It's really cool, but it definitely has its disadvantages. You can't use any thermal paste, for example; we use indium foil as the interface for the heatsinks.
    Letting a server drain for 30 min is enough to service it. The components do not need to be cleaned. You can just remove and install a CPU in a socket, both covered in oil, without any problem.

    • @fat_pigeon
      @fat_pigeon Před 2 lety +1

      That's because the thermal paste is soluble in the cooling oil, right? Out of curiosity, why indium rather than galinstan (sometimes used by enthusiasts)?

    • @mattyOvr
      @mattyOvr Před 2 lety +1

      Send Rolf my regards! :)

  • @thejo6331
    @thejo6331 Před 2 lety +8

    nice! I love seeing this type of next-gen stuff. Keeping CPU/GPU die liquid-cooled to a lower temp also helps to improve efficiency by reducing leakage current within the silicon.

  • @FakeName39
    @FakeName39 Před 2 lety +2

    This video is severely underrated. Well put together and informative without being boring. Would have loved to see the inside of the server because... well, it's cool. Now if only I could set up a central PC in a closet, run cables to my monitor/keyboard etc. at my desk, and then kick the heat outside, that would be great.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      We just did a teardown of an 8x A100 system from another vendor.

  • @xSeiryu
    @xSeiryu Před 2 lety +3

    Liquid cooling rack servers what a time to be alive I love it!

  • @scheimong
    @scheimong Před 2 lety +34

    1kW+ practically means having one of those commercial microwave ovens that you see in convenience stores in each U of your rack. The difference is that instead of food, what you're frying is thousands of dollars' worth of electronics... It honestly surprises me that air cooling was able to take us this far.

    • @fat_pigeon
      @fat_pigeon Před 2 lety +2

      You don't need special commercial ones. A regular household microwave is >1000 W.

    • @scheimong
      @scheimong Před 2 lety +1

      @@fat_pigeon I don't know about that. Where I live 750/800W is more common for household microwaves.

    • @RussSirois
      @RussSirois Před 2 lety +1

      @@scheimong Maybe in European (230/240v) countries. In North America where we're running 120v, 1000-1200w is very common (both standalone units and over-the-range)

    • @2xKTfc
      @2xKTfc Před 2 lety +2

      @@RussSirois What does the voltage have to do with that? You can easily pull 1kW from a circuit in Europe, too, if that's what you're getting at.

    • @fat_pigeon
      @fat_pigeon Před 2 lety +1

      @@scheimong I just checked mine (ordinary microwave, US model). Rated 1100W. Regardless, it's a good amount of heating, so your overall point stands.
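
      For scale on the point this thread is making, a quick back-of-the-envelope; the 1 kW per U figure comes from the comment above, while the 42U rack height and 1.1 kW household microwave rating are illustrative assumptions:

          # Heat output of a dense rack expressed in household microwaves.
          watts_per_u = 1000      # W per rack unit, from the comment above
          rack_units = 42         # assumed full-height rack
          microwave_w = 1100      # assumed typical US household microwave
          rack_heat_w = watts_per_u * rack_units
          print(f"Rack heat: {rack_heat_w / 1000:.0f} kW, "
                f"about {rack_heat_w / microwave_w:.0f} microwaves running continuously")

      About 42 kW of continuous heat per rack, which is the scale the comment above is pointing at.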

  • @DrivingWithJake
    @DrivingWithJake Před 2 lety +4

    This is where we see the dipping systems more often. 2U 4-node systems also make it easy to work on, as the nodes can be pulled out of the back without having to remove the whole system from the tank.

  • @dillonhicks3781
    @dillonhicks3781 Před 2 lety

    Good to see the subs going up and up, happy for you.

  • @hvac1904
    @hvac1904 Před 3 měsíci +1

    We work on a lot of liquid-cooled data centers. Super cool technology, and it's just growing faster and faster! Love the content.

  • @Christopher_S
    @Christopher_S Před 2 lety +1

    Supermicro are awesome to have helped you do this. Super interesting, great job.

  • @jmonsted
    @jmonsted Před 2 lety +8

    Our rear door install wasn't done quite right and one of them took a leak on me. Just one of the many mistakes our vendor made.

  • @P8qzxnxfP85xZ2H3wDRV
    @P8qzxnxfP85xZ2H3wDRV Před 2 lety +4

    Components that were immersed in 3M Novec come out completely clean. In fact the fluid is specifically used for cleaning also.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +2

      Many of the fluids require secondary cleaning stations. One of the fun parts is that there are many different fluids in use.

    • @Waitwhat469
      @Waitwhat469 Před 2 lety

      Now you just have to be able to afford not to recapture that novec fluid.

  • @jfbeam
    @jfbeam Před 2 lety +5

    The rack-door systems I've seen were basically moving the evap coils of the HVAC into the rack door. Removing the air middleman, running the liquid lines direct to the chips would be a great deal more efficient, if a lot more messy.
    The one you show is rather useless as all the heat still has to be dumped into air, pulled back out of the air into a refrigerant (which might be CW), to be pumped out of the room/building. If you've put the waste heat into a liquid (or gas), skip the extra steps and pump _that_ out of the space.

    • @arthurmoore9488
      @arthurmoore9488 Před 2 lety +1

      I think it's an airflow thing. Those doors aren't just liquid coolers; they're also supplementing the server fans. Plus, at higher heat loads, they might have had to completely re-design their cooling setup to keep hotspots from emerging. Put another way, it ensures an equal distribution of cooling. Direct refrigerant lines are more efficient, but are in many ways also more complex than water cooling.
      Many electric cars use this approach: the compressor feeds the cabin's evaporator and a liquid heat exchanger, and the chilled liquid is then used to cool the batteries.

    • @axiom1650
      @axiom1650 Před 2 lety +1

      Adoption is slow due to requiring a complete overhaul of the datacenter if it was not built for that purpose. I guess like this they try to offer a lower threshold, which makes little sense imo.

  • @post-leftluddite
    @post-leftluddite Před 2 lety +5

    I was kind of hoping the video would go more in depth on the actual water cooling hardware, specifically the direct to chip blocks, manifolds and heat exchangers

  • @SmokeytheBeer
    @SmokeytheBeer Před 2 lety +1

    Puts a whole new meaning on "hot hardware"

  • @johnmijo
    @johnmijo Před 2 lety +1

    Thanks Patrick, great stuff as always :)

  • @jeremybarber2837
    @jeremybarber2837 Před 2 lety +1

    Ah liquid cooling… Thanks for the video, very fun to see where the high end is headed in the server space.

  • @mdippa
    @mdippa Před 3 měsíci +1

    Great video. Really well explained.

  • @loldude0
    @loldude0 Před 2 lety +4

    Oo yes, a new video! Gonna watch this over snacks now

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +1

      Yes. It has been a while with the whole move happening. A makeshift studio for this one.

    • @loldude0
      @loldude0 Před 2 lety

      @@ServeTheHomeVideo haha, as long as you keep posting I'll always enjoy your videos : D

  • @dankodnevic3222
    @dankodnevic3222 Před 2 lety +2

    Thanks! Great video! It would be great if you could investigate 230 and 326 VDC enabled gear. With many solar- and battery-powered data centers, and homes too, DC supply is becoming more common (I am not talking about traditional telco -48V DC). Today we have to use inverters to get AC, and current SMPSs usually convert that AC back to high-voltage DC, then to low-voltage DC. Skipping the DC-to-AC-to-DC round trip improves efficiency. You could start by observing the input AC voltage ranges of current SMPSs and then checking whether DC is allowed as mains.

  • @Nobe_Oddy
    @Nobe_Oddy Před 2 lety +4

    this 'studio' setting is just fine, I think it's pleasant :)
    anyway, YES liquid cooling IS THE FUTURE! But I think there will be GREAT OPPORTUNITIES in developing liquids that can soak up the heat (and also spit it out) MUCH more efficiently than water.... Yea they prolly already exist in a laboratory somewhere, but who's testing them all? :)

  • @nekomakhea9440
    @nekomakhea9440 Před 2 lety +10

    I wonder if there's a way to do heat recovery cogeneration to generate some power or industrial heat from all that hot fluid being pumped around.
    The water temperature probably is well below 60C, but 40kW+ of hot water per rack seems like a lot of energy to just dump outside.

    • @axiom1650
      @axiom1650 Před 2 lety +10

      In the North the waste heat gets 'pumped' up with heat pumps to heat water to ~80C to put on the district heating systems. Generating power from such low temperature delta has an efficiency in the range of 5-10% so not really worth it.

    • @TheStefanskoglund1
      @TheStefanskoglund1 Před 2 lety

      @@axiom1650 The usage of heat pumps in district heating isn't that large.

    • @System0Error0Message
      @System0Error0Message Před 2 lety +2

      The issue with this is the temperature gap: the bigger the gap, the more power you use. Higher-end DC cooling is compressor-based, basically like your AC. If the ambient temperature in the datacenter is 15C and you are trying to reach a boiling point of 105C, the energy an AC-style system needs to create that difference grows rapidly with the gap. The energy required to move a given amount of heat across a large temperature difference, versus a small one, is what makes power generation infeasible, and a multi-loop compressor system that tries to keep the same efficiency tends to fail often (it is not mechanically reliable). This is why this isn't used.
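
      A minimal sketch of why the thread lands on single-digit efficiencies for power recovery, assuming an ideal Carnot heat engine; the 60°C hot-water and 20°C heat-rejection temperatures are illustrative assumptions, and real organic Rankine cycles recover well under half of the Carnot figure:

          # Carnot limit for turning low-grade data-center heat back into electricity.
          T_hot = 60 + 273.15     # K, assumed hot-water temperature leaving the racks
          T_cold = 20 + 273.15    # K, assumed ambient / heat-rejection temperature
          eta_carnot = 1 - T_cold / T_hot
          eta_real = 0.4 * eta_carnot    # assume a real cycle reaches ~40% of Carnot
          print(f"Carnot limit: {eta_carnot:.1%}, realistic: ~{eta_real:.1%}")

      That works out to roughly a 12% theoretical ceiling and around 5% in practice, which is why exporting the heat (district heating, pools) generally beats trying to generate power from it.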

  • @patrickprafke4894
    @patrickprafke4894 Před 2 lety +1

    For ages I have never understood why servers aren't liquid cooled. Much cooler and quieter.

  • @juri14111996
    @juri14111996 Před 2 lety +2

    Just pull a little bit of vacuum on the cooling loop. That way you only need to monitor the vacuum at one point:
    if there is a leak, air will get in, but no water gets out.

  • @TommyApel
    @TommyApel Před 2 lety +2

    While liquid cooling is nothing new in the server space, it's nice to see that some of the mainstream brands are starting to take it seriously. The next hurdle is getting these solutions into the datacenters. For the past 10 years or so that I've been working with liquid solutions for servers, the stumbling block has been the datacenters: it has always ended up with the customer having to build their own datacenter, because typical co-location centers just run away screaming as soon as you mention liquid in any form, even when we started using the 3M engineered liquids, which are totally dielectric. But it will be interesting to see if this finally starts to catch on.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      Slow process. That is what we are working on now.

    • @TommyApel
      @TommyApel Před 2 lety

      @@ServeTheHomeVideo I remember having this discussion about die cooling with SMC about 8 years ago, and back then they basically told me it was never going to be a thing, so it's really nice to see that they are finally starting to come around to it.

  • @epsilontic
    @epsilontic Před 2 lety +2

    The screenshot of the nvidia-smi tool at 16:49 is insightful. The A100 80 GB versions are shown with 500 W max power. However, the datasheet lists a TDP of 400 W. It would be interesting to see how much power these cards actually draw under full load. The idle value shown, at about 60 W, is reasonably low.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      They were running at 496-500W during the linpack runs. NVIDIA does not actually list the 500W 80GB A100, but they are available from almost every vendor. We recently discussed them in the last 8x A100 system we reviewed.
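
      For anyone wanting to check this on their own hardware, a minimal sketch of pulling live power readings; it assumes the nvidia-smi CLI is installed and on the PATH, and uses standard query fields:

          # Poll per-GPU power draw and the configured power limit via nvidia-smi.
          import subprocess

          out = subprocess.run(
              ["nvidia-smi",
               "--query-gpu=index,name,power.draw,power.limit",
               "--format=csv,noheader"],
              capture_output=True, text=True, check=True,
          ).stdout
          for line in out.strip().splitlines():
              print(line)

      Logging this during a Linpack run is how you would see the cards sitting at the 496-500 W draw described in the reply above.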

  • @ckuhtz
    @ckuhtz Před 2 lety +4

    Who makes the hoses/quick disconnect connectors for the direct to chip cooling? For the 500W A100 chassis you’re showing in the episode for example.

    • @-Primer-
      @-Primer- Před 2 lety

      I use these, very well made. koolance.com/help-quick-disconnect-shutoff-couplings

  • @denvera1g1
    @denvera1g1 Před 2 lety +3

    OK, hear me out.
    A phase-change cooling liquid that boils at 75c, with sealed tanks for the servers. -Vent- Pipe that gas to the roof, where it condenses and drains back down into the tank. The only pump/fan that would be needed, if anything, is a fan to bring the gas to the condenser and fans to cool the condenser. Everything else is driven by the fluid boiling away and drawing in fresh liquid, the hot gas rising, and the condensed liquid falling.
    No cleaning needed; just pull the server out and heat it to 76c

    • @Waitwhat469
      @Waitwhat469 Před 2 lety

      czcams.com/video/YyKIZPuepl8/video.html
      This seems to be the way to go to me too. You could keep the same rack design, but like you said: shut down, turn off the fluid intake, heat to evaporate the fluid, and pull it out for maintenance.
      The only thing I can't figure out is how to use it with something like OpenCompute drawers, but maybe those just need to be handled some other way (you could make cooling blocks part of the chassis that pipe coolant (even water here) to a heater block submerged in the same coolant as the rest of the system).
      They are also working on something too:
      www.opencompute.org/projects/acs-immersion
      Though as cool as OpenCompute is, I never can figure out how to see what progress they've made on some of these.

    • @denvera1g1
      @denvera1g1 Před 2 lety +1

      @@Waitwhat469 With phase change it *could* be very low pressure and just use regular plumbing and racks, using a metering valve that only lets in as much liquid as needed, with each server getting a high side for vapor and a low side where liquid gravity-feeds into the exchange blocks, evaporates, and pushes itself into the high side.
      The problem with using a standard rack instead of an Open Compute rack is that the server would probably need to be at least 2U, just to make sure that as the fluid boils it runs through the high (gaseous) side of the loop.
      There could *probably* be some sort of HEPA-like filter that allows the gaseous fluid through but not the liquid, but I don't know if it would be high-flow enough while being impeded by blocking the liquid.
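
      A rough sketch of the vapor flow such a passive loop would have to move, assuming a 1 kW server and a latent heat of vaporization around 100 kJ/kg, an assumed round number in the range of common engineered two-phase fluids (water, for comparison, is about 2257 kJ/kg):

          # Vapor mass flow a passive two-phase (thermosiphon) loop must carry.
          q_server_w = 1000                 # W, assumed heat load per server
          fluids = {
              "engineered fluid": 100e3,    # J/kg, assumed latent heat of vaporization
              "water": 2257e3,              # J/kg, latent heat at 100 C
          }
          for name, h_fg in fluids.items():
              print(f"{name}: {q_server_w / h_fg * 1000:.1f} g/s of vapor per kW")

      The piping, the metering valve, and that "vapor-only filter" idea all hinge on moving roughly 10 g/s of vapor per kilowatt with nothing but gravity and buoyancy driving it.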

  • @thetj8243
    @thetj8243 Před 2 lety +1

    I like the kitchen studio 😀
    Interesting topic overall. Central liquid cooling with a big radiator for more than one of these units sounds really heavy... But on the other hand, much of that power and heat exists already and is taken care of by air conditioning... Maybe the A/C heat exchanger could also cool some liquid for direct-to-chip cooling 😎

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +1

      Ha! This is actually a small office with two desks (hopefully for video editors)

  • @libertarian1637
    @libertarian1637 Před 2 lety

    Immersion cooling is also done everywhere in the US: electrical transformers are immersed in mineral oil as a method of sinking heat to the transformer case; it provides both heat removal and electrical insulation. Liquids tend to be better at thermal transfer than air.

  • @MisterRorschach90
    @MisterRorschach90 Před 2 lety +2

    If I had a rack in my home that looked like that I would be a happy boy. Though I would prefer metallic conduit tubing and metal fittings to make it look super industrial and sci-fi.

  • @kens97sto171
    @kens97sto171 Před 2 lety +1

    Interesting topic. Kind of funny how liquid cooling is going from your average gaming computer at home to servers. Immersion cooling is something that was done pretty long ago with gaming machines too: people would get mineral oil and submerge their PC in an aquarium filled with it.
    Something you didn't mention, but which I assume would be radically different, is the noise level in a data center that is all liquid cooled versus air cooled. Those servers are very, very noisy with those fans going, and when a whole bunch of them are running at the same time it's practically deafening. You could eliminate most of that noise by going to liquid cooling.

  • @phurious_george
    @phurious_george Před 2 lety +2

    Interesting video, would like to hear a take on liquid cooling servers at home and various setups, etc. I'm personally working through a complex one myself and have found limited information in the home space regarding rack liquid cooling.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +2

      May be something we tackle later this year/ into next year.

  • @d00dEEE
    @d00dEEE Před 2 lety +3

    As you talked about in the DC tour video, they raise the humidity of the cooled air to be ~50% RH to reduce ESD danger. To eliminate condensation on a direct-to-chip coolant line, you'd want to drop the RH to as low as possible. How do the data centers deal with these conflicting needs? Is the coolant at or above ambient temperature, like in a home watercooled setup? I was under the (mistaken?) impression that DCs ran sub-ambient...

    • @Angel2kinds
      @Angel2kinds Před 2 lety +2

      From my personal experience (technician for a mass server hosting company, with access to a few data centers) - data centers usually run at around 20C ambient temperature.
      I'd assume water cooling systems would be above ambient

    • @d00dEEE
      @d00dEEE Před 2 lety

      @@Angel2kinds Wow, that warm these days? Back in the early '80s I did datacomm at Burroughs, our office was next door to a little in-house DC with a couple big mainframes and they kept it 11-12C. It was FREEZING in there with the constant blast of air up through the floor.

    • @Angel2kinds
      @Angel2kinds Před 2 lety

      @@d00dEEE Oh, I suppose it's different for a massive DC vs a small in-house one.
      The one where we have most of our infra (including a few 22 kW racks) has 20° ambient.
      Even fairly powerful blade servers with dual 280 W CPUs only require intake air below 30°C.
      Got no experience running GPUs though.
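
      On the condensation question that started this thread, a minimal dew-point sketch using the Magnus approximation; the 20°C / 50% RH room condition comes from the comments above, and the coefficients are the usual Magnus constants:

          # Dew point via the Magnus approximation: coolant above this stays dry.
          import math

          t_air_c, rh = 20.0, 50.0      # room condition from the thread above
          a, b = 17.62, 243.12          # Magnus coefficients for water vapor
          gamma = a * t_air_c / (b + t_air_c) + math.log(rh / 100.0)
          t_dew = b * gamma / (a - gamma)
          print(f"Dew point at {t_air_c:.0f} C / {rh:.0f}% RH: {t_dew:.1f} C")

      That comes out around 9°C, so warm-water direct-to-chip loops running at or above room temperature stay well above the dew point; only sub-ambient coolant would force a humidity trade-off.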

  • @GvozdenNeskovic
    @GvozdenNeskovic Před 2 lety

    Love the video, but I wish the Linpack tests were carried out using the same problem size (N).

  • @obtFusi
    @obtFusi Před 2 lety

    @sth One solution would be so easy: backside cooling. There was recently a mainboard in the news (consumer, not enterprise, but anyway) where you create two zones. If something breaks on the water side, it can't flood your components.

  • @seitenryu6844
    @seitenryu6844 Před 2 lety

    Are these liquid cooled systems typically using DI water, or are synthetics more prevalent? Is it a high-pressure fluid loop? What's the cost differential for an air vs liquid rack, assuming similar performance?

  • @JmbFountain
    @JmbFountain Před 2 lety +1

    This is already a concern for us. We have racks that run ~4 24-core Xeons per U. Our cooling is similar to rear door cooling, but more integrated into the rack/datacenter: the hot air is sucked in, cooled down, and then forced into the front of the rack again.

  • @poldelepel
    @poldelepel Před 2 lety +6

    Water is just so much better at transporting heat than air!
    In data centers it is often: server heat > air > CRAH > (chilled) water.
    So why not go directly from server heat to water...!?
    Direct-to-chip water cooling would be so much more efficient, imho.

    • @iMadrid11
      @iMadrid11 Před 2 lety +1

      The simple answer is that a water leak can destroy electronics. In a data center, with servers stacked on top of and beside each other in a rack, a leak can actually destroy other servers too. Air cooling is less complex, and if a server fails it doesn't affect other servers. Water cooling data center servers does have its benefits, but it's more complex to deploy. We need more safety innovations, and operating costs need to come down to par with air cooling, before we see water-cooled servers become a standard.
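
      A minimal sketch of the "water transports heat better" point, comparing the volume flow needed to carry 1 kW at a 10 K temperature rise; the fluid properties are standard textbook values and the 10 K rise is an illustrative assumption:

          # Volume flow needed to move 1 kW of heat at a 10 K temperature rise.
          q_w, dt_k = 1000.0, 10.0
          fluids = {                     # (density kg/m^3, specific heat J/(kg*K))
              "air":   (1.2, 1005.0),
              "water": (998.0, 4186.0),
          }
          for name, (rho, cp) in fluids.items():
              vol_l_s = q_w / (rho * cp * dt_k) * 1000
              print(f"{name}: {vol_l_s:.2f} L/s per kW")

      Roughly 83 L/s of air versus about 0.02 L/s of water for the same kilowatt, a factor of a few thousand, which is the whole argument for skipping the air step.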

  • @crazy_human2653
    @crazy_human2653 Před 2 lety +2

    with bath cooling it was being used to cool cray computers back in the early days of computers

  • @rabidbigdog
    @rabidbigdog Před 2 lety

    In the early 2000s, I emailed our product manager at HP that they should encourage a liquid cooling standard into the rack mount equipment. Slide your generic server into the generic rack and the couplers automatically link up .... Damn - should have taken out a patent. :)

  • @andreavergani7414
    @andreavergani7414 Před 2 lety +1

    Fascinatig!

  • @bigappleplug6021
    @bigappleplug6021 Před 2 lety

    Lol sorry to comment twice
    But had to laugh
    I have a sysrack
    Three notable pieces of equipment within:
    - Dell R720xd with a Tesla V100 and an HBA array
    - HP DL385 with an EPYC 7552 and an HBA array
    - Tripp Lite rack AC
    I installed a variable-speed exhaust fan with louvers on top of the sysrack (exhaust) and on the back hot zone in the sysrack. Hot air is pulled towards the rear and up and out the top; cool air is pulled towards the front and up and out the top. Two zones separated by insulation and blank panels to facilitate airflow/temp zones.
    19-20 Celsius is maintained at the front of the rack.

  • @dashjinn
    @dashjinn Před měsícem

    Did you get a chance to compare the Power consumption with Immersion Cooling or do you have any metric from Supermicro about the power consumption comparing immersion cooling with the others?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před měsícem

      1-phase immersion is still fairly close to cold plates for hot components. 2-phase tends to be better but a lot of the research stopped with the whole PFAS thing. www.servethehome.com/2-phase-immersion-cooling-halted-over-multi-billion-dollar-health-hazard-lawsuits/

  • @ElijahPerrin80
    @ElijahPerrin80 Před 2 lety +3

    I can't help wondering whether a different working fluid with a lower boiling point would be better for heat transfer, exploiting phase change in a closed-loop immersion cooling system.

    • @jfbeam
      @jfbeam Před 2 lety +1

      "Water" cooling is rarely done purely with water. (it's not a boiler after all) Usually, it's some mix of glycol and water. (similar to automotive antifreeze) Of course, there are liquid cooling systems that use ammonia (not recommended around people), nitrogen, or CO2. (among other more exotic things... Cray used fluorinert)

    • @fat_pigeon
      @fat_pigeon Před 2 lety +1

      My guess is that it's just hard to beat water, which has many desirable properties as a coolant. Water is cheap, non-flammable, non-toxic, and only mildly corrosive, and has an unusually high heat capacity. So you *can* improve the effective heat capacity by using a fluid that boils between ambient temperature and Tdie, but I don't know of any that don't significantly compromise one of the other properties.
      Crazy idea: use a slurry/suspension of particles that *melt* slightly above room temperature in water. The particles would melt and re-freeze, absorbing/releasing additional energy, while remaining suspended in the water. This might run into problems with clogging, and increases the viscosity so it might make pumping less efficient.
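
      A rough sketch of how much that slurry idea could buy, assuming a 10% mass fraction of a phase-change material with a 200 kJ/kg heat of fusion (a paraffin-like ballpark) that melts within a 10 K loop temperature rise; all of these numbers are illustrative assumptions:

          # Effective heat absorbed per kg over a 10 K rise: PCM slurry vs. plain water.
          dt_k = 10.0
          cp_water = 4186.0       # J/(kg*K)
          pcm_fraction = 0.10     # assumed mass fraction of suspended PCM
          h_fusion = 200e3        # J/kg, assumed paraffin-like latent heat of fusion
          plain = cp_water * dt_k
          slurry = (1 - pcm_fraction) * cp_water * dt_k + pcm_fraction * h_fusion
          print(f"plain water: {plain / 1000:.1f} kJ/kg, "
                f"slurry: {slurry / 1000:.1f} kJ/kg ({slurry / plain:.2f}x)")

      A gain of roughly 1.4x is real but modest, which lines up with the caveat above that the added viscosity and clogging risk may eat much of the benefit.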

  • @nobiggeridiot
    @nobiggeridiot Před 2 lety +3

    So uh, what kinda hashrate could one get on that LC A100 system? Just for S&Gs. Immersion cooling, albeit not as optimal and a pain in the ass, is still the kind of badass that makes for cool sci-fi movie material. Digging the vids no matter the set.

    • @axiom1650
      @axiom1650 Před 2 lety +1

      8x215MH/s on the A100 80GB 400W, I don't think the memory is clocked higher on the 500W version so pretty much that.

  • @mikeb.2166
    @mikeb.2166 Před 2 lety +1

    This is pretty cool. I wonder, with all the push to cloud, whether massive cloud DCs are contributing to environmental issues in any way, or whether they are neutral, meaning the DC heat is transferred to someplace it's needed, like an office building or warehouse in a cool climate that would otherwise need fuel for heat. Will cities have hot water lines fed by DCs as a public utility, similar to New York City with the old steam lines for moving heat around? It seems all this DC heat could be a resource if planned correctly.

    • @Waitwhat469
      @Waitwhat469 Před 2 lety

      Well definitely not neutral by default just with the energy usage alone

    • @MrHics
      @MrHics Před 2 lety

      Yep. The larger DCs in freezing cities do this. Can help warm a medium size office building

  • @boxfabio
    @boxfabio Před 2 lety +1

    If I had the money I would have a full rack of those warming my pool. You're welcome for the idea.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      I had a contractor looking at exactly that last week. Looks like it would generate too much heat for the 35K gallon pool so the question is whether it is worth scaling the project.

  • @2xKTfc
    @2xKTfc Před 2 lety +1

    Aren't many of the immersion fluids oil-based and generally flammable? Gotta beef up the Halon purging system if the entire facility becomes hazmat storage, on top of the diesel tanks, and it can potentially leak from one floor to a lower one. I love the concept and am glad it's being explored, but as you say, operationally it seems slightly sketchy.

  • @ii_r_ftw
    @ii_r_ftw Před 2 lety +2

    Well, it's interesting that the rest of the industry is hopping onto the liquid cooling wagon that IBM has been riding off and on since the early '80s, and onto immersion cooling, which was shown off by a company claiming to be SGI in 2014. The immersion fluids have come down in cost substantially in the last few years, with toxicity being less of an issue and cleaning not really being a problem, depending on the fluid.

  • @BobHannent
    @BobHannent Před 2 lety +1

    I think the 100kW per tank sounds good, but when you consider the total footprint it probably struggles to compare.

  • @geokaisa
    @geokaisa Před 2 lety +1

    Great... in my future resume I'll be required to have a plumber's degree as well.

  • @dougm275
    @dougm275 Před 2 lety +3

    If data centers went all liquid cooling, I wonder how the heat could be recycled.
    I toyed with the idea of water cooling my Opteron box way back, but I couldn't get any info on running consumer WC pumps 24/7. I did see a few commercial solutions for workstations, but they are very expensive.

    • @jfbeam
      @jfbeam Před 2 lety +2

      Look at things marketed for aquarium use. (esp. salt water tanks.) I've had fish tank pumps run for a decade.

    • @calaphos
      @calaphos Před 2 lety +2

      I know of some university HPC clusters that recycle the heat from their liquid-cooled servers for heating. Liquid cooling is super common in HPC; use of the heat sadly is not. The problem is that it's fairly low temperature (around 35 degree water), so it's hard to transport over long distances, and most importantly it's often just way too much power to use effectively: the DC there is built for ~5MW, which is probably way more than the surrounding modern buildings need, even in the winter.

    • @dougm275
      @dougm275 Před 2 lety

      @@jfbeam True. I learned a bit more about the supply chain for this stuff and it seems like pumps that go into consumer water-cooling also go into a bunch of other things. So, no need to worry.

  • @JonnyDudemeister
    @JonnyDudemeister Před 2 lety +2

    I just hope that direct-to-chip watercooling doesn't become a mess of proprietary standards. Open19 could do really well if they integrate watercooling into their rack standard.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +1

      Rear door heat exchangers are used by a fairly large company that also utilizes Open19.

    • @JonnyDudemeister
      @JonnyDudemeister Před 2 lety

      @@ServeTheHomeVideo Interesting! I've always wondered how Open19 is doing in terms of industry adoption.

  • @rahulprasad2318
    @rahulprasad2318 Před 2 lety +1

    Nice.

  • @SakuraChan00
    @SakuraChan00 Před 2 lety +1

    OVH has been doing liquid cooling for years

  • @Veptis
    @Veptis Před rokem

    I want a high density workstation and I do see liquid cooling as the solution for me as well.

  • @user-th3jl8mz7y
    @user-th3jl8mz7y Před 2 lety +3

    OK take my money now, this is cool! lol

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      I see what you did there!

    • @user-th3jl8mz7y
      @user-th3jl8mz7y Před 2 lety

      But seriously, I had never heard of submerging in non-conductive liquids. Must be a nightmare to dry and service those pulled systems...
      Good luck with your studio move and stay safe.

  • @axiom1650
    @axiom1650 Před 2 lety

    The biggest difference between the GPUs is that the 80GB A100 does already perform 15-20% better than the 40GB A100 (both at 400W), due to the higher memory bandwidth. The 100W extra would result in 5-10% extra performance.

  • @BenState
    @BenState Před 2 lety

    TEC? Any work on improving those?

  • @DAVIDGREGORYKERR
    @DAVIDGREGORYKERR Před 2 lety

    That is not a problem at the Google server complex in the north of Finland

  • @shadowmist1246
    @shadowmist1246 Před 2 lety +1

    Thanks for the video. Immersion cooling is very effective but not practical. Direct-to-chip liquid cooling is the way to go, unless you want to keep putting in jet-engine fans, and these high-performance servers are already too loud.

  • @shahboy68
    @shahboy68 Před 2 lety

    Just when you thought you were getting a handle on cable management

  • @gustavgustaffson9553
    @gustavgustaffson9553 Před 2 lety +2

    I once built a custom oil cooled pc, was a fun project but way too messy.

  • @jianmei3422
    @jianmei3422 Před 2 lety

    Who is the coupler supplier? All-plastic couplers?

  • @martylynchian8628
    @martylynchian8628 Před 14 dny

    No you were correct the first time, a couple of feet.

  • @naamloos1337
    @naamloos1337 Před 2 lety +2

    I want server motherboards just lying around in my kitchen as well...

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +2

      Actually that is where the new video editors will likely work. The office has two of those desks. Bland without motherboards and cameras!

  • @OVERKILL_PINBALL
    @OVERKILL_PINBALL Před 2 lety +1

    And we thought that mineral oil fish tank build 20 years ago would never go anywhere...

  • @joealtona2532
    @joealtona2532 Před 2 lety

    When it leaks, is the whole rack doomed?

  • @abdullahseba4375
    @abdullahseba4375 Před 2 lety +1

    Great video as usual, but am I the only one who finds the background music annoying?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      There was a lot of background noise that the music is covering.

  • @freddybriffitt4734
    @freddybriffitt4734 Před 2 lety

    USystems now have rear doors capable of 200kW

  • @douglasborges3406
    @douglasborges3406 Před 2 lety +1

    What about Alphacool's solutions for server cooling?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      These are a few levels above something like what Alphacool sells as standard offerings.

    • @douglasborges3406
      @douglasborges3406 Před 2 lety

      @@ServeTheHomeVideo Will you test these solutions someday?

  • @markarca6360
    @markarca6360 Před 2 lety

    I am waiting for LTT's comments here.

  • @martylynchian8628
    @martylynchian8628 Před 14 dny

    Water and computers, what could go wrong?

  • @RickJohnson
    @RickJohnson Před 2 lety +2

    Intriguing. Datacenters need to start looking beyond the monetary cost of water though - especially in the Phoenix area where nearly all water is "imported" one way or another.
    The growing trend of condensing more kW per Rack U is alarming as well, and is the elephant in the room. What are chipmakers and ODMs doing to address things like current leakage and overall power consumption (performance per watt)? It's great to condense your data center footprint, but it'd be just as great to condense the waste heat as well. One thought is what crypto miners have already figured out, and that's undervolting (within stable limits), which sometimes increases performance while reducing waste heat at the same time.
    IOW, reducing heat at the source may pay more dividends than designing better cooling - especially when reducing thermal output can reduce your cooling footprint. The question remains whether the tradeoffs are worthwhile (X more racks to offset any performance difference vs Y fewer cooling towers/capacity/kW of heat).

    • @fat_pigeon
      @fat_pigeon Před 2 lety

      ??? - Data center power efficiency has been the "hot" topic for like a decade or more. Electricity is a high fraction of DC costs so there's a very strong incentive to buy the most efficient parts. There's a reason that server CPUs are clocked lower than corresponding workstation/consumer parts, and older-generation enterprise hardware is so cheap on the used market.

    • @RickJohnson
      @RickJohnson Před 2 lety

      @@fat_pigeon Looking at 250-300W per socket, and having multiple sockets possible per rack-U, even more efficient cores and density are translating to more power dense racks w/ higher amounts of waste heat. IOW, we make more and more efficient parts, only to crank them up to the same tried and true peak thermal limits, often at the expense of wasted power for no appreciable gain (like over-enrichening an air-fuel ratio for an engine at some point reduces performance)
      Undervolting by around 100mV in many cases reduces component heat by around 10°C w/o changing performance or cooling solutions. The numbers vary by sample.
      My thought was, other than crypto miners, who else has attempted to optimize CPU/GPU voltage at scale w/ the goal of reducing heat output?

    • @fat_pigeon
      @fat_pigeon Před 2 lety

      @@RickJohnson My point is that that's true for desktop parts, and *only* desktop parts. (Well, also workstation and high-powered laptops). For those, it makes sense because they normally run interactive workloads at low duty cycle, so performance is paramount and efficiency under load is less important. But everyone else *already* cares greatly about power usage, and those parts are *already* clocked lower for efficiency. Servers? They pay a lot for electricity. Mobile? They want to maximize battery life. This is part of why people were so impressed with the Apple M1, since for some workloads it gives desktop-class performance with a TDP of roughly 15 W.

    • @DFX4509B
      @DFX4509B Před 10 měsíci

      Running the hot outlet water from the CDU through a secondary chiller to recycle it back into the loop is one way to solve the water problem, and then that heat from the chiller could be reclaimed for the building's hot water as its condenser coil could go into a hot water tank for that purpose.
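
      On the undervolting point earlier in this thread: data-center GPUs do not usually expose voltage directly, but power capping is the closely related knob that is exposed, via the same nvidia-smi CLI as in the earlier sketch. A minimal sketch; the 350 W cap is an arbitrary illustrative value, and setting limits requires root:

          # Read the allowed power-limit range, then (as root) cap GPU 0 at 350 W.
          import subprocess

          print(subprocess.run(
              ["nvidia-smi", "--query-gpu=index,power.min_limit,power.max_limit",
               "--format=csv,noheader"],
              capture_output=True, text=True, check=True).stdout.strip())
          subprocess.run(["nvidia-smi", "-i", "0", "-pl", "350"], check=True)

      Capping power typically trades a few percent of peak clocks for a disproportionately larger drop in heat, which is the same performance-per-watt argument the comment makes for undervolting.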

  • @barbkennedy1587
    @barbkennedy1587 Před 2 lety +1

    Toxic fumes=burrito!

  • @twizz420
    @twizz420 Před 2 lety +1

    Should have also compared to Novec. It's not that expensive compared to the server itself.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety

      I tried to make this more general. Many different fluids are being used in these.

  • @ewanmurray153
    @ewanmurray153 Před 2 lety

    I’m pretty sure water pumps consume less electricity than a massive climate control system required for keeping the ambient temps low for air cooled racks

  • @petermichaelgreen
    @petermichaelgreen Před 2 lety

    Getting 100kW per tank but taking up what looks like 4+ times as much floorspace for a tank compared to a rack doesn't seem like a huge win to me.

  • @bigappleplug6021
    @bigappleplug6021 Před 2 lety +1

    Dammit
    Liquid cooled servers
    That just rendered my tripp lite rack ac obsolete
    Edit: proceeds to snip the negative wire on server fans to push them to max CFM

  • @nonsuch9301
    @nonsuch9301 Před 2 lety +1

    Wasn't Supermicro the company found shipping motherboards with undisclosed chips sending encrypted data back to China and providing a back door into the servers ?

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +11

      Every source in that Bloomberg article said publicly (including Tim Cook at Apple and Andy Jassy of Amazon) that the story was false. I did the follow-up interview for their second article by directly interviewing Bloomberg's source: www.servethehome.com/yossi-appleboum-disagrees-bloomberg-is-positioning-his-research-against-supermicro/ and their source was not happy with how Bloomberg twisted what they said.
      Hardware security is important, but I maintain the SEC should investigate Bloomberg on that story to see if it was sponsored by a hedge fund.

    • @nonsuch9301
      @nonsuch9301 Před 2 lety

      @@ServeTheHomeVideo Good to hear, thanks for the clarification.

  • @giovannigio6217
    @giovannigio6217 Před 2 lety

    500W TDP per chip is a bit too extreme. Air cooling is less efficient in terms of heat removal, but when you use it, you are using less power-hungry components that are more efficient in flops per watt. If NVIDIA releases a GPU with a 500W TDP, I think that is a step back towards a less efficient approach (imagine PC overclocking, where the heat goes up much more than the gained performance and you need much more efficient cooling, etc.). Conclusion: increasing the efficiency of the cooling solutions in this case means a decrease in the efficiency of the hardware, and we should better balance these two efficiency aspects (hardware and cooling). Maybe the solution is not releasing a 500W TDP part in the first place, but developing a better GPU in terms of silicon efficiency and architecture that delivers the same performance at a 400W TDP, even if that means waiting for a next-gen product.

  • @samlebon9884
    @samlebon9884 Před 2 lety +1

    Now I have options to cool my hot girlfriend. Thanks.

  • @jamesowens7148
    @jamesowens7148 Před 2 lety +1

    Hey guys. I don't have any money.

  • @madmotorcyclist
    @madmotorcyclist Před 2 lety

    Unnecessary now that ARM servers are coming online.

    • @fat_pigeon
      @fat_pigeon Před 2 lety

      Why? Server-class ARM CPUs have many cores so they have high TDPs as well.

  • @michaeltaylor5574
    @michaeltaylor5574 Před rokem

    You never showed anything you even needed to be there for. Bait and switch for yapping.

  • @bentheguru4986
    @bentheguru4986 Před 2 lety

    I think you are drifting too far from what the name was/is: Serve The Home - not much is "home" anymore, and most of it is high-end enterprise.

    • @ServeTheHomeVideo
      @ServeTheHomeVideo  Před 2 lety +2

      The flip side is that we have a large website and are the place that does all of this higher-end gear after over 12 years of evolving. Somewhat like how the Wall Street Journal reports on events outside of a street in NYC. www.servethehome.com/sth-turns-10/

    • @axiom1650
      @axiom1650 Před 2 lety

      STH is a great place though if you do interact with this kind of gear; probably more people do than you'd think.

    • @slithery9291
      @slithery9291 Před 2 lety +1

      A quote from the website...
      'STH may say “home” in the title which confuses some. Home is actually a reference to a users /home/ directory in Linux. We scale from those looking to have in home or office servers all the way up to some of the largest hosting organizations in the world.'

  • @quintaeco
    @quintaeco Před 2 lety

    Holy sweet Jesus, 15 minutes into the video and no idea what systems he is talking about!

  • @user-cx1ko1jn5i
    @user-cx1ko1jn5i Před 3 měsíci

    nonsense