• Sl00k@programming.dev · 2 days ago

    I see “AI is using up massive amounts of water” proclaimed everywhere lately, but I don’t understand it. Do you have a source?

    My understanding is that this probably stems from people misunderstanding data center cooling systems. Most of these systems are closed loop, so the water is reused. It makes no sense to “burn off” water for cooling.

    • lime!@feddit.nu · 2 days ago

      data centers are mainly air-cooled, and two innovations contribute to the water waste.

      the first one was “free cooling”: instead of running a heat-exchanger loop, you blow (filtered) outside air directly over the servers and out again, so you don’t have to “get rid of” waste heat, you just blow it right out.

      the second one was raising the moisture content of the incoming air with what are basically giant carburettors in the air stream. evaporating water into the air cools it, and the wetter the air, the more heat it can carry away from the servers.

      so basically we now have data centers designed like cloud machines.
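      To put a rough number on the evaporative part: absorbing heat as latent heat of vaporization tells you how much water a fully evaporative system would consume. This is a back-of-envelope sketch, not a model of any real data center; the simplifying assumption is that all waste heat is rejected by evaporation.

```python
# Back-of-envelope: water evaporated to absorb 1 MWh of server heat,
# ASSUMING (hypothetically) all heat is rejected as latent heat.
LATENT_HEAT_VAPORIZATION = 2.26e6  # J/kg for water (roughly; varies with temperature)

def water_evaporated_kg(heat_joules: float) -> float:
    """Mass of water that must evaporate to absorb the given heat."""
    return heat_joules / LATENT_HEAT_VAPORIZATION

heat = 1_000_000 * 3600  # 1 MWh of waste heat, in joules
print(f"{water_evaporated_kg(heat):.0f} kg")  # ~1593 kg, i.e. ~1.6 m³ per MWh
```

      Real systems evaporate less than this per MWh (some heat leaves as warm dry air), but it shows why evaporative "free cooling" consumes water instead of recirculating it.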

      Edit: Also, apparently the water they use becomes contaminated and they use mainly potable water. here’s a paper on it

      • Aceticon@lemmy.dbzer0.com · 2 days ago

        Also, the energy for those datacenters has to come from somewhere, and non-renewable options (gas, oil, nuclear) also use a lot of water, both in the generation process itself (they all rely on using the fuel to raise steam that drives the turbines which generate the electricity) and for cooling.
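        A rough sketch of the cooling side of that: for a thermal plant with an evaporative cooling tower, waste heat per kWh follows from the plant's efficiency, and latent heat gives an upper bound on evaporated water. The 35% efficiency figure here is an assumption for illustration, not a sourced number.

```python
# Rough upper bound (assumed numbers): cooling water evaporated per kWh
# of electricity from a thermal plant with an evaporative cooling tower.
EFFICIENCY = 0.35      # ASSUMED thermal efficiency of the plant
LATENT_HEAT = 2.26e6   # J/kg to evaporate water

def cooling_water_l_per_kwh(efficiency: float = EFFICIENCY) -> float:
    electric_j = 3.6e6                              # 1 kWh in joules
    rejected_j = electric_j * (1 / efficiency - 1)  # waste heat sent to the tower
    return rejected_j / LATENT_HEAT                 # kg of water ≈ litres

print(f"{cooling_water_l_per_kwh():.1f} L/kWh")  # ~3.0 L evaporated per kWh
```

        Real towers also shed some heat as warm air, so actual consumption is lower, but the order of magnitude (litres per kWh) is the point.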

        • lime!@feddit.nu · 2 days ago

          steam that runs the turbines tends to be recirculated. that’s already covered in the paper.