• LainTrain@lemmy.dbzer0.com · 1 month ago

      Nah, I don’t think anyone is implying that massive megacorps draining small Global South towns of water is a good thing, actually, apart from the boogeymen in your head. AI is good and does solve problems, but only when it’s an open-source model you run yourself for a particular purpose. It may be shocking, but data centres existed just the same before corpo-AI, and they were just as problematic for the climate then. Please touch grass and rid yourself of the degrowth internet brainrot.

  • davel@lemmy.ml · 1 month ago

    The only way I can imagine a data center “consuming” water is by evaporative cooling. Querétaro is dry country, where this should work well, so that makes sense.

    • WalnutLum@lemmy.ml · 1 month ago

      Most data centers use evaporative cooling, from what I understand, and according to this study:

      Cooling towers use water evaporation to reject heat from the data center, causing losses approximately equal to the latent heat of vaporization of water, along with some additional losses for drift and blowdown. In larger data centers this on-site water consumption can be significant, with data centers that have 15 MW of IT capacity consuming between 80 and 130 million gallons annually. In this study, on-site water consumption is estimated at 1.8 liters (0.46 gallons) per kWh of total data center site energy use for all data centers except for closet and room data centers, which are assumed to use direct expansion (air-cooled chillers).

      And seeing as hyperscale data centers usually draw between 20 and 50 megawatts each, and there are three of them in Colón, that’s at least 240 million gallons of water a year.
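
      A rough sanity check of that figure (a sketch, assuming the study’s 0.46 gal/kWh estimate and a continuous 20 MW load per site, the low end of that hyperscale range):

      ```python
      # Estimate annual cooling-water use for the three Colón sites.
      # Assumptions: 0.46 gallons per kWh of total site energy (from the
      # quoted study) and a constant 20 MW load per site, i.e. the low
      # end of the 20-50 MW hyperscale range mentioned above.

      HOURS_PER_YEAR = 8760
      GALLONS_PER_KWH = 0.46  # study's estimate (1.8 L/kWh)

      def annual_water_gallons(load_mw: float) -> float:
          """Annual evaporative-cooling water use for one data center."""
          kwh_per_year = load_mw * 1000 * HOURS_PER_YEAR
          return kwh_per_year * GALLONS_PER_KWH

      per_site = annual_water_gallons(20)  # ~80.6 million gallons
      total = 3 * per_site                 # ~242 million gallons
      print(f"per site: {per_site/1e6:.0f}M gal, all three: {total/1e6:.0f}M gal")
      ```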

      Yikes.