ChatGPT’s resource needs soaring out of control

  • mayask
  • AI
  • September 21, 2024


Photo: panumas nikhomkhai / Pexels
It’s no secret that the growth of generative AI demands ever-increasing amounts of water and electricity. However, a new study by The Washington Post and researchers at the University of California, Riverside reveals just how much of both OpenAI’s chatbot requires to carry out even its most basic functions.
In terms of water consumption, the amount ChatGPT needs to compose a 100-word email depends on the state and the user’s proximity to OpenAI’s nearest data center. The scarcer water is in a given region, and the cheaper its electricity, the more likely the data center is to rely on electrically powered air conditioning units instead. In Texas, for instance, the chatbot consumes an estimated 235 milliliters to generate one 100-word email. In Washington, the same email demands 1,408 milliliters (nearly a liter and a half).
Data centers have grown larger and more densely packed with servers as generative AI has taken off, to the point that air-based cooling systems can no longer keep up. That’s why many AI data centers have switched to liquid cooling, pumping large volumes of water past the server racks to absorb thermal energy and then carrying it to a cooling tower, where the heat is dissipated.
ChatGPT’s electrical requirements are no small matter either. According to The Washington Post, using ChatGPT to write that 100-word email consumes enough electricity to power more than a dozen LED lightbulbs for an hour. If even a tenth of Americans used ChatGPT to write that email once a week for a year, the process would consume as much power as every household in Washington, D.C. (home to approximately 670,000 people) uses in 20 days.
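Those figures invite a quick back-of-the-envelope check. The short Python sketch below assumes roughly 0.14 kWh of energy per 100-word email and a 10-watt LED bulb; both numbers are assumptions for illustration, not figures stated in this article.

```python
# Back-of-the-envelope check of the electricity claims above.
# Assumed figures (not stated in this article): ~0.14 kWh per
# 100-word ChatGPT email and a 10-watt LED bulb.
EMAIL_KWH = 0.14      # energy per 100-word email (assumed)
LED_BULB_KW = 0.010   # power draw of one LED bulb (assumed)

# How many LED bulbs could run for an hour on one email's energy?
bulb_hours = EMAIL_KWH / LED_BULB_KW
print(f"One email ~= {bulb_hours:.0f} LED bulb-hours")

# Scale up: a tenth of Americans (~33.4 million people) sending
# one such email per week for a year.
users = 334_000_000 * 0.10
annual_kwh = EMAIL_KWH * users * 52
print(f"Annual total ~= {annual_kwh / 1e6:.0f} GWh")
```

Under these assumptions, one email works out to about 14 bulb-hours, in line with the “more than a dozen lightbulbs” figure, and the annual total lands in the hundreds of gigawatt-hours, the same scale as the D.C. comparison.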
This is not an issue that will be resolved any time soon, and it is likely to get worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models. Court records show that Google’s data centers in The Dalles, Oregon, consume nearly a quarter of all the water available in the town. Meanwhile, xAI’s new Memphis supercluster is already demanding 150 MW of electricity from the local utility, Memphis Light, Gas and Water, enough to power as many as 30,000 homes.
