
The Paradox of AI: Enlightened Machines, Resources Strained

Artificial intelligence (AI) has become ubiquitous, and it is fast becoming indispensable across many sectors because of how much easier it makes everyday tasks. But behind the shiny surface lies a hidden, energy-guzzling monster. In the blink of an eye, large language models (LLMs) can generate captivating images and videos, produce analytical research, and answer the most difficult questions. This technological marvel, however, requires a colossal amount of energy.

AI CONVERSATIONS BURN BRIGHTER THAN A SMALL TOWN


A recent study has unearthed one of the most inconvenient truths, which the technology giants rarely share: ChatGPT, developed by OpenAI, consumes as much electricity every day as 17,000 average U.S. households. The popular chatbot uses about half a million kilowatt-hours daily, enough to power a small town. According to a report in The New Yorker, a single conversation with ChatGPT consumes roughly the same amount of energy as leaving eight 60-watt bulbs on for an hour.
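For readers who want to check the arithmetic, the figures above are easy to sanity-check. The quick calculation below uses only the numbers reported in this article (not independently verified):

```python
# Back-of-envelope check of the reported ChatGPT electricity figures.
# All input values come from the article itself.

DAILY_USAGE_KWH = 500_000   # reported daily electricity use of ChatGPT
HOUSEHOLDS = 17_000         # households it could reportedly power each day

# Implied per-household daily consumption
per_household_kwh = DAILY_USAGE_KWH / HOUSEHOLDS
print(f"{per_household_kwh:.1f} kWh per household per day")  # ~29.4 kWh

# Energy of one conversation: eight 60 W bulbs left on for one hour
bulbs, watts, hours = 8, 60, 1
conversation_kwh = bulbs * watts * hours / 1000
print(f"{conversation_kwh:.2f} kWh per conversation")  # 0.48 kWh
```

The implied figure of roughly 29 kWh per household per day is in line with typical U.S. residential averages, which suggests the two reported numbers are at least internally consistent.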

Big technology companies have long used huge amounts of power to process and store data. In 2021, Google used 12.4 terawatt-hours of energy, Microsoft 10.8 terawatt-hours, and Amazon 13.6 terawatt-hours. To put this into perspective, the city of Mumbai, with a population of over 21 million, consumed around 24 terawatt-hours of electricity in 2019-20. Amazon alone, in other words, consumed more than half the energy used by the entire city of Mumbai in a year.
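The Mumbai comparison checks out with a little arithmetic, again using only the figures quoted in the article:

```python
# Company energy use (TWh, 2021) and Mumbai's consumption (TWh, 2019-20),
# as reported in the article.
google_twh, microsoft_twh, amazon_twh = 12.4, 10.8, 13.6
mumbai_twh = 24.0

# Amazon alone vs. Mumbai's annual consumption
share = amazon_twh / mumbai_twh
print(f"Amazon / Mumbai: {share:.0%}")  # ~57%, i.e. more than half

# Taken together, the three companies exceed Mumbai's total
combined = google_twh + microsoft_twh + amazon_twh
print(f"Combined: {combined:.1f} TWh")  # 36.8 TWh, over 1.5x Mumbai
```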

AI'S 'UNQUENCHABLE' THIRST FOR WATER

The environmental impact of AI extends beyond electricity consumption. Building and training these LLMs requires processing massive amounts of data, which generates immense heat that needs constant cooling. Data centers rely heavily on water-based cooling systems, pumping vast quantities of water to keep servers at safe operating temperatures. Microsoft, for instance, disclosed a staggering 34% increase in its global water consumption from 2021 to 2022, reaching nearly 1.7 billion gallons. That is enough to fill over 2,500 Olympic-sized swimming pools.
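The swimming-pool comparison can be verified the same way. The pool volume here is an assumption (an Olympic pool holds roughly 2,500 cubic meters, about 660,000 U.S. gallons); the water figure comes from the article:

```python
# Microsoft's reported 2022 water use, converted to Olympic pools.
water_gallons = 1.7e9        # reported in the article
pool_gallons = 660_000       # assumed volume of one Olympic-sized pool

pools = water_gallons / pool_gallons
print(f"~{pools:,.0f} Olympic-sized pools")  # ~2,576, i.e. "over 2,500"

# Implied prior-year (2021) usage, given the reported 34% increase
prior_year = water_gallons / 1.34
print(f"~{prior_year / 1e9:.2f} billion gallons in 2021")  # ~1.27 billion
```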

The energy and water consumption of AI poses a significant threat to the environment. Reliance on fossil fuels to power data centers contributes to greenhouse gas emissions, accelerating climate change. The global market for AI technologies is vast, amounting to around 200 billion U.S. dollars in 2023, and is expected to grow to over 1.8 trillion U.S. dollars by 2030. It is safe to assume that energy and water usage will rise steeply along with it.

BALANCING INNOVATION AND ENVIRONMENTAL RESPONSIBILITIES

The need of the hour is for these tech giants to turn their attention to more sustainable approaches. A study by Microsoft found that underwater data centers are practical and can even be cost-effective. Announcing the results of the study in 2020, Microsoft said its underwater data center had just one-eighth the failure rate of a land-based data center, a dramatic improvement. Researchers and tech companies are also actively exploring ways to make AI more energy-efficient. The future of AI hinges on striking a balance between innovation and sustainability, ensuring that technological advancements do not come at the cost of our planet's well-being.
