A recent study warns that by 2027 the artificial intelligence (AI) industry could consume as much energy as a country the size of the Netherlands.
Big tech companies have raced to add AI-powered services since ChatGPT emerged last year. These services demand far more energy than conventional applications, making online activity considerably more energy-intensive. Even so, AI's environmental impact may be less severe than feared if its current growth slows.
Research in this area is necessarily speculative because tech firms disclose little data, which makes precise predictions difficult. What is clear is that AI requires far more powerful hardware than conventional computing tasks.
A recent study by Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, assumes that certain parameters hold steady: AI's current growth rate, the availability of AI chips, and servers running at full capacity around the clock. It notes that chip designer Nvidia supplies around 95% of the AI processing equipment used by the industry. By estimating how many of these computers will be delivered by 2027, de Vries calculated that AI could consume between 85 and 134 terawatt-hours (TWh) of electricity a year, the upper end roughly matching the annual consumption of a small country such as the Netherlands.
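To illustrate how an estimate of this kind works, here is a back-of-envelope sketch; the server count and per-server power draw below are illustrative assumptions rather than figures taken from the study.

```python
# Back-of-envelope estimate of annual AI electricity use.
# The inputs are illustrative assumptions, not figures from de Vries's study.
servers = 1_500_000        # assumed number of AI servers in operation by 2027
power_kw = 6.5             # assumed draw per server, in kilowatts
hours_per_year = 24 * 365  # servers assumed to run at full capacity all year

annual_twh = servers * power_kw * hours_per_year / 1e9  # 1 TWh = 1bn kWh
print(f"Estimated consumption: {annual_twh:.0f} TWh per year")  # roughly 85 TWh
```

Under these assumptions the total lands near the low end of the 85-134 TWh range; higher server counts or heavier utilization push it toward the top.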
De Vries emphasized the need for judicious use of AI, reserving it for situations where it is genuinely necessary. His peer-reviewed study has been published in the journal Joule.
How much energy and water does artificial intelligence use?
AI systems such as the large language models behind widely used chatbots, including OpenAI's ChatGPT and Google's Bard, run in specialized facilities known as data centers. That means more energy consumption, and, as with conventional hardware, the equipment must be kept cool, which can be a water-intensive process.
The study did not include the energy needed for cooling, and many of the biggest tech companies do not disclose how much energy or water it requires. Mr. de Vries and others are calling for greater transparency from the industry on this point.
Nonetheless, demand for the computers that power AI is growing rapidly, and so is the energy needed to keep those server facilities cool.
Danny Quinn, the CEO of the Scottish data center company DataVita, noted a significant rise in inquiries about using his facility for AI equipment in 2023. He also highlighted the energy contrast between a rack housing regular servers and one hosting AI processors.
“A standard rack filled with conventional equipment uses around 4 kilowatts (kW) of power, akin to a typical family home. In contrast, a rack hosting AI equipment consumes about 20 times that, roughly 80 kW of power. And a single data center could accommodate hundreds, if not thousands, of these.”
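To put those figures in context, here is a quick sketch of the facility-level arithmetic; the rack count is hypothetical, chosen only to show the scale involved.

```python
# Facility-level comparison using the per-rack figures quoted above.
# The rack count is a hypothetical example, not a figure from DataVita.
conventional_rack_kw = 4   # standard rack, roughly one family home
ai_rack_kw = 80            # AI rack, about 20 times as much
racks = 500                # hypothetical mid-sized data center

print(f"Conventional racks: {conventional_rack_kw * racks / 1000:.0f} MW")  # 2 MW
print(f"AI racks:           {ai_rack_kw * racks / 1000:.0f} MW")            # 40 MW
```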
Inside DataVita’s Fortis data center in Scotland’s central belt.
He said Scotland’s cooler, wetter climate gives data centers a natural advantage in staying cool, but that temperature control remains a substantial challenge.
In its most recent sustainability report, Microsoft, a company heavily investing in AI development, disclosed a 34% increase in its water consumption between 2021 and 2022, reaching 6.4 million cubic meters, approximately equivalent to 2,500 Olympic swimming pools.
Professor Kate Crawford, the author of a book on AI’s environmental impact, expressed her concerns about this issue. Speaking to the BBC in July, she stated, “These energy-intensive systems require massive amounts of electricity and energy, as well as significant quantities of water to cool these immense AI supercomputers. So, we are truly witnessing an extensive extractive industry for the 21st Century.”
However, there are also hopes that AI could help solve some of the environmental problems facing the planet.
AI is already being used to try to reduce the condensation trails, or contrails, that aircraft leave in the sky.
Recent research by Google and American Airlines found that pilots could cut contrail formation by up to 50% by using an experimental AI tool to choose flight altitudes. Contrails are known to contribute to global warming.
The U.S. government is also putting substantial funding into nuclear fusion, the process by which the Sun generates energy. Success could transform our energy supply by providing a virtually unlimited source of clean power, and AI could help accelerate research that has progressed slowly since the 1960s.
In February of this year, academic Brian Spears said AI had been used to predict the outcome of an experiment that produced a significant breakthrough. “For 100 trillionths of a second, we produced ten petawatts of power. It was the brightest thing in the solar system,” he said.