The AI revolution is underway, with chatbots leading the charge. ChatGPT has become a staple in many programmers’ toolkits, while Google Bard places AI-generated answers at the top of Google searches. AI chatbots are now commonplace, and fast becoming unavoidable.
Meta, Facebook’s parent company, is set to unveil its AI experiences, which include a variety of AI-powered chatbots. Unlike ChatGPT, which is hosted on a dedicated website, Meta’s AI can be accessed through any Meta app. These apps include WhatsApp, the most popular texting app, along with Facebook and Instagram.
How Sustainable Is ChatGPT?
Since its release, ChatGPT has faced significant criticism. The interaction of generative AI with copyright laws remains unclear, adding to concerns about potential misuse – such as the instance when ChatGPT provided instructions on creating a bomb to a journalist.
A less-explored yet equally perilous aspect of ChatGPT is its sustainability, or rather its lack thereof. Despite uncertainties surrounding the platform’s carbon footprint, estimates suggest a troubling scenario. According to a study by Stanford University, training GPT‑3 generated roughly 500 metric tons of carbon dioxide.
The environmental impact of GPT‑4 is still uncertain, but it is reasonable to assume it exceeds its predecessor’s. To outperform GPT‑3, GPT‑4 likely had to grow well beyond GPT‑3’s 175 billion parameters. Without advances in energy efficiency and optimized data storage, ChatGPT’s environmental footprint is poised to increase.
ChatGPT and Water Consumption
The GPT servers run hot, as servers inevitably do. But because large language models depend on enormous amounts of computation, AI servers run especially hot, and cooling them takes water. Estimates say that training GPT‑3 consumed about 700,000 liters of water. That is certainly a large volume, but how does it compare to other industries?
Training a language model like GPT-3 consumes enough water to fill a nuclear reactor, or enough to produce over 300 cars, a notoriously high-polluting process. In simpler terms, for every five questions ChatGPT answers, its servers consume roughly 500 ml of water.
That might not seem like much, but ChatGPT’s active users are estimated at over one hundred million. While OpenAI hasn’t published any data on the number of daily queries it receives, low estimates are still in the hundreds of thousands.
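Those figures allow a rough back-of-envelope estimate. The sketch below uses the 500 ml per five queries ratio cited above; the daily query count is a purely illustrative assumption, since OpenAI publishes no such data:

```python
# Back-of-envelope estimate of ChatGPT's daily cooling-water use,
# based on the figure cited above: ~500 ml of water per 5 queries.

ML_PER_FIVE_QUERIES = 500                 # estimate cited in the article
ml_per_query = ML_PER_FIVE_QUERIES / 5    # 100 ml per answered question

# Hypothetical figure for illustration only; real numbers are unpublished.
assumed_daily_queries = 10_000_000

daily_litres = assumed_daily_queries * ml_per_query / 1000
print(f"~{daily_litres:,.0f} litres of water per day")  # ~1,000,000 litres
```

Even at a far lower query volume, the daily totals add up quickly once a user base of this size is involved.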
How Sustainable Are Meta AI Experiences?
Compared with ChatGPT and Google’s Gopher, Meta’s model has a relatively modest carbon footprint of 70 metric tons of carbon dioxide. Notably, all these models are of similar size.
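Setting that figure next to the GPT‑3 estimate cited earlier makes the gap concrete. Both numbers come from this article; the comparison itself is just a quick illustration:

```python
# Compare the training footprints cited in this article (metric tons of CO2).
gpt3_co2_tons = 500   # Stanford estimate for training GPT-3, cited earlier
meta_co2_tons = 70    # figure cited here for Meta's model

ratio = meta_co2_tons / gpt3_co2_tons
print(f"Meta's footprint is {ratio:.0%} of GPT-3's")  # 14%
```

In other words, for models of comparable size, Meta’s reported training emissions are roughly a seventh of GPT‑3’s.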
While model size strongly influences an AI’s carbon footprint, it is also what drives the software’s capability. The key to a sustainable AI future, as Meta’s efforts suggest, appears to lie in reducing environmental impact without sacrificing that capability.
However, unlike ChatGPT, which has been publicly available for longer, Meta’s AI offers little track record, making its sustainability practices hard to assess. This lack of transparency extends to other AIs as well: although experts have estimated Google Bard’s water consumption in the hundreds of millions of liters, precise data remains elusive.
Can ChatGPT and Meta AI Become More Sustainable?
One of the difficulties in calculating the sustainability of generative AIs is that researchers don’t have enough data available. As noted by Stanford University, private industries are far ahead of academia and governments.
It is no surprise, then, that experts and researchers overwhelmingly take a cautious, skeptical view of AI.
If we’re just scaling without any regard to the environmental impacts, we can get ourselves into a situation where we are doing more harm than good with machine learning models.
But how can those technologies scale up while keeping sustainability in mind?
One solution, highlighted by researcher Peter Henderson, is to use smaller models when possible. And as with anything else, running those models on green energy is sure to diminish their environmental impact.
More than anything else, the researchers say, we must recognize this as an overwhelmingly complex problem, requiring multiple solutions from different actors.
No single company or government will solve this problem; we must pull together. Each company needs to find its superpower, based on its unique context and influence.