Most of us don't think much about the electricity usage of our digital devices and the work we do on them, but every keystroke or internet search consumes power. Until now, most tasks we accomplish with our phones and computers have been quite economical in terms of power consumption. However, that is changing with the introduction of generative AI.
Generative AI, which is covered in more depth in this article, is a relatively new kind of software application that can create digital content such as text and imagery. Creating these powerful systems requires a great deal of electricity, and using them is also more energy intensive than traditional forms of computing.
The first step in creating one of these systems is training. During this process, large quantities of data are analyzed to improve the model's ability to detect and predict patterns. The latest models require data sets that encompass significant portions of the internet. Processing this much information requires thousands of the world's most powerful chips working together in huge data centers. It is estimated that training the AI model behind the first version of ChatGPT, launched in November 2022, used roughly the same amount of electricity as 130 average U.S. households consume in a year.
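To see where a figure like that comes from, here is the back-of-the-envelope arithmetic. The two inputs below are rough, commonly cited estimates rather than exact measurements: about 1,300 megawatt-hours for the training run and about 10,500 kilowatt-hours of electricity per U.S. household per year.

```python
# Rough sanity check on the "130 households" comparison.
# Both inputs are approximate, commonly cited estimates.
TRAINING_KWH = 1_300_000         # ~1,300 MWh estimated for the training run
HOUSEHOLD_KWH_PER_YEAR = 10_500  # approximate annual U.S. household electricity use

# ~124 household-years, in the same ballpark as the figure cited above
print(TRAINING_KWH / HOUSEHOLD_KWH_PER_YEAR)
```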
To push these systems to improve, companies are increasing both the size of the training datasets and the amount of computation used to process the data. Each new generation of models is much larger than the last and requires far more computing power to create.
Moreover, the energy consumption doesn't stop at the training phase. Once deployed, these models continue to consume significant amounts of electricity to generate responses, perform tasks, or maintain availability for user queries. The operational energy usage of these AI systems can be substantial, especially as they scale to meet growing demands.
For example, a typical search engine query requires about 0.3 watt-hours of electricity, while a generative AI prompt requires roughly ten times as much energy to produce an output. To put this into perspective, consider a common household appliance like a refrigerator. A typical refrigerator uses about 1 to 2 kilowatt-hours (kWh) of electricity per day. Let's take an average value of 1.5 kWh for our calculations.
An online search uses about 0.3 watt-hours (Wh) of electricity.
A generative AI prompt uses about 3 Wh of electricity.
Therefore, a day of refrigerator use is equivalent to approximately 5,000 online searches or 500 generative AI prompts.
While a single generative AI prompt's electricity use seems minuscule, when multiplied by billions of users and countless daily interactions, the cumulative effect becomes significant. For instance, if a billion people each made just one generative AI query per day, the total electricity consumption would be equivalent to running roughly 2 million refrigerators continuously.
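For anyone who wants to check the arithmetic, the comparison works out as follows. The 0.3 Wh, 3 Wh, and 1.5 kWh figures are the rough estimates used above, not precise measurements.

```python
# Rough energy figures used in this article (all approximate).
SEARCH_WH = 0.3           # one conventional search engine query, in watt-hours
AI_PROMPT_WH = 3.0        # one generative AI prompt, in watt-hours
FRIDGE_WH_PER_DAY = 1500  # a typical refrigerator: 1.5 kWh per day

# A day of refrigerator use expressed in searches and prompts.
print(FRIDGE_WH_PER_DAY / SEARCH_WH)     # 5,000 searches
print(FRIDGE_WH_PER_DAY / AI_PROMPT_WH)  # 500 AI prompts

# Scale up: one prompt per day from a billion people.
daily_wh = 1_000_000_000 * AI_PROMPT_WH  # 3 billion Wh per day
print(daily_wh / FRIDGE_WH_PER_DAY)      # ~2 million refrigerator-days of electricity
```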
One way to approach this question is to look at the energy consumption of data centers, the massive warehouses containing the hardware that runs our modern digital infrastructure. A typical data center is a large facility designed to house a vast array of computer servers, storage devices, and networking equipment, along with the redundant power, cooling, and security systems that provide a controlled environment and ensure the optimal performance and reliability of the hardware.
Data centers are critical hubs of our digital world, consuming vast amounts of electricity to support the ever-growing demands of AI and other digital services. As of now, there are thousands of data centers spread across the globe, with major hubs in the United States, Europe, and Asia. These data centers range from small server rooms to massive facilities covering millions of square feet.
According to industry reports, the number of data centers is expected to grow substantially in the coming years. This growth is driven by the escalating demand for cloud services, data storage, and computational power required to support emerging technologies like generative AI. Estimates suggest that the global data center market could expand by several hundred new facilities annually, each consuming large amounts of electricity and contributing to the overall carbon footprint.
In 2021, data centers accounted for between 0.9% and 1.3% of global electricity consumption, a figure expected to grow to 1.86% by 2030. Many leading data center operators have committed to powering their operations entirely from renewable sources within a few years. This is an important effort given our rapidly increasing reliance on data centers as a critical part of our infrastructure.
While generative AI systems currently represent only a small fraction of data center workloads, they require significant computing power and electricity. Given the likelihood that generative AI features will be integrated into many common digital resources, it is probable that future data center creation and energy consumption will largely be driven by the development and deployment of increasingly large and powerful generative AI systems.
Like many things in life, it depends. Imagine a teacher needs to create a multiple-choice exam for a chapter her students just covered. Using a generative AI tool, the teacher can create the quiz in seconds and then spend a few minutes reviewing it to ensure everything is accurate and appropriate.
Generative AI is an excellent choice in this situation because the teacher can verify the accuracy of the information. The quiz format is simple for the AI to generate, and the teacher saves time by not having to come up with questions and incorrect answers. What might have taken several hours now takes just a few minutes.
Despite generative AI being an energy-intensive form of computing, using it in the right situations can be quite beneficial. According to a study published in Scientific Reports, using generative AI to create text or images can produce 130 to 2,900 times less CO2 than humans doing the same tasks. If that same teacher were to sit in her office for hours writing a quiz with the lights and air conditioner running the whole time, it could be argued that using AI is the better and more environmentally sound option.
The inverse is also possible. Let's say you need to write a thank-you note and you spend several minutes asking an AI model for dozens of different versions, none of which you end up using. In this case, a significant amount of energy was expended for little value, and it may have been more effective and environmentally friendly to simply sit down with a pen, think for a moment, and write something from the heart.
Just as there's no definite rule for deciding whether to walk or drive somewhere, there's no clear way to know when to use generative AI. The most important thing is to consider the energy these tools consume and to develop a sense of when using them is both efficient and effective.
While generative AI offers remarkable capabilities and efficiencies, it also comes with substantial environmental costs. The electricity required to train and operate these advanced models is significant, leading to increased carbon emissions and energy consumption. Data centers, which are the backbone of modern AI infrastructure, contribute to a growing carbon footprint that necessitates the adoption of more sustainable practices. As generative AI continues to integrate into various digital resources, it is crucial to balance its benefits with its environmental impact.
However, the potential for generative AI to optimize tasks and reduce human labor cannot be overlooked. When used judiciously, generative AI can be a powerful tool for saving time and resources. The key lies in discerning the appropriate contexts for its use, ensuring that its deployment maximizes efficiency and minimizes unnecessary energy expenditure. As we move forward, it will be essential to develop best practices for utilizing generative AI in ways that are both environmentally and operationally sustainable.