- AI chatbots and video generators consume enormous amounts of energy and water
- Generating a five-second AI video uses as much energy as running a microwave for an hour or more
- Data center energy consumption has doubled since 2017, and AI is projected to account for about half of it by 2028
It only takes a few minutes in a microwave to explode a potato you forgot to vent, but it takes as much energy as running that microwave for more than an hour, enough for more than a dozen potato explosions, for an AI model to generate a five-second video of a potato exploding.
A new study by MIT Technology Review has established just how energy-hungry AI models are. A basic chatbot response can use as little as 114 joules or as much as 6,700 joules, the equivalent of between half a second and eight seconds in a standard microwave. But it is when things become multimodal that energy costs soar: a five-second video costs more than an hour in the microwave, or 3.4 million joules.
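The microwave comparisons above come down to simple unit arithmetic: joules divided by watts gives seconds of runtime. A minimal sketch, assuming a typical ~800 W microwave (the article does not state the wattage it used, so that figure is illustrative):

```python
# Convert AI energy costs (in joules) into equivalent microwave runtime.
# The energy figures are from the MIT Technology Review study cited above;
# the 800 W microwave rating is an assumption for illustration.

MICROWAVE_WATTS = 800  # assumed power draw in watts (1 W = 1 J/s)

def microwave_seconds(joules: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave of the given wattage runs on this much energy."""
    return joules / watts

# A basic chatbot response: 114 to 6,700 joules
print(f"{microwave_seconds(114):.2f} s")    # a fraction of a second
print(f"{microwave_seconds(6_700):.1f} s")  # several seconds

# A five-second AI-generated video: 3.4 million joules
print(f"{microwave_seconds(3_400_000) / 3600:.2f} h")  # over an hour
```

At 800 W the chatbot range works out to roughly 0.1 to 8 seconds, and the video to about 1.2 hours, in line with the article's comparisons.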
It is not a new revelation that AI is energy-intensive, but the MIT work lays out the math in stark terms. The researchers modeled what a typical session with an AI chatbot might look like: asking 15 questions, requesting 10 AI-generated images, and generating three different five-second videos.
You can watch a realistic fantasy film scene that looks as if it was shot in your backyard one minute after asking for it, but you will not notice the enormous amount of electricity you just asked to produce it: about 2.9 kilowatt-hours, the equivalent of three and a half hours of microwave time.
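The session total can be checked the same way: convert 2.9 kWh to joules, then to microwave runtime. A sketch, again assuming a ~800 W microwave (this article's illustration, not a figure from the study):

```python
# Check the "three and a half hours of microwave" claim for a 2.9 kWh session.
SESSION_KWH = 2.9           # 15 questions + 10 images + 3 five-second videos
JOULES_PER_KWH = 3_600_000  # 1 kWh = 3.6 million joules
MICROWAVE_WATTS = 800       # assumed typical microwave power

session_joules = SESSION_KWH * JOULES_PER_KWH       # 10.44 million joules
hours = session_joules / MICROWAVE_WATTS / 3600
print(f"{hours:.1f} hours of microwave time")       # prints "3.6 hours of microwave time"
```

At the assumed wattage this comes out to about 3.6 hours, roughly matching the article's three and a half.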
What sets AI's costs apart is how painless they are from the user's point of view. You are not billed per AI message the way we all were for text messages 20 years ago.
AI Energy Rethink
Of course, you are not mining Bitcoin, and your video has at least some real value, but that is a very low bar to clear as far as ethical energy consumption goes. The surge in energy demand from data centers is also happening at a breakneck pace.
Data centers had plateaued in their energy consumption before the recent AI explosion, thanks to efficiency gains. Since 2017, however, the energy consumed by data centers has doubled, and about half of it will go to AI by 2028, according to the report.
This is not a guilt trip, by the way. I can claim professional reasons for part of my AI use, but I have used it for all kinds of recreational fun and for help with personal tasks. I would write an apology note to the people working in data centers, but I would need AI to translate it into the languages spoken where some of those data centers are located. And I don't want to come across as heated, or at least not as heated as those same servers. Some of the largest data centers use millions of gallons of water a day to stay cool.
The developers behind AI infrastructure understand what is going on. Some are trying to find cleaner energy options; Microsoft is looking to strike deals with nuclear power plants. AI may or may not be an integral part of our future, but I would rather that future not be full of overheating extension cords and boiling rivers.
On an individual level, your use or avoidance of AI will not make much difference, but encouraging better energy solutions from data center owners might. The most optimistic outcome is the development of more energy-efficient chips, better cooling systems and greener energy sources. And maybe AI's carbon footprint should be discussed like any other energy infrastructure, such as transportation or food systems. If we are willing to debate the sustainability of almond milk, we can surely spare a thought for the 3.4 million joules it takes to make a five-second video of a dancing cartoon almond.
As tools like ChatGPT, Gemini and Claude become smarter, faster and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we will end up trying to cool a supercomputer with a paper fan while chewing on a raw potato.