Unveiling the Energy Impacts of Machine Learning and AI.

We all know about machine learning and artificial intelligence, right? But how many of us have actually looked into them properly? I suspect most of us would say no, simply because we don't have the time.

While working on projects and reading various research papers, it becomes evident that researchers at universities and at companies like Google DeepMind, Microsoft Azure, Meta's research team, and Amazon Web Services are building production-level applications that we use regularly in our daily lives. We understand that they are creating something helpful, and we keep using it. But do we truly understand the potential impacts these systems may have on us? Usually not.

Let me share something I’ve explored. When we build a machine learning or deep learning model for a production-level project, we need a highly configured machine; otherwise the project won’t run properly, or the system may even crash. In a typical machine learning project, we first apply algorithms based on the project requirements. Then comes hyperparameter tuning, one of the most important steps for pushing a model toward its best accuracy. Normally we tune around 8–15 parameters, but once a dataset grows past one or two million rows, the search has to cover more and more parameter combinations. Each combination means another full training run, which draws more and more energy from the hardware we are working on. If you are a tech enthusiast familiar with PC components, you can see where this leads: all of that computation translates into electricity use, and that electricity use has an environmental impact.
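To see why tuning gets expensive so quickly, here is a minimal sketch of an exhaustive grid search. The parameter names and value ranges below are illustrative assumptions, not taken from any specific project; the point is only that the number of training runs multiplies with every parameter added.

```python
from itertools import product

# Hypothetical hyperparameter grid for a tree-based model.
# All names and ranges here are illustrative assumptions.
param_grid = {
    "learning_rate": [0.01, 0.05, 0.1],
    "max_depth": [3, 5, 7, 9],
    "n_estimators": [100, 300, 500],
    "subsample": [0.6, 0.8, 1.0],
}

# An exhaustive grid search trains one model per combination,
# so the energy cost scales with the product of the grid sizes.
combinations = list(product(*param_grid.values()))
print(len(combinations))  # 3 * 4 * 3 * 3 = 108 full training runs
```

With just four parameters and a handful of values each, you are already at 108 training runs; on a multi-million-row dataset, each run can take hours on a GPU, which is where the energy bill comes from.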

According to one survey, the average US household uses around 29 kWh of electricity daily. Services like ChatGPT, Midjourney, the OpenAI Playground, Azure AI, Amazon SageMaker, and many more are estimated to consume more than 17,000 times that amount every day. Can you imagine? A workload drawing that much electricity needs purpose-built infrastructure; an ordinary machine would simply crash under the load.

If Google integrated generative AI technology into every search, it would drain about 29 billion kilowatt-hours a year, according to calculations by Alex de Vries, a data scientist at the Dutch National Bank, published in the sustainable-energy journal Joule. That is more electricity than countries like Kenya, Guatemala, and Croatia consume in a year, according to The New Yorker.
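To get an intuition for that annual figure, we can work backwards to the energy per search. The 29 billion kWh number is from the article; the daily search volume of roughly 9 billion is an assumption on my part (a commonly cited order of magnitude, not from the article):

```python
annual_kwh = 29e9        # de Vries's estimate for generative AI in every Google search
searches_per_day = 9e9   # assumed daily Google search volume (order-of-magnitude guess)

# Convert to watt-hours per individual search.
wh_per_search = annual_kwh * 1000 / (searches_per_day * 365)
print(round(wh_per_search, 1))  # roughly 8.8 Wh per AI-assisted search
```

A few watt-hours per query sounds trivial, but multiplied by billions of daily searches it adds up to a mid-sized country's electricity consumption.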

De Vries estimated in the same paper that by 2027 the entire AI sector will consume between 85 and 134 terawatt-hours annually. “You’re talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027,” de Vries told The Verge. “I think that’s a pretty significant number.”
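The “half a percent” quote can be sanity-checked against the projection. The 85–134 TWh range is from the article; the global electricity consumption figure of roughly 25,000 TWh per year is my assumption for the check:

```python
ai_twh_low, ai_twh_high = 85, 134   # de Vries's 2027 projection (from the article)
global_twh = 25_000                 # assumed global annual electricity use, ~25,000 TWh

share_high = ai_twh_high / global_twh * 100  # upper end as a percentage
print(round(share_high, 2))  # about 0.54 percent, matching the "half a percent" quote
```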
