Artificial Intelligence Is Enormously Energy-Intensive and Not Very Sustainable

A hand reaching to touch the screen of a laptop displaying icons of a microchip, a power plant, a robotic arm, and others linked together to simulate the Internet of Things.
Nowadays, thousands of companies are focusing on the incorporation of artificial intelligence (AI) and machine learning in their operations. (Image: Wrightstudio via Dreamstime)

Nowadays, thousands of companies are working to incorporate artificial intelligence (AI) and machine learning into their operations. Artificial intelligence is no longer an obscure tech buzzword beloved only by seasoned geeks. It is already being used by businesses in various niches, including corporate giants and tech icons.

AI-powered bots handle customer queries in mobile apps, while manufacturing is being automated with AI-powered tools. The first generation of AI-powered devices and home appliances is already on the market, and those with deep pockets can buy them with ease. However, when it comes to artificial intelligence, not everything is roses and sunshine.

Running AI applications and AI-powered machinery requires a significant amount of energy. In other words, using artificial intelligence can increase the planet's carbon footprint. That is troubling at a time when scientists are concerned about carbon emissions and their long-term effect on the climate.

Artificial intelligence uses massive amounts of energy

Recently, a team of researchers in San Francisco used artificial intelligence to control a robotic hand that manipulated Rubik's Cube pieces through trial and error. This required computing power equal to that of 1,000 desktop computers, along with a dozen machines with specialized graphics chips. There is no denying that AI technology is evolving quickly, and tools and robotic apparatus utilizing artificial intelligence are achieving feats worth applauding. However, these advances demand a mammoth amount of electricity and computing power.

Using artificial intelligence to control a robotic hand that manipulated Rubik’s Cube required computing power equal to that of 1,000 desktop computers plus a dozen machines with specialized graphics chips. (Image: Mohd Hafiez Mohd Razali via Dreamstime)

Artificial intelligence experts are aware of the issue. Sasha Luccioni, a postdoctoral researcher at the Mila AI research institute, says: “The concern is that machine-learning algorithms, in general, are consuming more and more energy, using more data, and training for longer and longer.” (Wired)

The good news is that some AI researchers are taking steps to bring down the field's carbon footprint. They are using specialized tools to track the energy usage of their algorithms. Luccioni recently developed a website that helps AI researchers get a rough estimate of the carbon footprint of their algorithms.

The truth is that the energy required to run cutting-edge artificial intelligence has been rising for some time. Luccioni thinks it is not only about making AI researchers aware of the issue; the corporate and tech giants deploying AI applications and tools also need to perform reality checks. The computing power used in landmark AI projects has doubled roughly every 3.4 months, and between 2012 and 2018 it shot up 300,000-fold.
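A quick back-of-the-envelope calculation shows how the two figures above relate (this is illustrative arithmetic only, not part of the cited research):

```python
import math

# Illustrative arithmetic: how do a 300,000-fold increase and a
# 3.4-month doubling time fit together?
growth = 300_000          # reported increase in training compute
doubling_months = 3.4     # reported doubling time

doublings = math.log2(growth)         # ~18.2 doublings
months = doublings * doubling_months  # ~62 months, i.e. roughly 5 years
print(f"{doublings:.1f} doublings -> about {months / 12:.1f} years")
```

At that pace, roughly 18 doublings fit into the 2012-2018 window, which is why the growth figure is so enormous.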

To train a robust machine-learning algorithm, researchers have to run huge banks of computers, and the tweaking needed to polish an algorithm is computationally intensive. It is hard to assess the energy used by such AI applications. Worldwide, data centers use 200 terawatt-hours of energy each year. A study conducted by Google and University of California scientists offers some insight into the actual carbon footprint of such systems: GPT-3, developed by OpenAI, a San Francisco-based AI research and development company, generated 552 metric tons of CO2 during its training, roughly what 120 passenger cars emit in a year. This figure will only rise if more countries adopt AI applications aggressively in the future.
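The car comparison above can be checked with simple division; the per-car figure it implies (about 4.6 metric tons of CO2 per year) is derived here, not stated in the article:

```python
# Back-of-the-envelope check of the GPT-3 comparison above.
training_emissions_t = 552   # metric tons of CO2, from the cited study
cars_equivalent = 120        # passenger cars, per the article

per_car_per_year_t = training_emissions_t / cars_equivalent
print(f"{per_car_per_year_t:.1f} t CO2 per car per year")
```

The result, 4.6 metric tons per car per year, is in line with commonly cited estimates for a typical passenger vehicle.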

Interior view of a data center with equipment.
Worldwide, data centers use 200 terawatt-hours of energy each year. (Image: Gregory21 via Dreamstime)

The tech giants known for using artificial intelligence intensively are taking steps to cut down on energy usage. Google says its data centers have zero net carbon emissions because they run on renewable energy. Microsoft has announced a plan to become carbon negative by 2030. The Allen Institute for AI, founded by the late Microsoft co-founder Paul Allen, is working to raise awareness of the environmental impact of AI technology. Its CEO, Oren Etzioni, says the efforts of leading researchers are quite encouraging in this regard.

The carbon footprint of running AI algorithms varies based on three factors: how the electricity is generated, the type of computer hardware used, and the design of the algorithm. All three can be tweaked to bring down the carbon footprint, and changes that improve data-center efficiency can bring good results as well. This view is supported by David Patterson, a Google scientist and U.C. Berkeley professor emeritus.
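The three factors above can be combined into a first-order estimate: energy drawn by the hardware, scaled by data-center overhead and by the carbon intensity of the local grid. This is a simplified sketch; every number in it (GPU count, power draw, overhead factor, grid intensity) is an illustrative assumption, not a figure from the article:

```python
# First-order sketch of a training run's carbon footprint.
# All numbers used below are illustrative assumptions, not measured values.
def training_emissions_kg(gpu_count: int,
                          power_per_gpu_kw: float,
                          hours: float,
                          pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Hardware energy (kWh) times data-center overhead (PUE) times grid intensity."""
    energy_kwh = gpu_count * power_per_gpu_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Hypothetical example: 64 GPUs at 0.3 kW each, running for two weeks
# in a data center with a PUE of 1.5, on a 0.4 kg CO2/kWh grid.
kg = training_emissions_kg(64, 0.3, 24 * 14, pue=1.5, grid_kg_co2_per_kwh=0.4)
print(f"{kg / 1000:.1f} metric tons of CO2")
```

The sketch also shows why each lever matters: halving the grid's carbon intensity, using more efficient hardware, or shortening training each cuts the result proportionally.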
