Artificial intelligence promises to make many things easier: helping people draft emails, automating tedious business processes and extracting deeper insights from complex datasets. Like streaming and many other digital technologies, however, AI is power-hungry. What’s helpful for enterprises might therefore be harmful for the environment.
AI was environmentally problematic even before ChatGPT. In 2022, for instance, global electricity demand from data centers, cryptocurrencies and AI was 460 TWh, according to Mikhail Dunaev, chief AI officer at Comply Control, which develops AI solutions to assist companies with regulatory compliance. That figure is expected to climb to 800 TWh by 2026.
“This is an increase of 75%, equivalent to the amount of power it would take to support an entire country the size of France for a whole year,” Dunaev told The Forecast.
Environmental responsibility is a top-line concern for many businesses: Gartner reports that 67% of CEOs view sustainability as a leading business growth opportunity, and 76% of enterprises say technology can play a critical role in reducing their carbon emissions, according to Ernst & Young.
With companies striving to improve sustainability, AI’s energy consumption creates not only “significant costs,” but also “a substantial global carbon footprint,” according to Dunaev.
“To address these challenges, more efficient methods of deploying AI are necessary,” he said.
There are ways in which technology itself can help reduce AI’s carbon impact. When it comes to processing data in support of artificial intelligence, on-device AI — also known as edge AI — offers an especially promising way forward.
On-Device AI vs. Cloud AI
Performing AI computations at the edge — that is, on devices themselves — can help minimize power-hungry data transmission over networks, according to experts at Advantech, an Internet of Things (IoT) solutions provider.
“Edge devices are designed to be more energy-efficient and sustainable compared to their cloud counterparts,” the company explains on its website. “This approach also leads to an improved carbon footprint for edge devices by reducing the number of I/O operations required by cloud AI applications.”
AI researchers agree.
“Edge computing provides notable energy efficiency benefits compared to large data centers by reducing data transport, lowering latency, and enabling dynamic resource allocation,” a group of those researchers reported in the April 2024 edition of the journal Internet of Things. Compared to cloud, “the decentralized nature of edge computing allows for localized renewable energy integration, improving overall sustainability,” the researchers noted.
Computations at the edge become increasingly important as sensors, cameras, drones, and other devices generate scads of data in the field.
“Edge computing plays a critical role … by enabling the development of sustainable and energy-efficient AI systems,” Jaime Vélez, a product marketing specialist for AI solutions provider Barbara, wrote in May 2023 in a post on the company’s blog.
“Edge computing can reduce the energy consumption of AI systems by enabling data processing and storage to occur closer to the source of the data,” Vélez continued. “This reduces the need for high-performance computing infrastructure, leading to a significant reduction in energy consumption and carbon emissions.”
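The data-reduction effect Vélez describes can be illustrated with a minimal sketch. The numbers and threshold below are hypothetical: an edge node screens its own sensor readings locally and transmits only the anomalous ones upstream, rather than streaming every raw sample to a cloud data center.

```python
# Hypothetical sketch of edge-side filtering: process readings on the
# device and send only the samples that matter, cutting network traffic.

def filter_at_edge(readings, threshold):
    """Return only the readings worth sending upstream."""
    return [r for r in readings if r > threshold]

# One hour of made-up sensor samples (one per second):
# mostly normal values, with a brief 10-second anomaly.
raw = [0.1] * 3590 + [0.9] * 10

sent = filter_at_edge(raw, threshold=0.5)
reduction = 1 - len(sent) / len(raw)
print(f"samples sent: {len(sent)} of {len(raw)} "
      f"({reduction:.1%} less data on the network)")
```

In this toy case, over 99% of the raw samples never leave the device, which is the mechanism behind the energy savings the researchers cite: less data in transit means fewer I/O operations and less power drawn by the network path to the cloud.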
While cloud computing and edge AI both have roles to play in bringing AI to life, it’s clear that the ability to process AI data at the edge can yield significant environmental benefits.
Edge AI in Action
To see what edge AI looks like in practice, consider Dryad Networks, whose wildfire-detection technology uses AI to “smell” fires while they’re still in the smoldering stage, enabling firefighters to respond faster.
“Our enterprise customers — railroad and powerline operators — are deploying thousands or hundreds of thousands of sensors along their infrastructure,” Dryad Networks CEO Carsten Brinkschulte told The Forecast, emphasizing the importance of edge AI processing to the company’s sustainability goals.
“We have gone to the extreme with regards to edge processing, and are executing the AI engine in the sensors in a distributed mesh network.”
By processing its data at the edge instead of sending it back to the cloud, Dryad can lower the carbon impact of its AI algorithms. The company reduces emissions even further by using solar energy to power its edge devices.
A number of other use cases highlight the potential for AI processing at the edge:
Amazon Go stores pair edge AI with IoT devices to create a cashier-less shopping experience, technology observer Om Ghogare reported in a May 2024 post on Medium.
A manufacturer in the automotive industry wanted a complete camera-based solution for improved driver and passenger safety and security. It used AI at the edge to inform the system in support of real-time responsiveness, AI enabler Darwin Edge shared in a company case study.
An oil-industry operator leveraged edge AI to capture in real time the behavior of pumps in the field, reducing pumps’ average failure rate by 55%, energy-industry solutions provider SLB shared in a company case study.
Because they reduce the need to transfer large amounts of data to centralized data centers, these edge-AI deployments consume less energy than conventional cloud-centric processing, driving down the carbon emissions associated with the work.
Looking Ahead
While sending data to and from the cloud is a major driver of AI’s carbon impact, there’s another factor at play: training AI models in the first place, which takes a massive amount of energy.
On conventional computing infrastructure, “the emissions from training just one AI model can be as high as 626,000 pounds of CO2 equivalent — about five times the lifetime emissions of an average American car,” the Harvard Business Review reported in 2023.
Likewise, researchers found that the training of GPT-3 — a 175 billion-parameter language model — consumed 1,287 megawatt-hours of electricity, resulting in a carbon footprint of roughly 552 metric tons of CO2-equivalent emissions.
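The two GPT-3 figures cited above are consistent with each other, as a quick back-of-envelope check shows: 1,287 MWh producing roughly 552 metric tons of CO2-equivalent implies a grid carbon intensity of about 0.43 kg CO2e per kilowatt-hour, a plausible value for fossil-heavy electricity.

```python
# Back-of-envelope check of the cited GPT-3 training figures:
# energy consumed and emissions produced imply the grid's carbon intensity.

energy_mwh = 1287          # reported training energy
emissions_t = 552          # reported emissions, metric tons CO2e

kwh = energy_mwh * 1000    # 1 MWh = 1,000 kWh
kg = emissions_t * 1000    # 1 metric ton = 1,000 kg

intensity = kg / kwh       # kg CO2e per kWh
print(f"implied carbon intensity: {intensity:.2f} kg CO2e/kWh")
```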
With more than three-quarters of Americans (78%) saying that a sustainable lifestyle is important to them, according to McKinsey — and some 80% of businesses already pursuing sustainability goals, according to Forbes — it’s clear that environmental stewardship is a fixture instead of a fad.
On-device training of AI algorithms offers another promising path forward for enterprises that want to embrace new technology while also meeting their sustainability goals. For example, an on-device training solution in development at MIT “minimizes computational overhead and boosts the speed of the fine-tuning process” by running key AI-training operations at the edge, researchers report.
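One common way to make training cheap enough for edge hardware — a generic parameter-efficient illustration, not the MIT technique itself — is to freeze a large pretrained “backbone” and fine-tune only a small output head, so each gradient step touches a tiny fraction of the model’s parameters. The sketch below uses synthetic data and made-up dimensions to show the idea.

```python
import numpy as np

# Hypothetical sketch of low-overhead on-device fine-tuning: the backbone
# weights stay frozen; only a small linear head is trained.

rng = np.random.default_rng(0)

W_frozen = rng.standard_normal((4, 8))   # frozen backbone (never updated)

def features(x):
    return np.tanh(x @ W_frozen)         # fixed feature extractor

w_head = np.zeros(8)                     # the only trainable parameters

X = rng.standard_normal((32, 4))         # tiny synthetic on-device dataset
y = features(X) @ np.ones(8)             # targets a correct head could match

mse0 = float(np.mean((features(X) @ w_head - y) ** 2))

for _ in range(200):                     # plain gradient descent on the head
    err = features(X) @ w_head - y
    w_head -= 0.1 * (features(X).T @ err / len(X))

mse = float(np.mean((features(X) @ w_head - y) ** 2))
print(f"MSE before/after head-only fine-tuning: {mse0:.3f} -> {mse:.3f}")
```

Here only 8 parameters are ever updated while the backbone’s weights sit untouched, which is why approaches in this family keep computational overhead low enough for constrained edge devices.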
By shifting the AI computational load to edge devices, organizations can pare back power consumption in both the transport and processing of data. As a result, they can simultaneously seize on the potential benefits of AI and ease its negative environmental impacts.
Adam Stone writes from Annapolis, Maryland, on the intersection of technology and business needs.
© 2024 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.