We’re surrounded by artificial intelligence (AI). When you use the internet, take a photo, use predictive text, or watch TV, you are interacting with AI. And we are still in the early stages of this revolution in technology and our lives.
But AI can require large amounts of power. Researchers have documented the astounding amount of power required to train some modern AI algorithms. One could also argue that many of the ways people use AI—find the cat!—hardly qualify as essential services. Even if you restricted AI to clearly beneficial applications such as medical diagnostics, realizing those benefits would still generate an insatiable demand for more AI compute cycles and more energy. So we’re back to square one: is it worth it?
In short, yes. AI—when implemented with foresight—will become one of our most important new tools for driving global sustainability. Here’s why:
1. AI is becoming more efficient.
Although the early days of AI were dominated by a race for functionality and associated exponential power demands, more recently focus has shifted to more efficient AI algorithms, optimization tools and specialized processors tuned for efficient AI, all resulting in tremendous energy efficiency gains.
Consider just the silicon portion. Last year, Arm released the Cortex-M55 CPU and Ethos-U55 microNPU for enhancing Machine Learning (ML) processing in IoT devices. The Cortex-M55 alone provides an ML performance uplift of 7x and a 6x increase in performance per watt. Together, however, the two processors combine for a 50x uplift in performance and a 25x uplift in efficiency. Arm, of course, is not alone: GPUs play a key role in processing AI efficiently in data centers, and multiple startups are developing processors purpose-built for AI.
Call it the Performance-Perfection Cycle. Often when a new innovation emerges, companies set out on a race for better performance. Once functionality hits the ‘good enough’ level, the chase for efficiency begins. AI is currently rounding that turn.
2. AI tasks will shift from the cloud to local devices.
Running AI where it needs to happen—on the devices around us—reduces bandwidth, costs and energy. Consider a voice-activated coffeemaker. If the manufacturer chose to process its consumers’ voice commands in the cloud, it could cost $15 per year per appliance to provide support. With an average lifespan of 5 to 10 years, a voice-activated, cloud-powered coffee machine would become a loss leader in more ways than one. Local AI processing would come with almost no overhead.
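The coffeemaker arithmetic above can be sketched in a few lines. This is a back-of-envelope illustration using only the figures in the text ($15 per year, 5- to 10-year lifespan); the variable names are illustrative, not from any real product costing model.

```python
# Rough lifetime-cost comparison for cloud-based voice processing,
# using the figures from the article.
CLOUD_COST_PER_YEAR = 15.00   # USD per appliance per year, from the article
LIFESPAN_YEARS = (5, 10)      # typical appliance lifespan range

lifetime_costs = {}
for years in LIFESPAN_YEARS:
    lifetime_costs[years] = CLOUD_COST_PER_YEAR * years
    print(f"{years}-year lifespan: ~${lifetime_costs[years]:.0f} "
          f"in recurring cloud costs per appliance")
# Local inference replaces this recurring cost with a one-time
# silicon/bill-of-materials cost, plus pennies of electricity.
```

Even at the low end of the lifespan range, the recurring cloud bill dwarfs the margin on a typical appliance, which is the sense in which the product becomes a loss leader.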
AI inference will move to local devices. Complex, one-off ‘training’ will remain in the cloud for now, but over time we will see this also gravitate toward the edge and the devices around us.
3. AI has the potential to make almost anything that uses power more efficient.
Pumps consume an estimated 10 percent of the world’s electricity and 90 percent are inefficient, according to Grundfos. Realizing even half of the potential efficiency gains through AI-enabled control could cut global electricity consumption by 1 to 2 percent. Localized AI control is already becoming common in this area and will become more widespread over time.
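As a sanity check on those pump numbers, here is the arithmetic made explicit. The 10 percent pump share and the "realize half" framing come from the text; the 30 percent recoverable-gain figure is an assumption chosen to land inside the article's 1-to-2-percent range.

```python
# Back-of-envelope check of the pump savings claim.
pump_share = 0.10       # fraction of global electricity used by pumps (Grundfos)
potential_gain = 0.30   # ASSUMED recoverable fraction of pump energy via AI control
realized = 0.5          # realize half of the potential, per the article

global_saving = pump_share * potential_gain * realized
print(f"~{global_saving:.1%} of global electricity")
```

With these assumptions the result is about 1.5 percent, consistent with the article's 1-to-2-percent claim.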
Even things like building management systems that already benefit from digital controls are seeing improvements through AI and ML by being able to better predict, and react to, traffic patterns. Some estimate that digital technologies could cut emissions by 15 percent through this kind of real-world fine-tuning. The bottom line: the world’s a big place, but it’s not particularly intelligent yet. Energy spent on efficient, and increasingly local, AI has the potential to achieve huge power savings.
4. AI will make the world more versatile.
AI is like a mushroom: it multiplies rapidly. Google used AI to reduce data center cooling energy by up to 30 percent, drawing on inputs like weather data. Next, it experimented with time-shifting, i.e. moving less urgent tasks to times in the day when AI algorithms predict a greater (and cheaper) supply of wind and solar. Now it is looking at rolling applications from one data center to another to maximize renewables. The cumulative effect is significantly more efficient, more sustainable, more economical data centers.
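The time-shifting idea can be sketched as a simple scheduling problem: given a forecast of how green each hour's power supply will be, place deferrable jobs into the greenest hours. This is a minimal illustration under assumed data, not Google's actual carbon-intelligent scheduler; the forecast values and function names are hypothetical.

```python
# Minimal sketch of renewable-aware time-shifting: schedule deferrable
# jobs into the hours with the highest forecast carbon-free-energy share.
def schedule(hours_needed, cfe_forecast):
    """Return the indices of the greenest hours, in chronological order."""
    ranked = sorted(range(len(cfe_forecast)),
                    key=lambda h: cfe_forecast[h], reverse=True)
    return sorted(ranked[:hours_needed])

# Hypothetical hourly forecast: fraction of supply from wind/solar.
forecast = [0.2, 0.3, 0.6, 0.8, 0.9, 0.7, 0.4, 0.2]
print(schedule(3, forecast))  # picks the three greenest hours
```

A production scheduler would also weigh job deadlines, grid prices, and capacity limits, but the core move is the same: shift flexible demand toward predicted renewable supply.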
5. Some of the parts are already in place.
Refrigerators are typically the second biggest consumers of electricity in your home, gobbling up 13 percent of the total. While regional efficiency regulations have led to incremental improvements, AI has the potential to make a major impact.
Arm partner Arcelik, a major appliance manufacturer in Turkey, ran a thought experiment to see if adding AI to existing refrigerators—with their small memory and processor footprints and fixed compressor run times—could help. It developed a lightweight Reinforcement Learning (RL) algorithm that analyzed local, in-home behavior—not reams of training data—to see if it could minimize the temperature fluctuations that occur when people open and close the door. The smaller the fluctuations, the less power consumed by the compressor. It found such a system could reduce power by an estimated 10 percent in existing fridges. With widespread deployment in Europe alone, that hypothetically adds up to shutting down nine entire power plants.
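To show how lightweight in-home RL can be, here is a toy epsilon-greedy learner that picks a pre-cooling lead time and is rewarded for small temperature swings at a modest energy cost. This is emphatically not Arcelik's algorithm; the action set, reward function, and simulated household are all invented for illustration.

```python
# Toy sketch: an epsilon-greedy bandit learning when to pre-cool a fridge.
# Everything about the "environment" below is hypothetical.
import random

random.seed(0)
actions = [0, 15, 30]          # minutes of pre-cooling before a likely door-open
q = {a: 0.0 for a in actions}  # running estimate of each action's value
counts = {a: 0 for a in actions}

def reward(pre_cool):
    # Simulated household: more pre-cooling before typical door-openings
    # means smaller temperature fluctuation (better), minus a small energy
    # penalty; Gaussian noise stands in for real-world variability.
    return -abs(30 - pre_cool) / 30 - 0.01 * pre_cool + random.gauss(0, 0.05)

for step in range(500):
    # Explore 10% of the time, otherwise pick the best-known action.
    a = random.choice(actions) if random.random() < 0.1 else max(q, key=q.get)
    r = reward(a)
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]   # incremental mean update

best = max(q, key=q.get)
print(f"learned pre-cool lead time: {best} minutes")
```

The point is the footprint: a handful of floats and an update rule, learning from local behavior alone — the kind of thing that fits comfortably in an appliance microcontroller.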
But let’s keep going. Approximately 1 billion smart meters have been installed worldwide. Imagine using meters to allow appliances to negotiate with utilities for peak power conservation discounts. Peak power plants cost hundreds of millions of dollars, get used only a few days a year, and can have a higher emission profile than other plants. Potentially huge gains can be delivered with what’s already there.
Achieving distributed intelligence will take work. Algorithm and processing efficiency, security, data management, and data governance are all open issues. We will also see innovation in as-a-service models, with equipment makers adding optimization and predictive maintenance services to equipment and appliance sales.
Still, the effort will be worth it. The unique power of AI will allow us to take on some of the world’s biggest challenges and, just as important, the experiences and technology that we will gather during the next decade will allow us to do so efficiently.
This blog previously appeared in Semiconductor Engineering.