The advantages of on-device AI are real, but not realistic without a better battery

Enovix
4 min read · Jul 26, 2023

By Dr. Raj Talluri, President & CEO of Enovix

While on-device AI opens up exciting possibilities to make our devices more personal, secure and productive, it demands an incredible amount of processing power. In tests of our own, running AI workloads locally required up to 50x more battery than offloading the same work to the cloud.

Have you ever wondered how the camera in your smartphone automatically frames a face and adjusts to the light conditions around you? Or how photo and video reels are automatically created that pull you back into the vacation you took three years ago? Even though Artificial Intelligence (“AI”) is already all around us and in our mobile devices, its usage will only grow as more applications are developed that bring us convenience and productivity.

On-device AI Advantages

As our devices become smarter and more personalized, keeping the majority of the AI workload on the device has multiple advantages:

· Cost Efficiency: Limiting or eliminating cloud server costs and pay-per-use cloud models provides savings at both the individual and enterprise level.

· Speed and Efficiency: With on-device AI, our devices don’t have to ping a cloud server or cell tower. Devices can perform tasks like video editing, for example, and make decisions in real time without relying on an internet connection, translating to faster response times and an improved user experience.

· Accessibility: Being able to use a device when there’s no internet connectivity provides huge advantages and conveniences from productivity to safety.

· Privacy and Security: Biometric identification can be processed locally on a device and doesn’t send data to the cloud, which can enhance privacy and reduce the risk of unauthorized access to personal information.

· Reduced Latency & Bandwidth: As more and more people compete for a given network, the user experience can degrade; with on-device AI, data processing occurs locally and stays off the network. For example, I like to create and edit Instagram reels on my device, something that wasn’t possible just a few years ago.

· Sustainability: By eliminating continuous data transmission to and from the cloud, we can lower carbon emissions and make computing more energy efficient.

On-device AI Applications are Battery Intensive

On-device AI imposes substantial demands on the GPU, CPU and memory to enable the neural network in our devices to learn and process at breakneck speed. We ran some of our own tests to find out how much battery it takes to run AI image processing apps on the edge and here’s what we found:

Adaptive AI

Adaptive AI tools such as Adobe’s new denoise filter can require 30x the battery power of regular noise filters.

Adaptive AI analyzes data, makes real-time, dynamic adjustments and improves over time (e.g. personalized customer experiences). The advantages include productivity, efficiency, capability, accessibility, privacy and security, and reduced latency, to name a few.

For our in-house experiment, we ran Adobe Lightroom’s new AI-powered Denoise filter against a standard noise filter on a new Asus laptop equipped with an AMD Ryzen 7 5800HS 8-core processor and an Nvidia GeForce RTX 3060 graphics card. Running the AI Denoise filter locally, the laptop rendered 172 images and lasted only 59 minutes on battery before shutting down.

The AI Denoise filter required 30x more battery than the standard filter. Once consumers experience the power and convenience of these adaptive AI features, they won’t accept devices without them.
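To put those numbers in perspective, here is a minimal back-of-the-envelope sketch of the battery energy consumed per image. Only the 172-image, 59-minute run and the 30x ratio come from our test; the 76 Wh pack capacity is an assumption chosen as typical for a laptop in this class, purely for illustration.

```python
# Back-of-the-envelope estimate of battery energy per AI-denoised image.
# Measured in our run: 172 images rendered in 59 minutes before shutdown.
# Assumed (not measured): a nominal 76 Wh battery pack, fully drained,
# with essentially all of it going to the denoise workload.

BATTERY_WH = 76.0          # assumption: typical capacity for this laptop class
IMAGES_RENDERED = 172      # measured: AI Denoise images completed before shutdown
RUNTIME_MIN = 59           # measured: minutes of battery runtime
AI_VS_STANDARD_RATIO = 30  # reported above: AI Denoise uses ~30x a standard filter

wh_per_ai_image = BATTERY_WH / IMAGES_RENDERED
avg_draw_watts = BATTERY_WH / (RUNTIME_MIN / 60)
wh_per_standard_image = wh_per_ai_image / AI_VS_STANDARD_RATIO

print(f"~{wh_per_ai_image:.2f} Wh per AI-denoised image")                      # ~0.44 Wh
print(f"~{avg_draw_watts:.0f} W average system draw during the run")           # ~77 W
print(f"~{wh_per_standard_image:.3f} Wh per image with a standard filter")     # ~0.015 Wh
```

Under those assumptions, each AI-denoised image costs on the order of half a watt-hour, roughly 30 times what a conventional filter would imply.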

Generative AI

Generative AI tools like Stable Diffusion can demand nearly 50x the battery power when run on-device vs. in the cloud.

Our second experiment involved generative AI, a form of machine learning that can produce text, video and images from text prompts (e.g. ChatGPT). We used the same laptop and ran Stable Diffusion, a deep learning text-to-image application, to generate images until the battery ran out. We then generated the same number of images in the cloud to compare run time.

Running locally, the laptop lasted 68 minutes before the battery shut down. Generating the images in the cloud, which offers more limited features, the same laptop lasted 11 hours before the battery was exhausted. Consumers and developers will insist on running real-time generative AI tools locally on their devices to avoid restrictive pay-per-use models and latency, and enabling these features on-device requires nearly 50x more battery.
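As a rough way to visualize that gap, here is a minimal sketch comparing the two battery runtimes. It looks only at runtime; the per-image battery figure behind the nearly 50x number also depends on how many images each run produced, which isn’t restated here.

```python
# Rough runtime comparison: generating images on-device vs. offloading to the cloud.
# Measured in our run: 68 minutes on battery with Stable Diffusion running locally,
# ~11 hours on battery with the same image workload offloaded to the cloud.

LOCAL_RUNTIME_MIN = 68        # measured: on-device generation until shutdown
CLOUD_RUNTIME_MIN = 11 * 60   # measured: same workload generated in the cloud

runtime_ratio = CLOUD_RUNTIME_MIN / LOCAL_RUNTIME_MIN
print(f"~{runtime_ratio:.1f}x longer battery runtime when generation is offloaded")  # ~9.7x
```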

On-device AI provides a host of benefits, from cost savings and accessibility to flexibility, privacy, and reduced latency. I believe AI will continue to make our mobile devices more useful and personal, but clearly, we need a better battery. It takes an incredible amount of processing power to sort through data. The CPU, GPU, display, and memory already consume huge amounts of power, and the requirements for new AI engines will only intensify those demands.

As consumers, our expectations for on-device capabilities also will continue to grow. We want more pixel-rich displays, the ability to stream an entire season of Foundation in 4K and we don’t want to lose productivity when there’s no internet connection. But, with every new demand, battery life continues to be a major issue for consumers and a problem for consumer electronics OEMs.

At Enovix, our goal is to provide the best battery solution for our customers so they can offer next-gen devices that support the demands of consumers. Our batteries deliver up to double the capacity of many batteries in leading commercial consumer devices on the market today. Just in time for the future.

Enovix

Enovix Corporation (Nasdaq: ENVX) is an advanced silicon battery company with locations in Fremont, CA; Penang, Malaysia; and Hyderabad, India.