How to slash power consumption for AI vision

AI vision at the endpoint is a challenge, but even more challenging than finding the processing capability is achieving low enough power consumption for battery-operated devices. What if you could split and simplify your AI workload with an Alif Ensemble MCU?

In ipXchange’s next interview from Hardware Pioneers Max 2024, Eamon chats with Jerome from Alif Semiconductor for a great demonstration of how Alif’s dual-core Ensemble MCU can be used to split AI workloads into smaller chunks to save power in real-world applications.

For those not in the know, Alif’s MCUs are unique in their use of Cortex-M55 cores in conjunction with Arm’s Ethos-U55 microNPU (micro neural processing unit). Working together, these processing architectures can cut the power consumed by AI vision inference to 1/76th of what an M55 core needs on its own. Better yet, the Ensemble MCUs can run that inference 78x faster as well, so Alif’s microcontrollers are great for endpoint AI in battery-operated designs.

This success likely spurred the recent release of Alif’s E1C Ensemble MCU, which takes the Cortex-M55 + Ethos-U55 capabilities of the original E1 device – with 46 GOPS (giga-operations per second) of on-chip AI/ML processing – and shrinks them into a 3.9 x 3.9-mm package for the most space-constrained designs.

But what can you do with two M55 + U55 pairings in the same device?

That’s what was on show at Hardware Pioneers Max! As Jerome explains, this number plate recognition demo runs on Alif’s E3 Ensemble MCU, which features one Cortex-M55 core running at up to 160 MHz, supported by an Ethos-U55 with 128 MAC/c (multiply-accumulates per cycle) of processing capability, and a second, turbo-charged pairing in which the Cortex-M55 runs at up to 400 MHz and the Ethos-U55 is specified at 256 MAC/c.

So how does this apply to AI-based number plate recognition? The first core simply detects the presence of a number plate, and only if one is detected does the second core run an AI vision algorithm to read it.

By splitting the AI vision task into two separate workloads running on the two cores, you dramatically reduce the amount of on-device processing required: numbers and letters are only read once a plate has been detected, rather than the device constantly trying to recognise those characters in every frame. The sketch below shows the idea.
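To make that split concrete, here is a minimal host-side C sketch of the detect-then-read gating pattern. Everything in it is a stand-in invented for illustration – the frame buffer and both inference functions are dummy placeholders, not Alif SDK calls – and the real demo code is available from Alif. On the E3, stage 1 would map to the 160 MHz M55 + 128 MAC/c U55 pairing and stage 2 to the 400 MHz + 256 MAC/c pairing.

```c
/*
 * Minimal sketch of the "detect, then read" split described above.
 * Both inference functions are dummy stand-ins for the two M55 + U55
 * pairings on the E3 -- this is NOT Alif's SDK or demo code.
 */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define FRAME_W 320
#define FRAME_H 240

/* Stand-in for the small always-on detector (stage 1, low-power pairing).
 * Here it simply pretends a plate appears every 30th frame. */
static bool plate_detect_infer(const uint8_t *frame, unsigned frame_no)
{
    (void)frame;
    return (frame_no % 30) == 0;
}

/* Stand-in for the heavyweight plate-reading model (stage 2, fast pairing).
 * Only invoked once the detector has fired. */
static void plate_read_infer(const uint8_t *frame, char *text, size_t maxlen)
{
    (void)frame;
    snprintf(text, maxlen, "AB12 CDE"); /* dummy OCR result */
}

int main(void)
{
    static uint8_t frame[FRAME_W * FRAME_H]; /* dummy grayscale frame */
    char plate[16];

    for (unsigned n = 0; n < 120; n++) {
        /* Stage 1: cheap detection runs on every frame. */
        if (!plate_detect_infer(frame, n))
            continue; /* no plate -> the big core/NPU pairing stays idle */

        /* Stage 2: expensive reading runs only when a plate is present. */
        plate_read_infer(frame, plate, sizeof plate);
        printf("frame %3u: plate detected -> read \"%s\"\n", n, plate);
    }
    return 0;
}
```

The power saving comes from that early `continue`: in the overwhelming majority of frames, only the cheap detector runs, so the second core and its NPU can sleep until there is actually something to read.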

Alif’s Ensemble devices are truly a marvel of what a microcontroller can achieve in low-power endpoint AI vision, something some engineers are still using power-hungry GPUs for. You can learn more about Alif’s AI/ML AppKit development board by following the link to the board page below. There, you can apply to evaluate this technology through ipXchange, and Alif can also supply the code for this demo so that you can try it out for yourself.

Keep designing!

Intrigued by Alif’s AI-ready MCUs? Check out ipXchange’s previous chats with this disruptive company:

How to run vision AI on batteries with Alif MCUs

Balletto adds Matter-ready wireless to Alif MCUs

Make home appliances understand what you are saying! Alif Semiconductor’s AI/ML AppKit at CES

Secure AI vision in a stamp-sized board!

Which Alif AI/ML dev kit is right for you?

Alif Semiconductor AI/ML AppKit For Ensemble MCUs & Fusion Processors

Have an AI vision task that you want to try splitting in two to save power?

Apply for the development board now