Ever wanted to perform high-level AI/ML inferencing within a microcontroller-based, battery-operated design? Alif Semiconductor has a whole family of products that will enable you to do that. And they’ve got the kits to make development so much easier than you’d expect!
In ipXchange’s second interview with Alif Semiconductor at Embedded World 2024, Guy chats with Mark for an update on the Ensemble family of MCUs and fusion processors in the wake of Alif’s new Balletto wireless MCU family.
As Mark explains, chips in the Ensemble family are interchangeable to suit different processing requirements, depending on how many Cortex-M55 and Cortex-A32 cores your application needs – they all share the same footprint on your PCB.
That said, it is the Ethos-U55 microNPUs that enable AI/ML tasks like facial recognition, audio recognition and enhancement, sensor fusion, and natural language processing to run on battery-powered end products, thanks to their ultra-efficient, high-speed AI inferencing.
In fact, Alif has seen up to 100x improvements in inference time when running AI vision applications on the Ensemble MCUs when compared to the previous generation of Cortex-M-class cores.
Mark’s recommended chip for exploring what’s possible with an Alif MCU is the dual-core E3 device. It features two Cortex-M55/Ethos-U55 pairings that are asymmetric in maximum operating frequency, so one side of the device can stay active at all times and only wake the other side for more serious AI processing workloads.
This enables you to build products like smart glasses and security cameras with much higher operating efficiency – i.e. longer battery life – since the heavy-duty computing is only enabled when required.
But the most exciting thing about any video from Embedded World is the live demos, and Mark had plenty to show us:
- Natural language voice commands for a microwave oven
- Simultaneous facial detection at up to 100 frames per second
- High-end graphics
- Up-to-20-person body detection at 63 frames per second
All these demos are possible with Alif’s AI/ML AppKit, which features the top-tier Ensemble device, the E7. Using the accompanying software, however, you can configure the E7 to behave like other Ensemble devices, so this one development board lets you build your application around the functionality of whichever Ensemble chip you intend to use.
Alif supplies a great many application demos that you can load onto these kits, which feature microphones, a display, and more to support whatever AI/ML-enabled build you can imagine. Best of all, you can perform many of these tasks on battery power – AI that you can truly take with you on the go, with no need for cloud-based processing.
What more is there to say? Apply to evaluate this technology through ipXchange by following the link to the board page below. We’ve linked directly to the AI/ML AppKit, but there are links on that page to learn more about the Ensemble devices themselves if you so wish.
Keep designing!
Love a good discussion with Alif? Check out our previous video from Embedded World 2024, some longer demo videos from CES 2024, and a thorough discussion of all Alif’s development kits that are currently available:
Balletto adds Matter-ready wireless to Alif MCUs
Make home appliances understand what you are saying! Alif Semiconductor’s AI/ML AppKit at CES