In ipXchange’s second video with Henrik Flodell at CES 2024, Guy gets a demonstration of Alif Semiconductor’s AI/ML AppKit and some artificial intelligence solutions created by Alif’s partners using this versatile development board.
The first is a natural language library from Sensory that enables voice control of all the features of a home appliance, for example. The key benefit of Alif and Sensory’s approach is that no internet connection is required for this extensive AI functionality, even though the system understands complex sentences rather than just keywords: all processing happens within Alif’s Ensemble device. In this example, the board has been trained for a microwave oven, and it understands what to do when asked to “defrost one pound of chicken”.
This is very difficult to do on a traditional microcontroller, but Alif’s devices are built from the ground up for such tasks: the integrated neural processing unit (NPU) enables faster, larger-vocabulary voice recognition than previous MCU architectures.
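Neither Sensory’s library nor Alif’s SDK is shown in detail in the video, but the general shape of on-device inference on an NPU-equipped MCU looks something like the sketch below. It assumes a TensorFlow Lite for Microcontrollers build with Arm Ethos-U support (the NPU family integrated in Alif’s Ensemble parts) and a model already compiled with Arm’s Vela tool; the model symbol, arena size, and function names are placeholders, not the actual APIs used in the demo.

```cpp
// Illustrative sketch only: a generic TensorFlow Lite for Microcontrollers
// setup on an Ethos-U-equipped MCU. This is NOT Sensory's or Alif's actual
// API; g_model_vela, the arena size, and the op list are placeholders.
#include <cstddef>
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_vela[];  // Vela-compiled model, linked in elsewhere

namespace {
constexpr size_t kArenaSize = 200 * 1024;   // tensor working memory (model-dependent)
uint8_t tensor_arena[kArenaSize];
tflite::MicroInterpreter* interpreter = nullptr;
}  // namespace

// One-time setup: load the model and let the interpreter plan tensor memory.
bool SetupNlu() {
  const tflite::Model* model = tflite::GetModel(g_model_vela);

  // A Vela-compiled graph mostly collapses into the ETHOSU custom op, which
  // dispatches to the NPU; any remaining ops fall back to CPU kernels.
  static tflite::MicroMutableOpResolver<3> resolver;
  resolver.AddEthosU();
  resolver.AddReshape();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter static_interpreter(model, resolver,
                                                     tensor_arena, kArenaSize);
  interpreter = &static_interpreter;
  return interpreter->AllocateTensors() == kTfLiteOk;
}

// Per-utterance inference: feed quantised audio features, read back scores.
bool RunVoiceInference(const int8_t* features, size_t feature_len) {
  TfLiteTensor* input = interpreter->input(0);
  for (size_t i = 0; i < feature_len; ++i) input->data.int8[i] = features[i];

  if (interpreter->Invoke() != kTfLiteOk) return false;

  // Mapping the output scores to appliance actions ("defrost", weight, food
  // type, ...) is application logic outside this sketch.
  TfLiteTensor* output = interpreter->output(0);
  return output != nullptr;
}
```

The point of the NPU is that the heavy matrix and convolution work inside the invoke step runs on dedicated hardware, which is what makes sentence-level, large-vocabulary recognition feasible within an MCU’s power budget.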
The second solution is from Plumerai, a company that makes off-the-shelf AI solutions for a range of applications, in this case presence detection and object detection using a camera. On Alif’s device, inference runs at 80 fps and 150 fps respectively, far faster than the roughly 10 fps typical of previous solutions, and with lower power consumption.
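Those 80 fps and 150 fps figures are throughput numbers: how many camera frames the model can process per second. A purely illustrative way to measure that on any platform is a timed loop like the one below; CaptureFrame() and RunDetector() are stand-in stubs (here the “inference” is simulated with a 12 ms delay), not Plumerai’s or Alif’s actual camera or model APIs.

```cpp
// Illustrative benchmark loop only: measures sustained frames per second of a
// detection pipeline. CaptureFrame() and RunDetector() are placeholder stubs
// standing in for a real camera driver and compiled model.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

static std::vector<uint8_t> CaptureFrame() {
  return std::vector<uint8_t>(192 * 192);  // stub: blank greyscale frame
}

static float RunDetector(const std::vector<uint8_t>&) {
  // Stub: simulate ~12 ms of inference, roughly what an 80 fps pipeline allows per frame.
  std::this_thread::sleep_for(std::chrono::milliseconds(12));
  return 0.0f;  // stub: presence score
}

int main() {
  constexpr int kFrames = 100;
  const auto start = std::chrono::steady_clock::now();

  for (int i = 0; i < kFrames; ++i) {
    const auto frame = CaptureFrame();
    (void)RunDetector(frame);
  }

  const double seconds =
      std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
  std::printf("%.1f fps average over %d frames\n", kFrames / seconds, kFrames);
  return 0;
}
```

For context, 80 fps leaves about 12.5 ms per frame for capture plus inference, compared with roughly 100 ms per frame at the 10 fps typical of earlier MCU-class solutions.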
If you’re building a product and want to add some high-level AI functionality while keeping all data on the device, Alif Semiconductor has you covered, so follow the link below to apply to evaluate their Ensemble devices using the AI/ML AppKit.
Keep designing!