Join ipXchange and Edge Impulse for a practical, engineer-focused live session that walks through implementing edge AI on real hardware platforms, ready to apply in your own designs.
In this live webinar we will work through the full Edge Impulse workflow end to end using a simple but realistic example: building an image classifier that can tell the difference between everyday objects such as mugs and pens. You will be able to follow along in Edge Impulse and test the same mug-and-pen classifier yourself, using only your browser. By the end, you will have seen the complete Edge Impulse pipeline in practice and have a working example you can reuse and adapt for your own projects.
What we will cover
Using a straightforward “mug vs pen vs neither” example, we will walk through the real workflow you follow when building an ML model with Edge Impulse:
- Planning sensible image classes for a small but realistic application
- Uploading data to Edge Impulse and creating an Impulse for image classification
- Generating features and choosing model parameters that fit your target hardware
- Training the model live, then reading accuracy, confusion matrices, and feature plots
- Spotting when the data, not the model, is the real problem, and fixing it without guesswork
- Deploying the trained model to an Arduino Nicla Vision running OpenMV firmware, or to your phone (see the sketch after this list)
- Translating the same steps to other use cases such as gesture, audio, or sensor data
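To give a feel for the deployment step, here is a minimal sketch of what the OpenMV side typically looks like after exporting a trained model from Edge Impulse using its OpenMV library export. The filenames (trained.tflite, labels.txt) follow Edge Impulse's usual export convention, and the tf module shown is from classic OpenMV firmware (newer firmware versions use a different ml module), so treat this as an illustration rather than the exact code used in the session:

```python
# Illustrative sketch only: assumes the Edge Impulse "OpenMV" export has
# been copied to the Nicla Vision's flash drive, producing trained.tflite
# and labels.txt. API shown is the classic OpenMV tf module.
import sensor
import time
import tf

sensor.reset()                          # initialise the camera
sensor.set_pixformat(sensor.RGB565)     # colour frames
sensor.set_framesize(sensor.QVGA)       # 320x240
sensor.set_windowing((240, 240))        # square crop to match training data
sensor.skip_frames(time=2000)           # let the sensor settle

net = tf.load("trained.tflite")         # model exported by Edge Impulse
labels = [line.rstrip("\n") for line in open("labels.txt")]

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    # classify() runs the network on the frame; each result carries one
    # confidence score per label (e.g. mug, pen, neither)
    for obj in net.classify(img):
        for label, score in zip(labels, obj.output()):
            print("%s: %.2f" % (label, score))
    print(clock.fps(), "fps")
```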
This webinar is presented by:
- Jim Bruges, Senior Engineer, Edge Impulse
- Elliot Lee-Hearn, Electronics Engineer & Host, ipXchange
This webinar is free to attend.