The VisionPack AI Application Zoo is a collection of ready-to-run AI applications for NXP i.MX 8 platforms built using DeepView AI Middleware. Applications are focused on demonstrating various use cases of vision-based ML at the edge and can be run without any special setup. Developers can access the source code for the applications and adapt them using the VisionPack SDK.
This article summarizes the applications available in the zoo; each application links to further details and to downloads of its binary and source code.
Make sure to also check out the ModelPack Dataset Zoo for example eIQ Portal projects that can train a ModelPack model for custom applications similar to those found here.
This application can run any detection model trained with eIQ Portal or from the VisionPack Model Zoo. It serves as a quick and easy way to run detection models with graphical overlays showing detections and performance metrics.
The Face Detection application showcases the simplicity of adding face detection to a project. Later applications, such as Face Blur and Head Pose, build on this face detection to create practical applications.
This application implements a live-streaming remote camera that anonymizes faces to protect privacy, enabling crowd monitoring while alleviating privacy concerns.
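The application's own anonymization code is not shown here, but the general technique is straightforward: take each face bounding box from the detector and destroy the detail inside it. A minimal sketch using NumPy, where the frame, box coordinates, and pixelation block size are all illustrative placeholders:

```python
import numpy as np

def pixelate_region(image, box, block=8):
    """Anonymize a rectangular region by collapsing it into coarse blocks."""
    x0, y0, x1, y1 = box
    region = image[y0:y1, x0:x1]  # a view, so edits happen in place
    h, w = region.shape[:2]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            tile[:] = tile.mean(axis=(0, 1))  # replace tile with its mean value
    return image

# Example: pixelate a fake 64x64 face region inside a 128x128 grayscale frame.
frame = np.arange(128 * 128, dtype=np.float32).reshape(128, 128)
out = pixelate_region(frame, (32, 32, 96, 96))
```

Pixelation (rather than a Gaussian blur) keeps the sketch dependency-free; either approach removes identifying detail while leaving the rest of the frame untouched.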
This application detects faces and estimates their head pose. The head pose refers to the yaw, pitch, and roll of the head relative to the camera and has many interesting uses, for example the Robot Puppeteering application.
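Yaw, pitch, and roll are Euler angles, which are typically recovered from a head rotation matrix. As a hedged sketch of that conversion, assuming a model that outputs a 3x3 rotation matrix with the common ZYX (yaw-pitch-roll) convention; the actual output format and axis conventions of the Head Pose application may differ:

```python
import math

def rotation_to_euler(R):
    """Convert a 3x3 rotation matrix to (yaw, pitch, roll) in degrees.

    Assumes R = Rz(yaw) @ Ry(pitch) @ Rx(roll); the axis convention is an
    assumption and may not match the VisionPack model's output.
    """
    pitch = math.asin(-R[2][0])
    yaw = math.atan2(R[1][0], R[0][0])
    roll = math.atan2(R[2][1], R[2][2])
    return tuple(math.degrees(a) for a in (yaw, pitch, roll))

# The identity rotation corresponds to a head looking straight at the camera.
print(rotation_to_euler([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # → (0.0, 0.0, 0.0)
```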
This application shows how real-time head pose estimation can be used to control a robot to mimic the person in front of the camera. This allows you to become a puppeteer by simply moving your head! Not everyone has a robot on hand to try out this demo, so you can instead watch this video from the Calgary Maker Faire, where we had a robot set up in our booth along with a Maivin AI Vision Starter Kit controlling it.
The Body Pose application demonstrates detecting the key joints on a person and using this data to draw a skeleton overlay. The model runs at 30 FPS on the i.MX 8M Plus.
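Drawing the skeleton amounts to connecting pairs of detected joints with line segments. A sketch of that mapping, assuming a COCO-style 17-keypoint layout; the joint set actually used by the Body Pose application's model may differ:

```python
# COCO-style 17-keypoint layout (an assumption for this sketch).
KEYPOINTS = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

# Pairs of joint indices to connect when rendering the overlay.
SKELETON = [
    (5, 7), (7, 9),      # left arm
    (6, 8), (8, 10),     # right arm
    (5, 6),              # shoulders
    (5, 11), (6, 12),    # torso
    (11, 12),            # hips
    (11, 13), (13, 15),  # left leg
    (12, 14), (14, 16),  # right leg
]

def skeleton_segments(joints):
    """Turn per-joint (x, y) coordinates into line segments for the overlay."""
    return [(joints[a], joints[b]) for a, b in SKELETON]
```

Each segment can then be handed to whatever drawing primitive the rendering pipeline provides.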
This application monitors a worksite for personnel with and without hard hats. This can be used to remind people to wear their protective equipment and avoid injury.
The Vehicles and Pedestrians Detection application demonstrates ModelPack trained to detect various vehicles and pedestrians as found in the Berkeley Diverse Driving dataset. The model runs at a 416x416 input resolution and maintains a consistent 60 FPS on the i.MX 8M Plus.
Detection and Tracking
This application builds on the Detection application by extending it with the VisionPack Tracking extension. This gives objects a life beyond the frame in which they were detected and allows developers to build interesting applications that act on an object's movement over time.
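The VisionPack Tracking extension's API is not reproduced here, but the core idea of giving detections persistent identities can be sketched with a minimal greedy IoU tracker. Everything below (class name, threshold, box format) is illustrative, not the extension's actual interface:

```python
from itertools import count

def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

class IouTracker:
    """Greedy matcher: each detection keeps the ID of its best overlapping track."""

    def __init__(self, threshold=0.3):
        self.threshold = threshold
        self.tracks = {}      # track id -> last known box
        self._ids = count()

    def update(self, detections):
        assigned = {}
        unmatched = dict(self.tracks)
        for box in detections:
            best_id, best_iou = None, self.threshold
            for tid, prev in unmatched.items():
                score = iou(box, prev)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:
                best_id = next(self._ids)  # new object entered the scene
            else:
                del unmatched[best_id]
            assigned[best_id] = box
        self.tracks = assigned             # tracks with no match are dropped
        return assigned
```

Feeding each frame's detections through `update` yields stable IDs as long as an object's box overlaps its previous position, which is the property the applications below rely on.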
This application allows the user to define intersections on the screen which will monitor the flow of traffic. We provide examples which monitor cars flowing through a road intersection, but the use case can be applied to other scenarios as well, such as monitoring people in doorways or factory assembly lines. The demo lets you try different object types by simply providing an alternate detection model: train on your own data using ModelPack, use one of our pre-trained ModelPack Dataset Zoo or VisionPack Model Zoo models, or of course bring your own model.
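One common way to implement this kind of flow monitoring on top of tracking is a virtual line-crossing test: a tracked object is counted when its centroid changes sides of a user-defined line. A hedged sketch of that idea, with an illustrative track format and line coordinates rather than the demo's actual configuration:

```python
def side_of_line(point, line):
    """Signed side of `point` relative to the directed line (p0 -> p1)."""
    (x0, y0), (x1, y1) = line
    px, py = point
    return (x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)

class LineCounter:
    """Count tracked objects whose centroid crosses a user-defined line."""

    def __init__(self, line):
        self.line = line
        self.last_side = {}   # track id -> sign of the previous position
        self.count = 0

    def update(self, tracks):
        """tracks: mapping of track id -> (x, y) centroid for this frame."""
        for tid, point in tracks.items():
            side = side_of_line(point, self.line)
            prev = self.last_side.get(tid)
            if prev is not None and side * prev < 0:  # sign flip = crossing
                self.count += 1
            self.last_side[tid] = side
        return self.count
```

Because the test keys on track IDs, an object lingering near the line is counted once per actual crossing rather than once per frame, which is why the tracking layer matters for this use case.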