This application shows how real-time head pose estimation can be used to control a robot that mimics the person in front of the camera, letting you become a puppeteer simply by moving your head! Not everyone has a robot on hand to try out this demo, so you can watch the video below from the Calgary Maker Faire, where our booth featured a robot controlled by a Maivin AI Vision Starter Kit.
- Maivin AI Vision Starter Kit
- A robot...
- RS-485 connection between the Maivin and the robot for real-time control.
- Copy the docker-compose.yaml file to the target using SCP.
- SSH into the target.
- Bring down the previous Docker Compose session, if one exists.
- Use the --remove-orphans option if any containers from a previous session were not taken down.
- Load the Docker Compose services.
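The steps above can be sketched as the following commands. The `torizon` user name is an assumption based on a typical Torizon image (adjust for your setup), `verdin-imx8mp-xxxxxx` stands for your unit's hostname, and on older images the Compose command may be `docker-compose` rather than `docker compose`:

```shell
# Copy the compose file to the target over SCP
# (user name and hostname are placeholders; adjust for your unit)
scp docker-compose.yaml torizon@verdin-imx8mp-xxxxxx:~/

# Log in to the target
ssh torizon@verdin-imx8mp-xxxxxx

# On the target: stop any previous session.
# --remove-orphans also removes containers left over from an
# earlier compose file that are no longer defined in this one.
docker compose down --remove-orphans

# Start the services in the background
docker compose up -d
```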
The Docker Compose services will load automatically on start-up. The unit may require a reboot to ensure the camera pipeline starts properly.
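A hedged sketch of the reboot and a quick check afterwards (run on the target; `docker ps` simply lists the running containers so you can confirm the services came back up):

```shell
# Reboot the target so the camera pipeline and services start cleanly
sudo reboot

# After the unit comes back up, SSH in again and confirm
# the compose services are running
docker ps
```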
View the website at http://verdin-imx8mp-xxxxxx, where the hostname matches your unit.