FaceFollower

While home from university I found my LEGO Mindstorms set and decided to build something with it. Having recently been on quite a few video calls, and wanting to build something useful, I came up with the idea of building a webcam that keeps my face in frame. A FaceFollower.

Demo

Facial Detection

I started with the facial detection software, since I thought that would be the most difficult part.

I started by following a tutorial on how to use Azure Computer Vision. It seemed simple and easy to use while still being powerful; after all, a Microsoft data center has a bit more power than my desktop PC. But I immediately hit a problem: with the free tier I could only make an API call once every three seconds. And it’s not like I was going to pay for anything.

But I continued and wrote it to completion, and only then noticed exactly how slow it actually was. So I decided that while Azure’s facial detection was very powerful, it wasn’t suited for live detection of this kind. I mean, I could get a person’s emotions from that API, but all I wanted was a box around their face.

So, I needed a new approach. I had heard of OpenCV before and began looking for a way to use it from C#, and found Emgu CV, a .NET wrapper for OpenCV.

With Emgu CV and some pre-trained facial detection files from OpenCV’s GitHub page I could create a facial detection program that detected my face in near real time. It was much less accurate than the Azure implementation (failing to detect my face in some lighting conditions), but with the increased speed it was far superior for my purpose.
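The detection loop boils down to loading one of OpenCV’s pre-trained Haar cascade files and running it over each frame. A minimal sketch of that, assuming Emgu CV is referenced and `haarcascade_frontalface_default.xml` (from OpenCV’s GitHub repository) sits next to the executable — the class and file names here are illustrative, not from the actual project:

```csharp
using System.Drawing;
using Emgu.CV;
using Emgu.CV.CvEnum;

class FaceDetector
{
    // Pre-trained Haar cascade, downloadable from OpenCV's GitHub repository
    private readonly CascadeClassifier cascade =
        new CascadeClassifier("haarcascade_frontalface_default.xml");

    // Returns the bounding boxes of all faces found in the frame
    public Rectangle[] Detect(Mat frame)
    {
        using (var gray = new Mat())
        {
            // Haar cascades operate on grayscale images
            CvInvoke.CvtColor(frame, gray, ColorConversion.Bgr2Gray);
            // Scale factor and neighbour count trade accuracy for speed
            return cascade.DetectMultiScale(gray, 1.1, 4);
        }
    }
}
```

Feeding it frames from a `VideoCapture` and drawing the returned rectangles is enough to see the near-real-time detection in action.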

Robotic Camera Mount

When designing the LEGO robot, I wanted to keep the motors away from where the action was happening. The first version, with the motors below the turning tower, had a problem: when I tilted the tower, it would also turn. This was because the gear attached to the tilting tower, the one that turned the tower, “moved” relative to its counterpart on the base.

To remedy this, without trying to run the motors in sync, I used a differential gear (LEGO piece 6573). That allowed one motor to rotate both sets of gears while the other could rotate only one of them. But another problem arose: the two sets of gears didn’t turn at the same rate.

To solve that, I had two custom gears 3D-printed that were supposed to have the right ratio to make the sets turn at the same speed. But my “calculations”, more of a qualified guess really, were wrong, and the two gear sets still turned at different speeds.

So, I gave up on trying to separate the motors from the “action” and built the final design. Now the motor that turns the tower sits below it, and the one that tilts the tower sits on it.

Although the base and tower have gone through many iterations, the camera holder has only received small changes to fit the base and is largely the same as when I first built it.

Controlling the Robot

Before starting on anything else, I searched the internet for a library to control my Mindstorms robot with, and found that AForge.NET had just that functionality.

Controlling the Mindstorms robot quickly became a trial-and-error process. The options I could set for turning the motors were nowhere near as precise as when programming with the official LEGO software. Instead of specifying the number of degrees to turn, I could only set a number of “tachos”. Sometimes a “tacho” seemed analogous to a degree, but other times telling the motor to turn 180 “tachos” would make it turn two complete revolutions.
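In AForge.NET you drive a motor by sending it a motor state, with the “tacho” count as a limit on how far it runs. A rough sketch of turn and stop commands, assuming AForge.Robotics is referenced and the NXT is paired over a Bluetooth serial port — the class name and port are mine, and the `MotorState` fields are my reading of AForge’s API, not code from the project:

```csharp
using AForge.Robotics.Lego;

class RobotControl
{
    private readonly NXTBrick brick = new NXTBrick();

    // e.g. "COM8" for a Bluetooth serial port on Windows
    public bool Connect(string port) => brick.Connect(port);

    // Turn the given motor by a fixed number of "tachos";
    // how far that actually is was a matter of trial and error
    public void Turn(NXTBrick.Motor motor, int power, int tachos)
    {
        var state = new NXTBrick.MotorState
        {
            Power = power,
            Mode = NXTBrick.MotorMode.On,
            RunState = NXTBrick.MotorRunState.Running,
            TachoLimit = tachos
        };
        brick.SetMotorState(motor, state);
    }

    // Stop by sending an idle state with zero power
    public void Stop(NXTBrick.Motor motor)
    {
        var state = new NXTBrick.MotorState
        {
            Power = 0,
            RunState = NXTBrick.MotorRunState.Idle,
            TachoLimit = 0
        };
        brick.SetMotorState(motor, state);
    }
}
```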

After building the first version of the robot I settled on a simpler way of controlling it. When the facial detection saw that my face was near the edge of the frame, I would tell the robot to turn an arbitrary number of “tachos”. When the facial detection either lost the face (hopefully because of motion blur and not because it had left the frame) or saw the face in the middle of the frame, I would send a stop command to the robot. That way I never had to calculate how many degrees to turn; I only had to hope the robot obeyed my stop commands in time.
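That stop/turn rule is just a comparison of the face’s bounding box against margins at the frame edges. A minimal sketch of the decision, with made-up names and thresholds for illustration:

```csharp
using System;
using System.Drawing;

enum PanCommand { None, TurnLeft, TurnRight, Stop }

static class Tracker
{
    // Fraction of the frame width treated as the "edge" zone (illustrative value)
    const double EdgeMargin = 0.2;
    // Fraction around the centre where the face counts as centred (illustrative value)
    const double CenterMargin = 0.1;

    // face is the detected bounding box, or null when no face was found
    public static PanCommand Decide(Rectangle? face, int frameWidth)
    {
        if (face == null)
            return PanCommand.Stop;          // lost the face: stop moving

        double center = face.Value.X + face.Value.Width / 2.0;

        if (center < frameWidth * EdgeMargin)
            return PanCommand.TurnLeft;      // face near the left edge
        if (center > frameWidth * (1.0 - EdgeMargin))
            return PanCommand.TurnRight;     // face near the right edge
        if (Math.Abs(center - frameWidth / 2.0) < frameWidth * CenterMargin)
            return PanCommand.Stop;          // face centred: stop moving

        return PanCommand.None;              // in between: keep the current command
    }
}
```

The same idea applies vertically for the tilt motor, with the frame height in place of the width.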

Source

The source code and LEGO designs are available on GitHub: https://github.com/89netraM/FacecFollower.