Gesture recognition success!

This week we got the program to recognize different gestures and trigger different actions for each of them. We used the training data provided on GitHub and modified it to fit our setup. With gesture recognition working, we started connecting the image input to the mouse/touchpoint behavior. We have an initial version of that connection, but it still needs testing and debugging.
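To give a rough idea of what the gesture-to-mouse connection looks like, here is a minimal Python sketch. The gesture labels, the handle_gesture function, and the use of the pyautogui library for mouse control are illustrative assumptions for this example, not our actual implementation.

# Minimal sketch (assumed interface): map a recognized gesture label plus a
# tracked hand position to a mouse/touch action, using the pyautogui library.
import pyautogui

def handle_gesture(gesture, x, y):
    """Translate one recognized gesture into a mouse/touch action."""
    if gesture == "point":
        pyautogui.moveTo(x, y)      # follow the tracked hand with the cursor
    elif gesture == "tap":
        pyautogui.click(x, y)       # treat a tap gesture as a mouse click
    elif gesture == "swipe_up":
        pyautogui.scroll(200)       # scroll up on an upward swipe
    elif gesture == "swipe_down":
        pyautogui.scroll(-200)      # scroll down on a downward swipe

# Example: feed one recognizer output into the handler.
handle_gesture("point", 640, 360)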

Meanwhile, we started looking for hardware to mount the Kinect on the touch table. We currently have several candidate mounting solutions, which still need to be measured and evaluated before we settle on a final design.

We have also started working on a test plan, which we expect to provide to the committee for review at QRB2.

[Image: Gesture Recognition Example]

We’ll build our first prototype in a few weeks. Stay tuned!
