An example of how to integrate the Gesture Recognition Toolkit into an iPhone app
The Gesture Recognition Toolkit is a “cross-platform, open-source, C++ machine learning library designed for real-time gesture recognition”. This repository contains the project outlined in my blog posts, *Integrating the GRT into an iPhone app* and *Machine-Learning powered Gesture Recognition on iOS*.
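As a rough illustration of what the integration involves, here is a minimal sketch of a GRT pipeline for accelerometer gestures. It assumes a Dynamic Time Warping (DTW) classifier and a pre-recorded training file; the file name, the gesture length, and the choice of DTW are placeholders for illustration, not necessarily the exact setup used in the blog posts.

```cpp
// Minimal GRT pipeline sketch: train a DTW classifier on labeled
// accelerometer time series, then classify a new gesture.
#include <GRT/GRT.h>
#include <iostream>
using namespace GRT;

int main() {
    // Load labeled time-series data recorded from the phone's accelerometer
    // ("TrainingData.grt" is a placeholder file name)
    TimeSeriesClassificationData trainingData;
    if( !trainingData.load( "TrainingData.grt" ) ){
        std::cout << "Failed to load training data!\n";
        return EXIT_FAILURE;
    }

    // Build a pipeline around a Dynamic Time Warping classifier
    GestureRecognitionPipeline pipeline;
    pipeline.setClassifier( DTW() );

    if( !pipeline.train( trainingData ) ){
        std::cout << "Failed to train the pipeline!\n";
        return EXIT_FAILURE;
    }

    // Classify a new gesture: one row per accelerometer sample (x, y, z).
    // In a real app this matrix would be filled with live sensor values.
    MatrixFloat gesture( 100, 3 );
    if( pipeline.predict( gesture ) ){
        UINT label = pipeline.getPredictedClassLabel();
        std::cout << "Predicted gesture: " << label << std::endl;
    }
    return EXIT_SUCCESS;
}
```

On iOS, C++ code like this is typically wrapped in an Objective-C++ (`.mm`) class so it can be called from the Swift or Objective-C side of the app.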
Here are some of the gestures I was able to train the system to
recognize:
SwiftR is used for visualizing the accelerometer data.
Since I worked on this project, there have been a lot of advancements in Apple’s CoreML framework, including the ability to create Motion Activity Classifiers using CreateML. If you’re looking to build a gesture recognition system for iOS from scratch, I would recommend taking a look at the linked Apple talk.
Additional resources: