[WSC18] Phone Gesture Recognition with Machine Learning

3 Replies

Congratulations! This post is now a Staff Pick as distinguished by a badge on your profile! Thank you, keep it coming!

POSTED BY: EDITORIAL BOARD

Thanks for taking the time to look at my Wolfram Summer Camp project - I hope you enjoyed it.

A few notes:

  1. The device I'm using here is an Android phone. I did, however, develop a platform-agnostic Channel[]-based method for sensor data collection, detailed in Part 1 (http://community.wolfram.com/groups/-/m/t/1386358), and there is some documentation for doing a similar thing with iDevices.
  2. When you refer to 'interpreting gestures', do you mean accelerometer-based / air gestures or touch gestures (Googling 'gesture recognition' yields many results for the latter and few for the former)? If Android / Apple have released a library for accelerometer-based gestures, do post the link here - I'd be interested to see it. (N.b. I'm using accelerometer sensors, not GPS.)
  3. Your sound-wave-based suggestion sounds quite interesting - I might take a look at Mathematica's sound-processing capabilities (e.g. this seems interesting for determining when a gesture starts/stops: https://www.wolfram.com/language/11/computational-audio/transient-detection.html).
  4. I'm not completely certain, however, about the viability of a non-'computer learning' (i.e. non-machine-learning) approach to waveform classification - even if I converted the data to sound waves, as I understand it I would still have to use some machine learning model to classify the waveforms (time series classification, e.g. with hidden Markov models or recurrent neural networks, both of which would also work with the raw accelerometer data simply presented as a time series). Again, if the Wolfram Language has something (built-in or otherwise) that could help with this, please do let me know.
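
For what it's worth, the machine-learning route in point 4 can be sketched in a few lines of Wolfram Language. This is a minimal, hypothetical example (the names `recordings`, `labels`, and `newRecording` are assumptions, not from my actual project): it reduces each accelerometer recording to simple summary statistics and hands them to Classify, rather than doing true sequence modelling with HMMs or RNNs.

```
(* Hypothetical sketch: classify gestures from raw accelerometer data.
   "recordings" is assumed to be a list of n x 3 numeric arrays
   (x, y, z acceleration over time); "labels" the matching gesture names. *)
features[ts_] := Flatten[{
   Mean[ts],                              (* per-axis mean *)
   StandardDeviation[ts],                 (* per-axis spread *)
   Max /@ Transpose[ts] - Min /@ Transpose[ts]  (* per-axis range *)
}]

classifier = Classify[Thread[(features /@ recordings) -> labels]];

classifier[features[newRecording]]  (* predicted gesture label *)
```

A fixed feature vector like this sidesteps the variable-length-sequence problem, at the cost of throwing away temporal ordering; swapping `features` for a proper sequence model (e.g. a NetChain with a recurrent layer) would be the next step.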

Kind regards,

Euan

POSTED BY: Euan Ong
Anonymous User
Posted 7 years ago

This is very neat, and analysis in the Wolfram Language can make it scientific (although the iPhone is not a scientific instrument).

However, as far as interpreting gestures: the iPhone has that built in. The methods are patented, but Apple allows the public to use its gesture-detection libraries for free (likewise, I think Android has released free code for its library).

If your goal was iPhone gesture recognition, I believe real-time recognition is already done and patented, but available to use.

But there's nothing wrong with downloading the raw data and re-recognizing it yourself.

However, as far as Mathematica's learning being slow and inaccurate: sound recognition is a major job. Mathematica has a large Sound section that's good (this still applies to your signal, even though your waveform is not in that form). Sound cards have special recognition components that do things a CPU cannot keep up with, and the best waveform analyzers cost tens of thousands.

I'm unsure whether, if you want quick waveform detection, you should be using the "computer learning" feature. Perhaps you should convert the data to sound-wave form and check your options that way.

POSTED BY: Anonymous User