Monthly Archives: August 2012

Paper on new $P recognizer accepted to ICMI 2012!

My co-authors Radu-Daniel Vatavu and Jacob O. Wobbrock and I have had a paper accepted to ICMI 2012, titled “Gestures as Point Clouds: A $P Recognizer for User Interface Prototypes,” in which we introduce $P, the latest member of the $-family of gesture recognizers. $P handles multistroke and unistroke gestures alike with high accuracy, and remedies $N’s main limitation: the cost of storing and matching against every possible stroke-order and stroke-direction permutation of a multistroke gesture.
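
For a sense of the cost $P avoids: $N represents a multistroke by generating every unistroke reading of it, i.e., every stroke ordering combined with every per-stroke drawing direction, which comes to S! × 2^S permutations for a gesture of S strokes. The toy Python snippet below just tallies that growth; it is an illustration of the combinatorics, not code from either recognizer.

```python
import math

def n_unistroke_permutations(strokes):
    # Every ordering of the strokes (S!) times every choice of
    # drawing direction for each stroke (2^S).
    return math.factorial(strokes) * 2 ** strokes

for s in range(1, 6):
    print(s, n_unistroke_permutations(s))
# -> 1: 2, 2: 8, 3: 48, 4: 384, 5: 3840
```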

Here is the abstract:

Rapid prototyping of gesture interaction for emerging touch platforms requires that developers have access to fast, simple, and accurate gesture recognition approaches. The $-family of recognizers ($1, $N) addresses this need, but the current most advanced of these, $N-Protractor, has significant memory and execution costs due to its combinatoric gesture representation approach. We present $P, a new member of the $-family, that remedies this limitation by considering gestures as clouds of points. $P performs similarly to $1 on unistrokes and is superior to $N on multistrokes. Specifically, $P delivers >99% accuracy in user-dependent testing with 5+ training samples per gesture type and stays above 99% for user-independent tests when using data from 10 participants. We provide a pseudocode listing of $P to assist developers in porting it to their specific platform and a “cheat sheet” to aid developers in selecting the best member of the $-family for their specific application needs.
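
If you would like a feel for the algorithm before opening the paper: below is a minimal Python sketch of the greedy point-cloud matching at the core of $P, adapted from the paper’s pseudocode. It assumes each gesture has already been preprocessed into a fixed-length cloud (resampled to n points, scaled, and translated to the origin); the function and variable names here are mine.

```python
import math

N = 32  # points per gesture after resampling (the paper uses n = 32)

def cloud_distance(pts1, pts2, start):
    """Greedily match each point of pts1 (walked from index `start`) to
    its nearest still-unmatched point of pts2, with linearly decreasing
    weights so that early matches count more."""
    n = len(pts1)
    matched = [False] * n
    total = 0.0
    i = start
    while True:
        best_d, best_j = float("inf"), -1
        for j in range(n):
            if not matched[j]:
                d = math.dist(pts1[i], pts2[j])
                if d < best_d:
                    best_d, best_j = d, j
        matched[best_j] = True
        weight = 1.0 - ((i - start + n) % n) / n
        total += weight * best_d
        i = (i + 1) % n
        if i == start:
            return total

def greedy_cloud_match(points, template):
    """Try ~sqrt(n) starting alignments, in both matching directions,
    and keep the smallest cloud distance found."""
    n = len(points)
    step = max(1, int(n ** 0.5))  # the paper's n^(1 - eps) with eps = 0.5
    best = float("inf")
    for start in range(0, n, step):
        best = min(best,
                   cloud_distance(points, template, start),
                   cloud_distance(template, points, start))
    return best

def recognize(points, templates):
    """Return (name, distance) of the closest template. `points` and
    every template are equal-length, preprocessed point clouds."""
    result, score = None, float("inf")
    for name, tpl in templates:
        d = greedy_cloud_match(points, tpl)
        if d < score:
            result, score = name, d
    return result, score
```

Note that because $P matches clouds rather than ordered paths, stroke count, stroke order, and stroke direction all fall out of the representation for free, which is exactly what makes the permutation bookkeeping above unnecessary.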

You can find the camera-ready version of the paper here. Try out $P online in your browser here!

New $N multistroke gesture set released!

I am pleased to announce that the Mixed Multistroke Gestures (MMG) dataset from our GI 2012 paper is now publicly available for download! It contains samples from 20 people who entered each of 16 gesture types 10 times at each of three speeds (slow, medium, fast), using either a finger or a stylus on a Tablet PC, for a total of 9,600 samples. The samples are stored in the $N Recognizer’s data format, and each person’s samples are separated into user-speed sub-folders. For more details on the gestures, the users who entered them, and $N’s accuracy in recognizing them, see our GI 2012 paper. You may download the dataset here. If you use it in your work, please cite us!
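
If you want to script over the dataset, here is a minimal Python loader for one sample file. The <Gesture>/<Stroke>/<Point> element layout and the X/Y attribute names below are a sketch of the $N data format, not a specification, so verify them against an actual file once you have downloaded the data.

```python
import xml.etree.ElementTree as ET

def load_sample(path):
    """Parse one sample file into (gesture name, list of strokes).

    Assumes a <Gesture> root holding <Stroke> children, each holding
    <Point> elements with X/Y coordinate attributes. These names are
    illustrative; check them against the downloaded files.
    """
    root = ET.parse(path).getroot()
    strokes = []
    for stroke in root.findall("Stroke"):
        points = [(float(pt.get("X")), float(pt.get("Y")))
                  for pt in stroke.findall("Point")]
        strokes.append(points)
    return root.get("Name"), strokes
```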
