Tag Archives: $-family

C# implementation of the $P recognizer now available; online demo in JavaScript!

We have recently made available a reference implementation of our $P recognizer in C#, which you can find on the $P project page. This version complements our original online demo and implementation in JavaScript, both of which remain available as well. When you download the C# .zip file, you receive (1) a DLL containing just the recognizer, which you can use in your own C# applications, (2) a canvas drawing-and-recognition demo in C#, equivalent to the online JavaScript demo, and (3) a “How To” document explaining how to incorporate these components into your own projects. Try it out and let us know how it goes!
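
For a quick sense of how the recognizer might be wired into an application, here is a minimal sketch. The type and method names below (Point, Gesture, PointCloudRecognizer.Classify) are assumptions rather than the definitive API, so consult the bundled “How To” document for the exact types the DLL exposes:

```csharp
// Hypothetical usage sketch -- the type names (Point, Gesture, PointCloudRecognizer)
// are assumptions; see the "How To" document in the .zip for the actual API.
using System;
using PDollarGestureRecognizer;

class PDollarDemo
{
    static void Main()
    {
        // Templates: each gesture is a set of points tagged with a stroke id and a name.
        Gesture[] templates =
        {
            new Gesture(new[]
            {
                new Point(0, 0, 0), new Point(50, 50, 0),   // first stroke of "X"
                new Point(50, 0, 1), new Point(0, 50, 1)    // second stroke of "X"
            }, "X"),
            new Gesture(new[]
            {
                new Point(0, 0, 0), new Point(50, 0, 0), new Point(50, 50, 0),
                new Point(0, 50, 0), new Point(0, 0, 0)     // single-stroke rectangle
            }, "rectangle")
        };

        // Candidate drawn by the user; $P ignores stroke order and direction.
        Gesture candidate = new Gesture(new[]
        {
            new Point(2, 1, 0), new Point(48, 52, 0),
            new Point(49, 2, 1), new Point(1, 49, 1)
        });

        string match = PointCloudRecognizer.Classify(candidate, templates);
        Console.WriteLine("Recognized as: " + match);
    }
}
```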

A reminder: if you implement $P in a new language or in a new way, feel free to let us know and we will link to it from our page as well! Don’t forget to cite us!

Filed under Software / Data

New $N multistroke gesture set released!

I am pleased to announce that the Mixed Multistroke Gestures (MMG) dataset from our GI 2012 paper is now publicly available for download! It contains samples from 20 people who entered each of 16 gesture types 10 times, using either a finger or a stylus on a Tablet PC, at three different speeds (slow, medium, fast), for a total of 9,600 samples. The samples are stored in the $N Recognizer’s data format, and each person’s samples are separated into user-speed sub-folders. For more details on the gestures, the users who entered them, and $N’s accuracy in recognizing them, see our GI 2012 paper. You may download the dataset here. If you use it in your work, please cite us!
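
If you want to process the samples programmatically, the sketch below shows one way a file might be read in C#. The XML element and attribute names (Gesture, Stroke, Point, X, Y, T) are assumptions about the layout, so inspect the files in the download for the actual schema:

```csharp
// Sketch of reading one MMG sample file, assuming a layout roughly like
// <Gesture Name="..."><Stroke><Point X="..." Y="..." T="..."/>...</Stroke>...</Gesture>.
// Element/attribute names are assumptions; check the dataset's own files for the real schema.
using System;
using System.Linq;
using System.Xml.Linq;

class MmgLoader
{
    static void Main(string[] args)
    {
        XElement gesture = XDocument.Load(args[0]).Root;  // path to one sample .xml file
        string name = (string)gesture.Attribute("Name");

        var strokes = gesture.Elements("Stroke")
            .Select(stroke => stroke.Elements("Point")
                .Select(p => new
                {
                    X = (double)p.Attribute("X"),
                    Y = (double)p.Attribute("Y"),
                    T = (long)p.Attribute("T")   // timestamp, if present
                })
                .ToList())
            .ToList();

        Console.WriteLine("{0}: {1} strokes, {2} points",
                          name, strokes.Count, strokes.Sum(s => s.Count));
    }
}
```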

Filed under Software / Data

GI 2012 talk posted!

Last week at GI 2012, I presented our work on enhancing our $N multistroke recognizer by integrating Yang Li’s Protractor matching method. The presentation was more algorithm-focused than most of the HCI talks at the conference (and most of my other work), but it still generated a few interesting questions. The slides for the talk are posted here. Stay tuned for future enhancements to the $-family of recognizers!

Filed under Talk / Presentation

Paper accepted to Graphics Interface 2012 as a Note!

My colleague Jacob O. Wobbrock and I have had a paper accepted to the Graphics Interface 2012 conference! We extend our $N multistroke recognizer to use the closed-form matching method of Yang Li’s Protractor, speeding up the matching process significantly. The paper is titled “$N-Protractor: A Fast and Accurate Multistroke Recognizer,” and here is the abstract:

Prior work introduced $N, a simple multistroke gesture recognizer based on template matching, intended to be easy to port to new platforms for rapid prototyping, and derived from the unistroke $1 recognizer. $N uses an iterative search method to find the optimal angular alignment between two gesture templates, like $1 before it. Since then, Protractor has been introduced, a unistroke pen and finger gesture recognition algorithm also based on template-matching and $1, but using a closed-form template-matching method instead of an iterative search method, considerably improving recognition speed over $1. This paper presents work to streamline $N with Protractor by using Protractor’s closed-form matching approach, and demonstrates that similar speed benefits occur for multistroke gestures from datasets from multiple domains. We find that the Protractor enhancements are over 91% faster than the original $N, and negligibly less accurate (<0.2%). We also discuss the impact that the number of templates, the input speed, and input method (e.g., pen vs. finger) have on recognition accuracy, and examine the most confusable gestures.

Check out the camera ready version here. For the pseudocode of this method, see the $N project website.
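
To give a flavor of the closed-form idea (the authoritative pseudocode is on the $N project website), the best angular alignment between two resampled, normalized gesture vectors can be computed directly instead of being searched for iteratively. A rough C# sketch:

```csharp
// Sketch of Protractor-style closed-form matching -- the core idea the paper
// substitutes for $1/$N's iterative Golden Section Search. Assumes both gestures
// have already been resampled to the same number of points, translated to the
// centroid, and flattened into length-normalized vectors (x0, y0, x1, y1, ...).
using System;

static class ClosedFormMatcher
{
    // Returns the minimum angular distance between the two gesture vectors,
    // over all rotations, computed in closed form.
    public static double OptimalCosineDistance(double[] v1, double[] v2)
    {
        double a = 0.0, b = 0.0;
        for (int i = 0; i < v1.Length; i += 2)
        {
            a += v1[i] * v2[i] + v1[i + 1] * v2[i + 1];
            b += v1[i] * v2[i + 1] - v1[i + 1] * v2[i];
        }

        // The rotation maximizing cosine similarity falls out directly,
        // with no iterative search over candidate angles.
        double angle = Math.Atan2(b, a);
        double similarity = a * Math.Cos(angle) + b * Math.Sin(angle);
        similarity = Math.Max(-1.0, Math.Min(1.0, similarity));

        return Math.Acos(similarity);  // smaller is a better match
    }
}
```

A candidate is compared against every stored template this way, and the template with the smallest distance wins.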

Filed under Publication