Tag Archives: gesture accuracy measures

ICMI 2013 talk posted, and toolkit released!

Last week at the ICMI 2013 conference in Sydney, Australia, I presented work done in collaboration with my co-authors Radu-Daniel Vatavu and Jacob O. Wobbrock on new ways of understanding how users make stroke gestures (for example, with a stylus or finger on touchscreen devices), through the 12 “Relative Accuracy Measures for Stroke Gestures” that our paper introduced. The paper has the details on the measures themselves and how they are derived; the talk focuses on what these kinds of measures can be used for and how they can help us design and build better gesture interactions. For those interested, my presentation slides are available here.

We have also released an open-source toolkit called “GREAT” (Gesture RElative Accuracy Toolkit), which you can use to compute the measures on your own datasets. Download it here!


Filed under Talk / Presentation

Paper on relative gesture accuracy measures accepted to ICMI 2013!

The $-family of gesture recognizers project has been working on more ways to characterize patterns in how users make gestures, in the form of new gesture accuracy measures (see our GI 2013 paper for the first set of measures we developed). This new set focuses on relative accuracy, or the degree to which two gestures match locally, rather than simply on global absolutes. Our paper introducing these measures has been accepted to ICMI 2013! My co-authors are Radu-Daniel Vatavu and Jacob O. Wobbrock. Here is the abstract:

Current measures of stroke gesture articulation lack descriptive power because they only capture absolute characteristics about the gesture as a whole, not fine-grained features that reveal subtleties about the gesture articulation path. We present a set of twelve new relative accuracy measures for stroke gesture articulation that characterize the geometric, kinematic, and articulation accuracy of single and multistroke gestures. To compute the accuracy measures, we introduce the concept of a gesture task axis. We evaluate our measures on five public datasets comprising 38,245 samples from 107 participants, about which we make new discoveries; e.g., gestures articulated at fast speed are shorter in path length than slow or medium-speed gestures, but their path lengths vary the most, a finding that helps understand recognition performance. This work will enable a better understanding of users’ stroke gesture articulation behavior, ultimately leading to better gesture set designs and more accurate recognizers.
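To make the “gesture task axis” idea a bit more concrete, here is a rough Python sketch of one relative measure in the spirit of the paper: resample both the articulated gesture and its task axis to the same number of points, then take the mean point-to-point distance as a shape error. This is not the GREAT toolkit’s code or API, and it simplifies the measures actually defined in the paper; function names and the fixed resampling count are illustrative assumptions.

```python
# Illustrative sketch only (not the GREAT toolkit): a simplified "shape error"
# between a gesture and its task axis, both resampled to n equidistant points.
import math

def resample(points, n=64):
    """Resample a stroke (list of (x, y) tuples) to n roughly equidistant points."""
    dists = [math.dist(points[i - 1], points[i]) for i in range(1, len(points))]
    interval = sum(dists) / (n - 1)
    resampled = [points[0]]
    pts = list(points)
    d_accum = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and d_accum + d >= interval:
            t = (interval - d_accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)      # continue walking the path from the new point
            d_accum = 0.0
        else:
            d_accum += d
        i += 1
    while len(resampled) < n:      # guard against floating-point shortfall
        resampled.append(points[-1])
    return resampled[:n]

def shape_error(gesture, task_axis, n=64):
    """Mean Euclidean distance between corresponding resampled points of the
    gesture and its task axis (an assumed, simplified relative measure)."""
    g = resample(gesture, n)
    a = resample(task_axis, n)
    return sum(math.dist(p, q) for p, q in zip(g, a)) / n
```

The point of a measure like this is that it is local: it penalizes where along the path the articulation deviates from the intended axis, rather than summarizing the whole gesture with a single global statistic.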

I’ll be at ICMI 2013 in Sydney, Australia, in December (summer down under!) to present the paper. Come ask me about the details! In the meantime, check out the camera-ready version of our paper here.


Filed under Publication