The $-family of gesture recognizers project has been exploring more ways to characterize patterns in how users make gestures, in the form of new gesture accuracy measures (see our GI 2013 paper for the first set of measures we developed). This new set focuses on relative accuracy, or the degree to which two gestures match locally, rather than simply in terms of global absolutes. Our paper introducing these measures has been accepted to ICMI 2013! My co-authors are Radu-Daniel Vatavu and Jacob O. Wobbrock. Here is the abstract:
Current measures of stroke gesture articulation lack descriptive power because they only capture absolute characteristics about the gesture as a whole, not fine-grained features that reveal subtleties about the gesture articulation path. We present a set of twelve new relative accuracy measures for stroke gesture articulation that characterize the geometric, kinematic, and articulation accuracy of single and multistroke gestures. To compute the accuracy measures, we introduce the concept of a gesture task axis. We evaluate our measures on five public datasets comprising 38,245 samples from 107 participants, about which we make new discoveries; e.g., gestures articulated at fast speed are shorter in path length than slow or medium-speed gestures, but their path lengths vary the most, a finding that helps understand recognition performance. This
work will enable a better understanding of users’ stroke gesture articulation behavior, ultimately leading to better gesture set designs and more accurate recognizers.
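To give a rough sense of what a relative accuracy measure looks like in practice, here is a minimal sketch of a shape-error-style measure: resample both the gesture and its task axis (the ideal path the user was asked to produce) to the same number of points, then average the point-to-point deviations. This is my own illustration in the style of the $1 recognizer's resampling step, not the paper's actual definitions; the function names and the 64-point count are assumptions.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def resample(points, n=64):
    # Resample a stroke to n points spaced equally along its path,
    # in the style of the $1 recognizer's preprocessing step.
    pts = list(points)
    path_len = sum(_dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path_len / (n - 1)
    out = [pts[0]]
    acc = 0.0
    i = 1
    while i < len(pts):
        d = _dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:       # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def shape_error(gesture, task_axis, n=64):
    # Mean point-to-point deviation between a gesture and the ideal
    # path (task axis), after resampling both to n corresponding points.
    g, t = resample(gesture, n), resample(task_axis, n)
    return sum(_dist(p, q) for p, q in zip(g, t)) / n
```

A gesture traced exactly on the task axis scores zero; the more the articulated path wanders from the ideal one, the larger the error.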
I’ll be at ICMI 2013 in Sydney, Australia, in December (summer down under!) to present the paper. Come ask me about the details! In the meantime, check out the camera-ready version of our paper here.
This past week the Interaction Design for Children (IDC) conference was held in New York City! It was a great conference covering lots of cutting-edge research and design work in the area of children interacting with technology, ranging from exercise games to educational applications to technology for kids with special needs. There were also great demos of some of the exciting projects.
I presented a paper on the MTAGIC project’s findings related to the impact of visual feedback on gesture interaction for kids. For those interested, check out the slides. UMBC PhD student Germaine Irwin presented the MTAGIC project’s poster on the use of gamification elements to encourage children to stay focused during empirical studies. She did a great job on the madness talk (only 15 seconds!) and discussing the poster with interested attendees.
Next year IDC 2014 will be held in Aarhus, Denmark, home of LEGO!
I’m pleased to announce that the MTAGIC project has had a poster accepted to the Interaction Design and Children (IDC) 2013 conference coming up this month in New York City! This poster paper was led by UMBC Human-Centered Computing (HCC) PhD student Robin Brewer. While doing an independent study with me, Robin investigated ways of motivating young children (ages 5 to 7) to complete activities during empirical studies. Her initial explorations showed that this age group found the tasks boring and tedious, even though older kids and adults had completed them without a problem. ‘Gamifying’ the tasks by adding a points-based reward structure along with physical prizes encouraged the kids to enthusiastically complete the activities. We recommend considering such gamification components for empirical studies with this age group. You can read the abstract below. For more details, see the paper. Come check out our poster if you’ll be at the conference!
In this paper, we describe the challenges we encountered and solutions we developed while collecting mobile touch and gesture interaction data in laboratory conditions from children ages 5 to 7 years old. We identify several challenges of conducting empirical studies with young children, including study length, motivation, and environment. We then propose and validate techniques for designing study protocols for this age group, focusing on the use of gamification components to better engage children in laboratory studies. The use of gamification increased our study task completion rates from 73% to 97%. This research contributes a better understanding of how to design study protocols for young children when lab studies are needed or preferred. Research with younger age groups alongside older children, adults, and special populations can lead to more sound guidelines for universal usability of mobile applications.
The $-family of recognizers isn’t just about building better recognition algorithms; it’s also about understanding patterns and inconsistencies in how people make gestures. This kind of knowledge helps inform gesture interaction both in terms of developing better recognizers and designing appropriate gesture sets. In this vein, my collaborators Jacob O. Wobbrock and Radu-Daniel Vatavu and I have had a paper accepted to the Graphics Interface 2013 conference on characterizing patterns in people’s execution of surface gestures from existing datasets. The paper is titled “Understanding the Consistency of Users’ Pen and Finger Stroke Gesture Articulation,” and here is the abstract:
Little work has been done on understanding the articulation patterns of users’ touch and surface gestures, despite the importance of such knowledge to inform the design of gesture recognizers and gesture sets for different applications. We report a methodology to analyze user consistency in gesture production, both between-users and within-user, by employing articulation features such as stroke type, stroke direction, and stroke ordering, and by measuring variations in execution with geometric and kinematic gesture descriptors. We report results on four gesture datasets (40,305 samples of 63 gesture types by 113 users). We find a high degree of consistency within-users (.91), lower consistency between-users (.55), higher consistency for certain gestures (e.g., less geometrically complex shapes are more consistent than complex ones), and a loglinear relationship between number of strokes and consistency. We highlight implications of our results to help designers create better surface gesture interfaces informed by user behavior.
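For a concrete sense of the kinds of descriptors the abstract mentions, here is a minimal sketch (my own illustration, not code from the paper) of one geometric descriptor (path length), one kinematic descriptor (average speed), and a simple coefficient-of-variation stand-in for measuring how much a descriptor varies across repeated articulations of the same gesture. The paper’s actual consistency measures are more involved; this just shows the flavor.

```python
import math
from statistics import mean, stdev

def path_length(points):
    # Geometric descriptor: total Euclidean length of the stroke's path.
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

def average_speed(points, timestamps):
    # Kinematic descriptor: path length divided by articulation time.
    duration = timestamps[-1] - timestamps[0]
    return path_length(points) / duration if duration > 0 else 0.0

def variation(values):
    # Coefficient of variation of a descriptor across repeated
    # articulations of the same gesture: lower means more consistent.
    m = mean(values)
    return stdev(values) / m if m else 0.0
```

Computing such descriptors over every sample of a gesture type, per user and across users, is one simple way to start quantifying within-user versus between-user variation.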
As usual, you may download the camera-ready version of our paper if you are interested. See you in Regina!
February was a great month over here with lots of good news coming in about conference and journal paper acceptances! The MTAGIC project will be well-represented at the upcoming CHI 2013 conference, with two workshop papers accepted on different aspects of the project (the workshops are RepliCHI and Mobile Accessibility). We’ve also heard great news that two papers about our work with kids and mobile touchscreen devices will appear at IDC 2013 and in an upcoming special issue of the Journal of Personal and Ubiquitous Computing!
In other news, my project with Jacob O. Wobbrock and Radu-Daniel Vatavu on using patterns in how people make surface gestures to inform the design of better gesture sets and gesture recognizers (e.g., the $-family of recognizers) will appear at GI 2013. And, last but not least, my side project with Leah Findlater on understanding how people with physical impairments, including children, are using mainstream mobile touchscreen devices in their daily lives will receive a ‘Best Paper Award’ at CHI 2013! This is an honor only the top 1% of submissions receive, and we are thrilled our work was selected to be among such great company.
Look for more details on each of these upcoming papers in blog posts throughout March and April, and you can already see them listed in my current CV if you are interested.