I’m pleased to announce that the MTAGIC project has had a poster accepted to the Interaction Design and Children (IDC) 2013 conference coming up this month in New York City! This poster paper was led by UMBC Human-Centered Computing (HCC) PhD student Robin Brewer. While doing an independent study with me, Robin investigated ways of motivating young children (ages 5 to 7 years old) to complete activities during empirical studies. Her initial explorations showed that this age group found the tasks boring and tedious, even though older kids and adults had completed the same tasks without a problem. ‘Gamifying’ the tasks by adding a points-based reward structure along with physical prizes encouraged the kids to enthusiastically complete the activities. We recommend considering such gamification components for empirical studies with this age group. You can read the abstract below. For more details, see the paper. Come check out our poster if you’ll be at the conference!
In this paper, we describe the challenges we encountered and solutions we developed while collecting mobile touch and gesture interaction data in laboratory conditions from children ages 5 to 7 years old. We identify several challenges of conducting empirical studies with young children, including study length, motivation, and environment. We then propose and validate techniques for designing study protocols for this age group, focusing on the use of gamification components to better engage children in laboratory studies. The use of gamification increased our study task completion rates from 73% to 97%. This research contributes a better understanding of how to design study protocols for young children when lab studies are needed or preferred. Research with younger age groups alongside older children, adults, and special populations can lead to more sound guidelines for universal usability of mobile applications.
Over the past year on the MTAGIC project, we’ve been investigating differences in how children and adults make gestures and touch targets on mobile touchscreen devices. We designed our study tasks to reflect existing apps on the market today, and have recently examined our data to understand the impact of visual feedback on gesture interaction for kids. Our paper on this topic, “Examining the Need for Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices for Kids,” has been accepted to the Interaction Design and Children (IDC) conference! Read the abstract:
Surface gesture interaction styles used on modern mobile touchscreen devices are often dependent on the platform and application. Some applications show a visual trace of gesture input as it is made by the user, whereas others do not. Little work has been done examining the usability of visual feedback for surface gestures, especially for children. In this paper, we present results from an empirical study conducted with children, teens, and adults to explore characteristics of gesture interaction with and without visual feedback. We find that the gestures generated with and without visual feedback by users of different ages diverge significantly in ways that make them difficult to interpret. In addition, users prefer to see visual feedback. Based on these findings, we present several design recommendations for new surface gesture interfaces for children, teens, and adults on mobile touchscreen devices. In general, we recommend providing visual feedback, especially for children, wherever possible.
As usual, you can find the camera-ready version of this paper here. See you in New York City this June!
An upcoming special issue of the Springer Journal of Personal and Ubiquitous Computing (JPUC) on Educational Interfaces, Software, and Technology (EIST) will include an article on the MTAGIC project! This article, entitled “Designing Smarter Touch-Based Interfaces for Educational Contexts,” is an extension of our CHI 2012 EIST workshop paper. We report our foundational studies investigating children’s touch and gesture input patterns, and how they differ from those of adults, with some discussion of how these findings will impact the design and development of educational apps for touchscreen devices. Here is the abstract:
In next-generation classrooms and educational environments, interactive technologies such as surface computing, natural gesture interfaces, and mobile devices will enable new means of motivating and engaging students in active learning. Our foundational studies provide a corpus of over 10,000 touch interactions and nearly 7,000 gestures collected from nearly 70 adults and children ages 7 and up, which can help us understand the characteristics of children’s interactions in these modalities and how they differ from adults’. Based on these data, we identify key design and implementation challenges of supporting children’s touch and gesture interactions, and we suggest ways to address them. For example, we find children have more trouble successfully acquiring onscreen targets and having their gestures recognized than do adults, especially the youngest age group (7 to 10 years old). The contributions of this work provide a foundation that enables touch-based interactive educational apps that increase student success.
I’ll add a post when this special issue is officially published. For now, if you’re interested, you can check out the camera-ready version.
The $-family of recognizers isn’t just about building better recognition algorithms; it’s also about understanding patterns and inconsistencies in how people make gestures. This kind of knowledge will help inform gesture interaction both in terms of developing better recognizers and designing appropriate gesture sets. In this vein, my collaborators Jacob O. Wobbrock and Radu-Daniel Vatavu and I have had a paper accepted to the Graphics Interface 2013 conference on characterizing patterns in people’s execution of surface gestures from existing datasets. The paper is titled “Understanding the Consistency of Users’ Pen and Finger Stroke Gesture Articulation,” and here is the abstract:
Little work has been done on understanding the articulation patterns of users’ touch and surface gestures, despite the importance of such knowledge to inform the design of gesture recognizers and gesture sets for different applications. We report a methodology to analyze user consistency in gesture production, both between-users and within-user, by employing articulation features such as stroke type, stroke direction, and stroke ordering, and by measuring variations in execution with geometric and kinematic gesture descriptors. We report results on four gesture datasets (40,305 samples of 63 gesture types by 113 users). We find a high degree of consistency within-users (.91), lower consistency between-users (.55), higher consistency for certain gestures (e.g., less geometrically complex shapes are more consistent than complex ones), and a loglinear relationship between number of strokes and consistency. We highlight implications of our results to help designers create better surface gesture interfaces informed by user behavior.
As usual, you may download the camera-ready version of our paper if you are interested. See you in Regina!
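For readers unfamiliar with the $-family mentioned above, the core idea behind these recognizers is simple template matching over normalized strokes. Here is a minimal, illustrative sketch in the spirit of the $1 recognizer; note that it is deliberately simplified (it scales uniformly and skips $1’s rotation search), and it is not the analysis method used in the consistency paper:

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def resample(points, n=64):
    """Resample a stroke into n points spaced equally along its path."""
    pts = list(points)
    interval = sum(_dist(pts[i - 1], pts[i]) for i in range(1, len(pts))) / (n - 1)
    out, d, i = [pts[0]], 0.0, 1
    while i < len(pts):
        seg = _dist(pts[i - 1], pts[i])
        if seg > 0 and d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            d = 0.0
        else:
            d += seg
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points, n=64, size=250.0):
    """Resample, translate the centroid to the origin, rotate by the
    indicative angle, and scale uniformly into a reference box."""
    pts = resample(points, n)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    pts = [(x - cx, y - cy) for x, y in pts]
    ang = math.atan2(pts[0][1], pts[0][0])  # angle of first point vs. centroid
    c, s = math.cos(-ang), math.sin(-ang)
    pts = [(x * c - y * s, x * s + y * c) for x, y in pts]
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [(x * size / span, y * size / span) for x, y in pts]

def recognize(candidate, templates):
    """Return the name of the template whose normalized form has the
    smallest average point-to-point distance to the candidate gesture."""
    cand = normalize(candidate)
    def score(name):
        tmpl = normalize(templates[name])
        return sum(_dist(p, q) for p, q in zip(cand, tmpl)) / len(cand)
    return min(templates, key=score)
```

Because every candidate and template passes through the same normalization, the recognizer is insensitive to where, how large, and (roughly) at what orientation the gesture was drawn, which is exactly the kind of articulation variation the consistency paper measures.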
My NSF-funded MTAGIC project will appear at two workshops at the upcoming CHI 2013 conference in Paris, France! The first is the RepliCHI workshop, which focuses on what role replication studies can play in the HCI literature. In our paper “Challenges of Replicating Empirical Studies with Children in HCI,” we are presenting a series of empirical studies that we have run with different age groups over the past 18 months, each essentially replicating the same methodology. We will specifically describe how our methodology has had to be adapted to work with very young children. This will be a two-day workshop. For more information, check out the camera-ready version of our paper; here is the abstract for a quick overview:
In this paper, we discuss the challenges of conducting a direct replication of a series of mobile device usability studies that were originally conducted with adults and older children (ages 7 to 17). The original studies were designed to investigate differences in how adults and children use mobile devices to touch targets and create surface gestures. In this paper, we report on a replication we conducted with young children (ages 5 to 7). We discuss several methodological changes that were needed to elicit the same quality of data from the replication with young children as had been obtained from the older children and adults. The insights we present are relevant to the extension of empirical studies in HCI in general to younger children.
The second workshop is the Mobile Accessibility workshop, which is focusing on how to improve the accessibility of mobile devices to users with different abilities and to users in different contexts. In our paper “Towards Designing Adaptive Touch-Based Interfaces,” we are presenting our vision of how the work we’ve been doing on MTAGIC will lead to universally accessible mobile touchscreen interaction, by highlighting some of the technical extensions we believe our work points to. Again, for more information, check out our camera-ready paper, and here is the abstract:
As the use of mobile devices by non-typical users increases, so does the need for platforms that can support the unique ways in which these special users engage with them. We posit that, by developing an understanding of patterns in input behaviors for different user groups, we can design and develop interactions that support such non-typical users. We demonstrate this approach with children: we present findings from two empirical studies showing how interaction patterns differ among younger children, older children, and adults. These findings point to a model of how to develop touch-based interactive technologies that can adapt to users of different ages or abilities. Such adaptations will serve to better support natural interactions by user populations with distinctive needs.
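The abstract above describes a vision rather than an implementation, but the flavor of such an adaptation can be sketched in a few lines. Everything below (the slop values, the miss-distance heuristic, the function names) is hypothetical and for illustration only; it is not an algorithm from the MTAGIC papers:

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float
    y: float
    radius: float  # visual radius of the onscreen target, in pixels

def hit_slop(miss_distances, base_slop=8.0, max_slop=24.0):
    """Grow the invisible hit area when a user's recent taps tend to land
    outside targets; keep it at the base for accurate users."""
    if not miss_distances:
        return base_slop
    avg_miss = sum(miss_distances) / len(miss_distances)
    return min(max_slop, base_slop + avg_miss)

def is_hit(target, tap_x, tap_y, miss_distances):
    """A tap counts as a hit if it lands within the visual radius plus
    the per-user adaptive slop."""
    dx, dy = tap_x - target.x, tap_y - target.y
    dist = (dx * dx + dy * dy) ** 0.5
    return dist <= target.radius + hit_slop(miss_distances)
```

In this toy model, a young child whose recent taps missed by 15 pixels on average would get roughly 23 pixels of extra hit area, so near-misses still register, while an adult with accurate taps would see no change in behavior.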
If you work in the area of kids and touch + gesture interaction, or mobile device interaction in general, find a MTAGIC project member at CHI and say hi!