The CHI 2013 conference was two weeks ago, and I presented “Analyzing User-Generated YouTube Videos to Understand Touchscreen Use by People with Motor Impairments,” a paper on work I did with University of Maryland colleague Leah Findlater. We looked at YouTube as a source of data: users with physical disabilities telling their own stories about how they use touchscreen devices like tablets and smartphones in their daily lives. We also received a ‘Best Paper Award’ for this work! If you’re interested, you can find my presentation slides here.
If you are in Paris at CHI 2013 this week, come see me present our ‘Best Paper Award’ winner, “Analyzing User-Generated YouTube Videos to Understand Touchscreen Use by People with Motor Impairments”! The talk will be on Tuesday, April 30th, from 11:20am to 11:40am in the Impairment and Rehabilitation session in room 242AB. Watch our video preview (narrated by my talented co-author Leah Findlater!) for a sneak peek:
An upcoming special issue of the Springer Journal of Personal and Ubiquitous Computing (JPUC) on Educational Interfaces, Software, and Technology (EIST) will include an article on the MTAGIC project! The article, entitled “Designing Smarter Touch-Based Interfaces for Educational Contexts,” is an extension of our CHI 2012 EIST workshop paper. We report our foundational studies investigating children’s touch and gesture input patterns and how they differ from those of adults, along with some discussion of how these findings will impact the design and development of educational apps for touchscreen devices. Here is the abstract:
In next-generation classrooms and educational environments, interactive technologies such as surface computing, natural gesture interfaces, and mobile devices will enable new means of motivating and engaging students in active learning. Our foundational studies provide a corpus of over 10,000 touch interactions and nearly 7,000 gestures, collected from nearly 70 adults and children ages 7 and up, that can help us understand the characteristics of children’s interactions in these modalities and how they differ from those of adults. Based on these data, we identify key design and implementation challenges in supporting children’s touch and gesture interactions, and we suggest ways to address them. For example, we find that children, especially those in the youngest age group (7 to 10 years old), have more trouble than adults in successfully acquiring onscreen targets and having their gestures recognized. The contributions of this work provide a foundation for touch-based interactive educational apps that increase student success.
I’ll add a post when this special issue is officially published. For now, if you’re interested, you can check out the camera-ready version.
My NSF-funded MTAGIC project will appear at two workshops at the upcoming CHI 2013 conference in Paris, France! The first is the RepliCHI workshop, which focuses on the role replication studies can play in the HCI literature. In our paper “Challenges of Replicating Empirical Studies with Children in HCI,” we present a series of empirical studies we have run with different age groups over the past 18 months, following essentially the same methodology each time. We specifically describe how that methodology had to be adapted to work with very young children. This will be a two-day workshop. For more information, check out the camera-ready version of our paper; here is the abstract for a quick overview:
In this paper, we discuss the challenges of conducting a direct replication of a series of mobile device usability studies that were originally conducted with adults and older children (ages 7 to 17). The original studies were designed to investigate differences in how adults and children use mobile devices to touch targets and create surface gestures. In this paper, we report on a replication we conducted with young children (ages 5 to 7). We discuss several methodological changes that were needed to elicit the same quality of data from the young children as had been obtained from the older children and adults. The insights we present are relevant to extending empirical HCI studies in general to younger children.
The second workshop is the Mobile Accessibility workshop, which focuses on improving the accessibility of mobile devices for users with different abilities and users in different contexts. In our paper “Towards Designing Adaptive Touch-Based Interfaces,” we present our vision of how the work we’ve been doing on MTAGIC will lead to universally accessible mobile touchscreen interaction, highlighting some of the technical extensions we believe our work points to. Again, for more information, check out our camera-ready paper; here is the abstract:
As the use of mobile devices by non-typical users increases, so does the need for platforms that can support the unique ways in which these users engage with them. We posit that, by developing an understanding of patterns in input behaviors for different user groups, we can design and develop interactions that support such non-typical users. We demonstrate this approach with children: we present findings from two empirical studies showing how interaction patterns differ among younger children, older children, and adults. These findings point to a model for developing touch-based interactive technologies that can adapt to users of different ages or abilities. Such adaptations will better support natural interactions by user populations with distinctive needs.
If you work in the area of kids and touch + gesture interaction, or mobile device interaction in general, find an MTAGIC project member at CHI and say hi!
February was a great month over here with lots of good news coming in about conference and journal paper acceptances! The MTAGIC project will be well-represented at the upcoming CHI 2013 conference, with two workshop papers accepted on different aspects of the project (the workshops are RepliCHI and Mobile Accessibility). We’ve also heard great news that two papers about our work with kids and mobile touchscreen devices will appear at IDC 2013 and in an upcoming special issue of the Journal of Personal and Ubiquitous Computing!
In other news, my project with Jacob O. Wobbrock and Radu-Daniel Vatavu on using patterns in how people make surface gestures to inform the design of better gesture sets and gesture recognizers (e.g., the $-family of recognizers) will appear at GI 2013. And, last but not least, my side project with Leah Findlater on understanding how people with physical impairments, including children, use mainstream mobile touchscreen devices in their daily lives will receive a ‘Best Paper Award’ at CHI 2013! This award goes to only the top 1% of submissions, and we are honored that our work was selected to be among such great company.
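For readers unfamiliar with the $-family mentioned above: these are deliberately simple, template-based gesture recognizers (the best known being the $1 unistroke recognizer). A rough, illustrative Python sketch of the core matching idea follows; it resamples a stroke to a fixed number of points, normalizes scale and position, and compares it point by point against stored templates. This is not the project's code, and it omits $1's rotation-invariance step; the names and templates here are made up for illustration.

```python
import math

def resample(points, n=32):
    # Resample a stroke to n evenly spaced points along its path length.
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    if path_len == 0:
        return [points[0]] * n
    interval = path_len / (n - 1)
    pts = list(points)
    new_pts = [pts[0]]
    D = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval:
            # Interpolate a new point at the exact interval boundary.
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_pts.append(q)
            pts.insert(i, q)  # q becomes the next segment's start point
            D = 0.0
        else:
            D += d
        i += 1
    while len(new_pts) < n:  # guard against float rounding shortfall
        new_pts.append(pts[-1])
    return new_pts[:n]

def normalize(points, size=250.0):
    # Translate the centroid to the origin and scale to a reference square.
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    w = max(p[0] for p in pts) - min(p[0] for p in pts)
    h = max(p[1] for p in pts) - min(p[1] for p in pts)
    s = size / max(w, h, 1e-9)
    return [(x * s, y * s) for x, y in pts]

def path_distance(a, b):
    # Average point-to-point distance between two equal-length point lists.
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    # templates: dict mapping gesture name -> raw point list.
    cand = normalize(resample(stroke))
    return min(templates,
               key=lambda name: path_distance(cand, normalize(resample(templates[name]))))
```

A candidate stroke is then classified by whichever template yields the smallest average point distance, e.g. `recognize([(0, 5), (50, 3), (100, 0)], templates)`. The simplicity of this pipeline is exactly why the $-family is popular for prototyping with user-defined gesture sets.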
Look for more details on each of these upcoming papers in blog posts throughout March and April, and you can already see them listed in my current CV if you are interested.